As a kid, my favorite book in the world was E.T. Bell’s Men of Mathematics (1937). I must have read it dozens of times by the age of 14. One afternoon, coming home from the library, I could not resist opening the book to a particularly interesting chapter -- and so ended up walking into a parked bus.
With hindsight, certain problems with the book are clear. Bell’s approach to the history of mathematics was exciting, but he achieved that effect, in part, through fictionalization. We now know that embroidering the truth came as second nature to Bell, who was a professor of mathematics at the California Institute of Technology until shortly before his death in 1960. In addition to writing science fiction under a pseudonym, Bell also exercised a certain amount of creativity in telling his own life story -- as his biographer, Constance Reid, found out through some detective work.
But another problem with Men of Mathematics only dawned on me recently. I hadn’t thought of the book in ages, but remembered it while reading Letters to a Young Mathematician by Ian Stewart, to be published next month by Basic Books.
The author is a professor of mathematics at the University of Warwick in the U.K. The imaginary recipient of his letters is named Meg -- a nice departure from the longstanding and rather self-reinforcing stereotype of math as a man’s field. The idea that no gender has a monopoly on mathematical talent seems never to have occurred to E.T. Bell. (Nor, consequently, did it cross the mind of a certain young nerd colliding with stalled vehicles during the mid-1970s.)
Fortunately that situation has started to change. And the progress is reflected, in a quiet and matter-of-fact way, in Stewart’s Letters.
A story unfolds, chapter by chapter, as Stewart corresponds with Meg. In the earliest letters, she is still in high school. By the end of the book, she has tenure. It is, in effect, a bildungsroman at one remove. The reader watches over Stewart’s shoulder as the young mathematician turns an early talent into a stable professional identity.
There’s even a moment when, in search of an interesting project to test her abilities, Meg starts trying to find a method for trisecting an angle using only a compass and an unmarked straightedge. This is one of the problems handed down from ancient geometry. People “discover” solutions to this challenge all the time, then become indignant that mathematicians don’t take them seriously. (The proof that it is impossible involves mathematical tools going far beyond anything available in antiquity.)
But most of the guidance Stewart offers is positive -- and some of it seems useful even for those of us without mathematical aspirations or gifts.
“My usual method for reading a mathematics text,” he recalls about his student days, “was to thumb through it until I spotted something interesting, then work backward until I had tracked down everything I needed to read the interesting bit. I don’t really recommend this to everyone, but it does show that there are alternatives to starting at page 1 and continuing in sequence until you reach page 250.”
The most surprising thing -- at least for anyone influenced by Bell’s romanticized account of the mathematical vocation -- is Stewart’s emphasis on the nuts and bolts of academic life. Letters is full of pointers on academic politics, the benefits and frustrations of collaboration, and how to avoid disaster at conferences. (“Never believe your hosts when they tell you that all the equipment will work perfectly,” he notes. “Always try it yourself before the lecture.”)
E. T. Bell told stories about mathematicians whose lives were shaped, in the final analysis, only by their own creative instincts. They might occasionally win a prize offered by a learned society, or feel driven to some breakthrough by the challenge of defeating a hated rival. But Bell’s men of mathematics were, on the whole, geniuses of the purest vintage. They had inspirations, not resumes. It is hard to imagine anyone trying to give Carl Friedrich Gauss useful career advice.
So does that mean that popularized accounts like Bell’s are something a young mathematician ought to avoid? I contacted Stewart by e-mail to ask his thoughts on the matter.
“I write a lot of books popularising math and science, so I may be biased,” he said in reply, “but when I was in high school I read all the books I could find about the history of math, about mathematicians, and about various topics in math. And those definitely had a significant effect on my interest in the subject. They made it clear that math has a long and fascinating history, that the great mathematicians were real people, not just obsessed geniuses who couldn't tie their own shoelaces, and that there is much, much more to math than the tiny part of the subject that we are all taught at school.”
Well, that’s a relief. There’s something to be said for idealization and hero worship, after all, in their proper season. You then have your whole life to become more realistic, not to say more calculating.
A carefully worded memorandum appearing last week on the Web site of the Church of Jesus Christ of Latter-day Saints informs the public that the expression “Mormon polygamist” is now offensive and inaccurate -- no matter what used to happen, back in the day. The occasion for this official statement is, of course, the new HBO series "Big Love," about a Utah businessman and his three wives living behind a facade of suburban normalcy in Salt Lake City.
Not content with making semantic demands, the memo also ventures into the field of applied cultural studies, calling HBO’s program “essentially lazy and indulgent entertainment that does nothing for our society and will never nourish great minds.” (Insert Osmonds joke here.) The show has also met with criticism from anti-polygamy activists in Utah, who worry that "Big Love" will treat an exploitative practice as just another alternative lifestyle.
Based on a viewing of the first episode, I think some of the complaints are premature. There is a formal statement, before the closing credits, that LDS policy now forbids polygamy. What’s more, the narrative will clearly be driven by tensions between “official” Mormonism and the splinter sect presented in the series. And there is already more than a hint of violent menace in the character of Roman, the splinter group’s “prophet,” played by Harry Dean Stanton, whose presence on screen always carries the Gothic aura picked up from appearing in the films of David Lynch.
The prophet’s newest bride is 14 years old. Patriarchal authority has rarely looked this sleazy.
But perhaps there is a complaint to lodge about "Big Love" after all. By default, it perpetuates the common notion that Mormon polygamy was a unique mutation in the history of Christianity. On the contrary, the practice goes back very nearly to the beginning of the church -- and it has popped up again, from time to time, sometimes finding the most surprising advocates.
The pioneers in this were the Adamites, a sect from the second century. Being redeemed from sin, they held, meant being restored to mankind’s original innocence; hence the Adamites worshiped in the nude and practiced a kind of group marriage. This went over with church authorities about as well as might be expected. But the heresy proved remarkably durable, reemerging in various forms throughout the Middle Ages and on into the Enlightenment.
Another way of looking at it would be to say that the Adamites were the original manifestation of the counterculture. Raoul Vaneigem -- whose theoretical writings helped galvanize student radicals around the world in 1968 -- hailed this dissident current as an inspiration. His 1994 book The Movement of the Free Spirit, published by Zone Books and distributed by MIT Press, is subtitled “General Considerations and Firsthand Testimony Concerning Some Brief Flowerings of Life in the Middle Ages, the Renaissance and, Incidentally, Our Own Time.”
But Christian departures from the monogamous norm were not limited to the occasional group of proto-hippies. Following Luther’s challenge to the authority of the established church, some early Protestant theologians argued for a return to the example of the ancient Jewish patriarchs as described in scripture. (Strictly speaking, they were advocates of polygyny, marriage to multiple wives. By definition, the term “polygamy” is more inclusive; it would cover the somewhat rarer practice of polyandry, in which a woman has several husbands.)
Luther himself concluded that it might be doctrinally permissible to be married to more than one woman at a time, at least in theory. Some of the more radical reformers in his wake also considered it a practical possibility. A fascinating account of this tendency in early Protestantism appears in After Polygamy Became a Sin: The Social History of Christian Polygamy by John Cairncross, published in 1974 and now out of print.
The author, a journalist and independent scholar, not only wrote history but made a little. Before turning his attention to Reformation theology, Cairncross was part of the Soviet spy ring recruited among students at Cambridge University during the 1930s.
But the biggest surprise of all, I suppose, is the endorsement of polygamy by another famous British subversive -- a poet by the name of John Milton, who, before writing Paradise Lost, was de facto Minister of Propaganda for the Puritan regime that executed Charles I in 1649.
Apart from his poetry, Milton was a pamphlet-writing machine. John Keats referred to his prose as “delectable,” which is, on the whole, a minority opinion, though one I would endorse. There is no greater defense of the freedom of the press than Areopagitica. And the autobiographical passages in some of his tracts were, in their time, an extremely bold departure from the norm. (His polemical opponents responded by accusing him of egomania and bad taste.)
Only in 1823 did an archivist discover the manuscript known as De Doctrina Christiana, a systematic theological work attributed to Milton. (Controversy over whether or not he wrote all of it has never died.) The text contained a number of surprises concerning Milton’s religious beliefs -- some of which it would take an hour to explain for anyone not up on the fine points of Christian theology.
But the argument of one section is plain enough: “Polygamy is allowed by the law of God,” wrote Milton at the end of his analysis of relevant passages from both the Jewish and Christian scriptures. It was practiced by Abraham, Moses, and King David, among others -- “men whose holiness renders them fit patterns for imitation, and who are among the lights of our faith.” And among Christians, it was forbidden only to “the ministers of the church alone, and that not on account of any sinfulness in the practice.” Rather, as Milton argued, having more than one wife would be a distraction from their duties. Other than that, polygamy would simply be a form of marriage -- and “marriage is honourable in all, and the bed undefiled.”
In 1825, the English translation of De Doctrina provided the occasion for a long article in The Edinburgh Review by Thomas Macaulay, who would go on to become the Victorian era’s most prominent essayist. (This piece, which was much discussed at the time, was actually his debut in a major literary venue.)
Macaulay referred to the section on polygamy only in passing -- but he threw out a line that must have been amusing for readers familiar with the poet’s domestic miseries. In 1642, Milton married a young woman named Mary Powell, who promptly left him. Although they were later reunited, it was a joyless pairing. Milton went on to write a series of pamphlets arguing that there ought to be grounds for divorce besides adultery. In short, total incompatibility should be enough.
As for Milton’s posthumously revealed views on polygamy -- well, Macaulay thought they were cut from the same cloth. “We can scarcely conceive,” wrote Macaulay, “...that any reader, acquainted with the history of his life, ought to be much startled....”
At one level, Macaulay’s comment reflects a “common sense” -- if not terribly sensible -- understanding of why an unhappily married man would prefer polygyny. After all, if things were going badly with one wife, you could turn to another.
On a more serious note, I see that even a scholar who questions whether or not Milton wrote all of De Doctrina believes that the section on polygamy might well reflect Milton's thinking in the matter. Throughout the 1990s, William B. Hunter made perhaps the most exhaustive argument for skepticism, culminating in his book Visitation Unimplor’d: Milton and the Authorship of De Doctrina Christiana, published by Duquesne University Press in 1998. In the course of an almost page-by-page analysis of the original manuscript, Hunter states that “the section on polygamy” and “the pages on divorce” were probably by the same author -- that is, “Milton, in my opinion.”
Well, that’s good enough for me. But assuming that Milton did think polygamy through in this fashion, how much of it was driven by idiosyncratic personal considerations? And how much by the avant garde theological debates of his day? I contacted Michael Bryson, an assistant professor of English at California State University's Northridge campus, to pick his brain.
Bryson is particularly interested in the overlap between Milton’s politics and his theology. (And if you have any doubt that marriage is all about politics...) His study The Tyranny of Heaven: Milton’s Rejection of God as King was published by the University of Delaware Press in 2004.
His argument is not quite identical to William Blake’s wily notion of why Satan gets all the best lines in Paradise Lost -- namely, that Milton “was a true Poet and of the Devil’s party without knowing it.” Instead, Bryson’s thesis is that Milton, while certainly a Christian, challenged the established conception of God. The old image treated the deity as a really, really powerful authority figure. (A human sovereign, on a superhuman scale: a tyrant without limits.) The theology emerging on the other side of this critique is, in effect, rather more libertarian than one might expect from a Puritan.
I asked Bryson how he understood the poet’s thinking on polygamy. “Milton,” he wrote back by email, “was an advocate for throwing off what he saw as man-made restrictions of, or infringements upon, the freedoms that God created mankind with originally.” That, he explained, was a current running throughout his pamphlets defending the English Revolution of the 1640s. It was also “the essential logic of his case for overthrowing canon law regarding marriage and divorce.” And likewise with the defense of polygamy.
“I think,” said Bryson, “that Milton, in general, saw humanity as having been more free in the past (in the days of the patriarchs, in the days before monarchy, in the days of the early Christian church) than they had come to be in his day.”
The effort to recover that lost liberty guided Milton in how he applied the Reformation principle of “sola scriptura,” of arguing strictly from the text of the Bible. In doing so, Milton gave things “a twist in whatever direction he saw as leading to a lessening of external restrictions on thought and action,” as Bryson puts it. He quotes Milton’s appeal to "the pre-eminent and supreme authority [...] of the Spirit, which is internal, and the individual possession of each man."
And by “each man,” of course, the author meant ... well, “each man.” The right to variety was, as one says nowadays, a gendered prerogative. (No wild Adamite polyandry on the horizon for John Milton.) The limits to his conception of freedom were the product of his era.
Still, Bryson thinks we should avoid reading too much autobiographical subtext into the poet’s defense of polygamy. “Milton’s ‘life,’” he points out, “was lived largely in the realm of thought. Milton’s physical/domestic life seems to have been much less radical than his theopolitical thought.... Knowing the history of Milton’s domestic life would not necessarily lead us to think: ‘Well, here’s a guy who will probably defend polygamy.’ ”
I suspect, in any case, that he would want to watch "Big Love," because it is interesting to study the practice as well as the theory. Besides, you occasionally feel a hankering for “essentially lazy and indulgent entertainment that does nothing for our society and will never nourish great minds,” as they say in Utah.
Over the past few days, as perhaps you have heard, it has become more or less impossible to get hold of a copy of "Ready to Die" (1994) -- the classic (and prophetically named) debut album by the Notorious B.I.G., a gangster rapper killed in a shooting in 1997.
Well, perhaps "impossible" is overstating things. But expensive, anyway. Secondhand copies of the CD, recently selling for $6 each on Amazon, now fetch $40; and the price is bound only to go up from there. "Ready to Die" was withdrawn last week after a jury found that one of the tracks incorporated an unlicensed sample from a song originally recorded in 1972 by the Ohio Players -- the band best remembered for "Love Rollercoaster," a hit of the mid-1970s. (Also, for an album cover featuring a naked woman covered in honey.)
Learning about the court case, I was, admittedly, shocked: Who knew the Ohio Players were still around? The Washington Post called them "funk dignitaries." Somehow that honorific phrase conjures an image of them playing gigs for the American Association of Retired Persons. They will be splitting a settlement of $4.2 million with their lawyers, which probably means a few more years on the road for the band.
Apart from that, the whole matter came very close to being what, in the journalistic world, is called a "dog bites man" story -- a piece of news that is not really news at all. Digital technology now makes it very easy for one musician to copy and modify some appealing element from another musician's recording. Now lawyers hover over new records, listening for any legally actionable borrowing. Such cases are usually settled out of court -- for undisclosed, but often enormous, sums. The most remarkable thing about the "Ready to Die" case is that it ever got to trial.
More interesting than the legal-sideshow aspect, I think, is the question of how artists deal with the situation. Imitation, allusion, parody, borrowing stray bits of melody or texture -- all of this is fundamental to creativity. The line between mimicry and transformation is not absolute. And the range of electronic tools now available to musicians makes it blurrier all the time.
Using a laptop computer, I could recreate the timbre of Jimi Hendrix's guitar from the opening bars of "Voodoo Chile (Slight Return)" in order to color my own, rather less inspired riffs. This might not be a good idea. But neither would it be plagiarism, exactly. It's just an expedited version of the normal process by which the wealth of musical vocabulary gets passed around.
That, at least, would be my best argument if the Hendrix estate were to send a cease-and-desist letter. As it probably would. An absorbing new book by Joanna Demers, Steal This Music: How Intellectual Property Law Affects Musical Creativity, published by the University of Georgia Press, is full of cases of overzealous efforts to protect musical property. Some would count as implausible satire if they hadn't actually happened: There was, for example, the legal action taken to keep children from singing "This Land is Your Land" at summer camp.
Demers, an assistant professor of music history and literature at the University of Southern California, shows how the framework of legal control over music as intellectual property has developed in the United States. It began with copyright for scores, expanded to cover mechanical reproduction (originally, via player-piano rolls), and now includes protection for a famous performer's distinctive qualities as an icon. Today, the art (or whatever it is) of the Elvis impersonator is a regulated activity -- subject to the demands of Elvis Presley Enterprises, Inc., which exercises control over "not only his physical appearance and stage mannerisms but also the very quality of his voice," as Demers notes. "Impersonators who want to exhibit their vocal resemblance to Elvis can legally do so only after paying EPE."
What the King would say about this is anybody's guess. But as Demers reminds us, it probably wouldn't make any difference in any case: It is normally the corporate "content provider," not the artist, who now has discretion in taking legal action. The process of borrowing and modifying (whether of folk music by classical composers or Bootsy Collins bass-lines by hip-hop producers) is intrinsic to making music. But it is now increasingly under the influence of people who never touch an instrument.
It is impressive that so trim a book can give the reader so broad a sense of how musical creativity is being affected by the present intellectual property regime. The author's note indicates that Demers, apart from her academic duties, serves as "a freelance forensic musicologist" -- one of those professional sub-niches that didn't exist until fairly recently. Intrigued by this, I asked her about it.
The term "is definitely over the top," she admits, "but I can't take credit for it. It just refers to music specialists who assess borrowings and appropriations, sometimes in the courtroom but most often before any lawsuits are filed." The American Musicological Society provides a referral list of forensic consultants, which is where potential clients find her.
She's been at it for three years -- a period coinciding with her first full-time academic post. "As far as I know," she says, "I don't get any credit at USC for this type of work. I'm judged pretty much solely on research and teaching plus a bit of committee work. I do have a few colleagues at USC who've also done this sort of work. It's a nice source of extra revenue from time to time, but as far as I know, there are only two or three folks around the world who could survive doing this alone full-time."
Demers is selective about taking on freelance cases. "Some are legit," she says, "while others are sketchy, so I try to be choosy about which cases I'll take on." At one point, she was contacted "by a person who was putting together a lawsuit against a well-known singer/songwriter for plagiarizing one of his songs. His approach was to begin by telling me how serious the theft was, but he wanted me to commit to working for him before showing me the two songs. Needless to say, we ended up not working together. Most cases, though, are preemptive in the sense that a producer or label wants to ensure that materials are 'infringement free' before releasing them."
There is an interesting tension -- if not, necessarily, a conflict -- between her scholarship and her forensic work. "The challenge for me in consulting," as Demers puts it, "is that I have to give advice based on what copyright law currently states. I don't agree with many aspects of that law, but my opinion can't get in the way of warning a client that s/he may be committing an actionable infringement."
In reading her book, I was struck by the sense that Demers was also describing something interesting and salutary. All the super-vigilant policing of musical property by corporations seems to have had an unintended consequence -- namely, the consolidation of a digital underground of musicians who ignore the restrictions and just do what they feel they must. The tools for sampling, altering, and otherwise playing with recorded sound get cheaper and easier to use all the time. Likewise, the means for circulating sound files proliferate faster than anyone can monitor.
As a geezer old enough to remember listening to Talking Heads on eight-track tape, I am by no means up to speed on how to plug into such networks. But the very idea of it is appealing. It seems as if the very danger of a cease-and-desist order might become part of the creative process. I asked Demers if she thought that sounded plausible.
"Yes, exactly," she answered. "I don't want to come out and condone breaking the law, because even in circumstances where one could argue that something truly creative is happening, the borrower risks some pretty serious consequences if caught. But yes, this has definitely cemented the distinctions between 'mainstream' and 'underground or independent' in a way that actually bodes better for the underground than the mainstream. Major labels just aren't going to be attractive destinations for new electronica and hip-hop talent if this continues. And if there is a relatively low risk of getting caught, there are always going to be young musicians willing to break the law."
The alternative to guerrilla recording and distribution is for musicians to control their own intellectual property -- for one thing, by holding onto their copyrights, though that is usually the first thing you lose by signing with a major label. "What I like to tell undergrads passing through USC," says Demers, "is that the era of mega-millions-earning stars is really coming to a close, and they can't expect to make large sums of money through music. What they should aim to do is not lose money, and there are several clever ways to avoid this, like choosing a label that allows the artist to retain control over the copyrights."
One problem is that artists often lack a sense of their options. "The situation is better than it used to be," Demers says, "but still, most artists are naive about how licensing works. They come with ideas to the studio and then realize that they must take out a loan in order to license their materials. Labels don't license samples; artists do. And if a lawsuit develops, most of the time, the label cuts the artist loose and says, 'It's your problem.' "
There is an alternative, at least for musicians whose work incorporates recontextualized sound fragments from other sources. "The simple way around this," she continues, is for an artist who uses sampling to connect with "the millions (there are that many) who are willing to let their work be sampled cheaply or for free."
But as Steal This Music suggests, the problem runs deeper than the restrictions on "sampladelia." Had the Copyright Term Extension Act (CTEA) of 1998 been enacted 50 years earlier, you have to doubt that anyone would have dared to invent rock and roll. The real burden for correcting the situation, as Demers told me, falls on the public.
"I am pretty confident that content providers will continue to lobby for extending the copyright term," she says. "The CTEA passed because of the pressure that Disney and Time Warner put on Congress, and was abetted by the fact that the public was largely silent. But we're at a different point than we were in the late 1990s, and organizations like Public Knowledge and Creative Commons and scholars like Lawrence Lessig have done a good job of spreading the word about what extending copyrights does to creativity. Next time Congress has a copyright extension bill in front of it, I hope that voters will get busy writing letters."
The decline of Western civilization proceeds apace. One shudders to imagine life in decades hence. A case in point: People now use cell phones in research libraries.
Wandering the stacks, they babble away in a blithe and full-throated manner -- conversing, not with their imaginary friends (as did the occasional library-haunting weirdo of yesteryear) but rather with someone who is evidently named “Dude,” and who might, for all one knows, be roaming elsewhere in the building: an audible menace to all serious thought and scholarly endeavor.
This situation is intolerable. It must not continue. I have given this matter long consideration, and can offer a simple and elegant solution: These people ought to be shot.
I am no extremist, please understand; no gun nut in a rural compound; no wild-eyed advocate of freelance vigilantism. Just a temperate and long-suffering citizen who has heard quite enough about the affairs of Dude for one lifetime.
Max Weber pointed out that one of the hallmarks of modernity is that the state retains a monopoly on the legitimate use of violence. I have no disagreement with that principle. It just seems like time for it to be applied in a new way.
The people who do the shooting ought to be suitably trained, tested, and certified. (Their accuracy as marksmen would be demonstrated beyond all doubt.) A poster at the entrance to the building would give fair warning that no cell-phone conversations are permitted beyond a certain clearly marked boundary line. The consequence of violating this rule could be illustrated with artwork, perhaps involving some easily recognized cartoon character.
Shooting with actual bullets might be excessive. If the budget permits, some kind of taser gun would be appropriate. Failing that, buckshot would probably do the trick.
Admittedly, a rational person could object to my plan. “Wouldn’t shooting cell-phone users in research libraries be counterproductive?” you might well ask. “Wouldn’t that actually make the library more noisy?”
A fair point. Yes, it would. But not for long....
I began pursuing this line of thought under two inspirations. One of them came from reading the conservative British essayist Theodore Dalrymple, who frequently contributes to The New Criterion. A selection of his work appeared last year in Our Culture, What’s Left of It: The Mandarins and the Masses, published by Ivan R. Dee. There is a grand tradition of reactionary cultural criticism. Regarding comprehensive misanthropy as a justified inference from the available evidence about mankind, it turns disgust into a systematic world view. Dalrymple often seems like the most skilled practitioner of this approach now writing in the English language. Many rant; few have his gift for it.
So, in part, I wanted to pay homage. At the same time, Dalrymple comes to mind for a reason. My policy suggestions are the result of long experience and growing frustration. In other words, I want to shoot those people. I really, really do.
Which is not, of course, a socially acceptable emotion. Acting on it is discouraged by law. One understands this; hence the imagined compromise, in which trained personnel would execute the punishment.
Being forced to listen to one side of a manifestly inane conversation is now a routine part of public life. It is tolerable on the street -- but not, somehow, in a library; and in one mostly full of academic tomes maybe least of all. What’s worse, the rot is spreading.
Professors routinely complain about the presence of cell phones in the classroom. But the culpability is not so one-sided as all that.
A friend reports attending a session of a major scholarly conference -- a panel on some grave topic in military history, I think. From the audience came the distinctive noise of a cell phone ringing.
No surprise there, of course. But then its owner pulled out the phone, answered it, and began a conversation.
Here, a line has been crossed. Some implicit rule of conduct (normally unstated, simply because nobody should have to spell it out) has been violated. A fissure in civility has appeared -- and the responsible party deserves to be swallowed up in the abyss so opened.
At the very least, that person has lost all reasonable claim to immunity from having a powerful blast of electricity delivered to his or her system by somebody carrying a stun gun and a permit.
Not likely, though. Without being too much a determinist about this, it does seem as if technology, in making certain kinds of behavior possible, also makes it inescapable. That, in turn, results in deep changes in attitude and personality.
A sense of entitlement trumps the capacity for embarrassment. By that point, there’s no going back.
Or is there? For many years now, I’ve been a fan of The Civilizing Process by the late Norbert Elias, a great study in historical sociology that was first published in 1939. In it, Elias worked out an account of how behavior changed in Europe between the middle ages and the early 20th century. He analyzed the evidence from diaries, letters, and etiquette books to see how the rules of everyday conduct developed over time. Things considered acceptable and normal in one century would be regarded with disgust and outrage in another.
Elias found that such changes were not a matter of fashion or whim. Nor were they trivial. The rules governing routine behavior were tied to two long-term processes underway. One was the growing complexity and interdependence of economic life. The other was the concentration of military power in the hands of the state. (We take it for granted now that the army or police are -- or at least should be -- accountable to the political authorities. But this is actually a fairly recent development in human history.)
As these tendencies were taking shape on the macro level, the little rules of daily life were changing accordingly. To keep things running more or less smoothly, each person was expected to internalize certain rules. Things that once happened without anyone noticing them came under increasing scrutiny.
“Do not spit into the basin when you wash your hands,” a medieval text admonished, “but beside it.” In 1714, a French handbook on etiquette suggested that you not spit unless absolutely necessary. In that case, be discreet enough to put your foot on it. (Also: “Do not spit so far that you have to look for the saliva to put your foot on it.”) By 1859, a British author noted that spitting was not just disgusting “but very bad for the health” -- so you should never do it, period.
A similar change could be traced in discussions of flatulence. In 1530, the very learned Erasmus of Rotterdam noted: “If it can be purged without noise that is best. But it is better that it be emitted without much noise than that it be held back.” If necessary, he said, you should cough simultaneously to avoid embarrassment. (My wife, who gave me The Civilizing Process as a birthday present some years back, would probably rather I not cite Erasmus so much.) By 1729, a French rulebook warned that the release of gas “is very impolite ... either from above or from below, even if it is done without noise.”
Over the course of two or three hundred years, then, the expectation grew that each individual would practice more and more self-regulation. Social life, as Elias puts it, came to resemble a modern highway: “Every individual is himself regulating his behavior with the utmost exactitude in accordance with the necessities of this network. The chief danger that people here represent for others results from someone in this bustle losing his self-control.”
It is the analysis of table manners that most closely anticipates the present cell-phone problem. Originally, the use of knives and forks was restricted to very elite members of the aristocracy. At first, even some of them found it pretentious and affected. (Here, one thinks of the portable phones of the 1980s, which were nearly as big as your head, and seemed mainly to be used by hotshot lawyers and stockbrokers trying to broadcast how very important they were.)
As the use of eating utensils spread, various rules emerged. “Do not clean your teeth with your knife,” the advice books often warned. That is a pretty good indication that lots of people were cleaning their teeth with their knives, since you don’t have to forbid something nobody actually does.
But Elias also notes something even more interesting. The knife, while a useful tool at the dinner table, was also potentially a dangerous instrument of aggression. The very sight of it may have provoked a fear that it would inspire hostility -- or that, if you mishandled it, you might carelessly hurt somebody else.
So the pressure grew to discourage people from using knives at the dinner table for any but a very few functions. If a piece of food can be cut with the edge of a fork (the rule goes), you should do so. By no means stab a hunk of steak with your knife and eat it. Etc.
“There is a tendency that slowly permeates civilized society, from top to bottom,” writes Elias, “to restrict the use of the knife ... and wherever possible not to use it at all.”
The cell phone, then, is a little like a fart, and a lot like a knife. In the most optimistic scenario, people will learn to control their behavior over time. Civility will be restored. It should take about two centuries. I figure three, tops.
If you order a DVD called “Bettie Page: Bondage Queen,” Amazon will make some reasonable, though nonetheless startling, guesses about other items you might enjoy. (So one quickly discovers.) But the online retailer’s algorithms aren’t quite finely tuned enough to account for the fascination that Bettie Page exercises. Whether posing in calendar-girl mode, or wielding a whip in the somewhat paradoxical role of a cheerful dominatrix, she returns the viewer’s gaze in a way that challenges one’s stereotypes about the sexually repressive 1950s. She also represents, in my opinion, a definitive refutation of the American media's inexplicable erotic valorization of the blonde.
Her story is coming to the screen this week in “The Notorious Bettie Page,” written and directed by Mary Harron, whose film about Valerie Solanas, “I Shot Andy Warhol,” was an exceptionally smart and insightful biopic. But Harron isn’t the only contemporary feminist interested in Page -- or in the combustible mixture of sexist ideology and female agency captured in vintage erotica.
In Pin-Up Grrrls: Feminism, Sexuality, Popular Culture, forthcoming this summer from Duke University Press, Maria Elena Buszek, an assistant professor of art history at the Kansas City Art Institute, describes the mutations of the pin-up genre over the decades. It is a cultural history that crisscrosses with the succeeding “waves” of feminist activism.
The pictures of actresses that became popular in the 19th century marked the emergence of a new kind of “public woman.” (In the earlier sense of that term, it suggested, not the female equivalent of a public man, but prostitution: the sexual equivalent of a public convenience.) With the consolidation of the film industry’s role as arbiter of glamour and lifestyle possibility, the variety and quantity of pin-up imagery grew. One familiar response to all of this -- the attitude routinely stereotyped as “feminist” -- was to denounce the entire phenomenon as “male objectification.” But women formed part of the audience for pin-ups. The range of posture and demeanor captured in the images reflects the increasing options for self-assertion, libidinal and otherwise, explored by women.
A thumbnail sketch of its analysis can’t do justice to the book. It includes dozens of images from the history of the pin-up -- from the naïvely stagy publicity photos of the 1860s to the ironically stagy meta-pin-ups created by contemporary pomo artists. An excerpt from the book is available at Buszek’s Web site. I recently interviewed her about her work. The notorious Bettie Page has only a small part in the history that Buszek has reconstructed. I asked about her anyway. (It meant that watching those short films on DVD counted as research.)
Q: How did you settle on this as a topic for research? The images themselves are fascinating, of course. But there's a difference between that level of interest and the kind involved in investing so much time and energy in a subject.
A: Well, to be honest, this was a project that really originated in artists' studios. My Ph.D. is in contemporary art, and Pin-up Grrrls began as my dissertation, where I was trying to figure out the phenomenon of younger feminist artists gravitating toward pin-up imagery. Since I began my B.A. in 1989, I had noticed more and more pin-ups appropriated by young women -- not just in their gallery art, but in more street-level ways, in t-shirts and Riot Grrrl 'zines -- as icons of feminism. Not "femininity," but feminism.
My first instinct was to assume that this was a way for young women to take an image type that older feminists had held up as a symbol of women's sexual servitude, something antifeminist and -- in a typically postmodern gesture -- reappropriate it as a symbol of strength and sexual power for a new generation. And, considering how polarizing the "sex wars" of the 1980s were -- where the position of anti-pornography activists like Andrea Dworkin and Catharine MacKinnon was tremendously influential, and put forward by the mass media as "the" voice of feminism -- it seemed to make sense that this might be the first front on which emerging feminists might try to identify themselves as something different.
However, in my efforts to figure out what era or "type" of feminism this strategy was working to challenge, I quickly discovered that the pin-up had been used by women since its very origins in the 19th century to mark a range of activist positions in the women's movement -- all of them asserting that women's sexual expression deserves a significant role in the dialogues around women's sexual oppression. In retrospect this should have been a no-brainer, considering that (if I can paraphrase Carole Vance) feminism since Mary Wollstonecraft hasn't just been about decreasing women's pain and misery, but also about increasing their joy and pleasure.
But (like most young feminists of my generation), I had kind of unconsciously swallowed the mass media myth that feminists of the women's liberation movement, or "second wave" of feminist history, were these angry, dogmatic asexuals -- and, naturally, with each decade I went back in my research, reading the actual texts of feminism's evolution, I found that the opposite was true.
I also discovered that, throughout feminism's long history, the voices wanting to stress sexual self-expression as a feminist issue were usually those of younger women in the movement, and that their perspectives were generally dismissed both by older feminists -- who themselves were often "over" the whole sex issue, and had moved on to less dicey issues -- and by the period's feminist organizations, which were almost always run by these same older, experienced feminists. So the younger women turned to popular youth culture for places where they could "see" their ideals represented: just as young women today hold up pop icons like Gwen Stefani or Coop's "devil girl" illustrations as symbols of "their" feminism, young feminists at the turn of the century used Sarah Bernhardt and Charles Dana Gibson's "Gibson Girl" illustrations as icons of their own.
And all this brings me back to your question: moving from recognition of a scholarly subject to following through on researching it. Naturally, my first motivating factor was the old-fashioned awareness that no one had documented this long, "secret" history of the pin-up before, so I had the rare ability to scoop my colleagues on this terrific story. But ultimately what made me stick with it -- from a 300-odd-page dissertation to a 600-odd-page manuscript -- was the responsibility I felt to illuminate, and ideally end, the vicious cycle of generational misunderstandings that has plagued the women's movement since its start. Each generation has consistently held itself up as the "next wave" of feminism -- and then, the minute it organizes and gains a certain amount of power, it proceeds both to address selectively what came before it and to suppress efforts at change by the generation that follows. I felt that if I took the longest view possible of a subject and image that has continuously divided feminists, perhaps I could help suggest some common ground -- not just in the fact that we "commonly" fight one another, but also in the fact that, try though we might, sex just will not go away, especially if the movement keeps needing young women -- who have always been not only sexually preyed upon, but sexually curious and active -- to keep it going!
Q: You refer to the stereotype of an angry, dogmatic, and anti-sexual feminism -- and you dismiss this idea, or treat it as a cliché. Well, yes and no .... About 20 years ago, I was part of a left-wing and very pro-feminist newspaper staff that was called upon in a "community meeting" to do self-criticism for some incredibly subtle crypto-patriarchal gesture or other. The whole experience was very strange. (It might have been traumatic, had we not all been so heavily sedated.) Anyway, you do realize that your book would have been bitterly denounced at one point in the not-so-distant past, right? There was often a sectarian rancor (a purist if not puritan quality) to some activist feminism, very different from the pluralism of some academic varieties.
A: Oh, yes! Of COURSE! In my book, I certainly don't sidestep the fact that there is still a significant percentage of feminist thinkers who question whether any woman's sexuality under patriarchy can ever be truly under her own control. Indeed, I have no doubt that my book will be denounced in certain circles for that very reason. I also agree that younger feminists in our third wave of feminism -- which, by the way, I argue is a way to periodize our era rather than a generational label -- tend to be more sex-positive and look to feminist history for reflections of their own sensibilities. My book is an example of this!
However, young feminists in the second wave (and all generations that preceded them) were the exact same way. The fact is that if you go back to the very foundational texts that today's most sex-suspicious feminists are drawing upon -- Kate Millett's Sexual Politics, Andrea Dworkin's Woman Hating, Shulamith Firestone's The Dialectic of Sex -- these works are openly calling for, and believing in, the possibility of a feminist sexual revolution to go with the political changes they are demanding. (To say nothing of Germaine Greer and Erica Jong.)
They specifically began writing their own "herstories" not just to document the movement in their own words, but to cherry-pick their predecessors -- a fact that Astrid Henry's book Not My Mother's Sister does a very good job of addressing. And by the time we got to the late 1970s and early 1980s, this fact was conveniently forgotten as many of these same authors began calling for more radical and less sex-positive definitions of feminism -- themselves sweeping their own earlier texts under the rug in the process.
This is exactly what I'm talking about when I talk about the "vicious cycle" of selective memory as each generation of the women's movement evolves. One can argue that the exact same thing was frequently true of feminist leaders from Elizabeth Cady Stanton to Betty Friedan.
And, as far as the grass roots of feminist activism go, I have to disagree with you. I think that the activists "in the trenches" have always been the ones used to thinking realistically, negotiating for change. As Dorothy Allison has said, you can't be out there in the real feminist world without meeting the occasional African-American-ex-military-Republican-to-hell-with-NOW-lesbian feminist alongside the garden-variety Lefty-WASP-college-professor types -- largely because these are oftentimes the more vocal folks at the meeting. Pin-up Grrrls came out of this culture when I realized that there were distinct differences between what the women on the streets (and in the bars and the clubs) living feminism had to say about how feminism was defined and what those writing about and teaching it had to say. I wanted to shore up all the ideas that these two groups might have in common, often without knowing it, and try to bridge this gap -- in large part because I myself was one of those caught in the middle.
I'm a working-class, Hispanic-American, Roman Catholic punk -- I wasn't supposed to go to college, much less become a feminist scholar. My scholarship has basically documented my journey to finding out why I did. I mean, I've been calling myself "feminist" -- much to the consternation of my parents -- since I was about nine years old, and I understood this term relating to "Charlie's Angels" on TV and the anti-nukes nuns at my school long before I knew who either Andrea Dworkin or Dorothy Allison were.
Q: Bettie Page is now something like the embodiment of the pin-up girl, the figure who normally comes to mind when the pin-up is mentioned. (The punning overtones of "embodiment" and "figure" probably can't be helped.) But unlike most of the earlier figures, she wasn't an actress or public figure of note before her image became known. The short films she made came later. How does she fit into the history of the form? How did she manage to become both anomalous and an archetype?
A: Yes, Bettie was one of the first pin-up icons famous for her pin-up work alone, rather than using pin-ups as a kind of necessary promotional tool to draw attention to something else that she did. And I think part of her success was that she looked at these pin-ups as her acting career -- she had a disastrous screen test, and a working-class Southern accent, and couldn't break into movies to save her life -- and clearly poured all of her love of the theatrical into these images and, later, Irving and Paula Klaw's film reels.
And I think that both this love and this sense of make-believe are why she would go on to become so iconic and so groundbreaking a pin-up. The fact that she "performed" her typical cheesecake images with such hammy gusto wasn't new -- you see this approach in Hollywood pin-ups from the 1910s on. What made her unique was that she performed in this same over-the-top, comedic style in her bondage images -- whether she was playing a dominant or a submissive -- in a way that underscored the playful and performative potential of this seemingly perverse or shameful sexuality. I'm making the argument that this wasn't just a radically new way of representing this particular sexual subculture, but, more broadly, a radically new way of representing sexual womanhood -- particularly since Page was so popular in the 1950s, and beloved not for just one sexual stereotype but for the range of fairly extreme sexualities that she put out there. In an age when one could pretty much be either the overtly sexual naif, like Marilyn Monroe, or the eternal virgin, like Doris Day, this was pretty unusual.
Q: Alongside the standard cheesecake shots of Page -- and far more memorable, in a lot of ways -- are the bondage and fetish images. Does it make sense to include these in the category of "pin-ups"? Or are we talking about something else? They certainly are striking. You've got all these signifiers of decadence and solemn perversity -- and in the middle of it, there's Bettie Page with this easygoing, happy look on her face.
A: I definitely include the fetish images as pin-ups ... if you look at them, even from the perspective of the 1950s, they are! The brother-and-sister photographers Irving and Paula Klaw took pride in the fact that they ran a "clean studio," and definitely approached even the B/D/S/M photos--which, by the way, were usually made-to-order for customers -- as just another theatrical corner of their cheesecake business. A different kind of "glamour" photography, really.
The women had to wear two pairs of underwear, just to make sure they were properly covered up, and there wasn't even the suggestion of a sexual act in any of them. So if you go back and look at these images, the poses and situations might seem extreme, but these women aren't anywhere near naked or having sex. The Klaws were very careful that their images fit the social standards for what distinguished a "pin-up" from "pornography," and the amount of skin the subjects showed was kept to a minimum. What got them into trouble was that the same society that was keeping watch over how much skin was exposed was naturally appalled that the scenarios led back to a sexual subculture in an era where subcultures were held in great suspicion.
And, yes, in retrospect what seems crazy is how "threatening" these images were to the American government -- which subpoenaed both the Klaws and Bettie in the 1955 federal hearings on juvenile delinquency, even though the FBI had ruled their bondage images weren't obscene by any legal definition -- a ruling owed largely to how chaste the images truly are, but also to how silly Page's performances are. But part of why they are still so sexy -- and perhaps what was so threatening -- is how unfazed she is by the supposedly transgressive behavior in which she's participating; she's clearly enjoying herself, and not taking it too seriously. And I think that this pairing of pleasure and play is part of why her images aren't just so popular today, but also such a favorite of young feminists.
She seems to be breaking out of her period's expectations for women like her -- even if she didn't in her personal life, the images she created suggest otherwise in that she's flouting convention, even if just for that fleeting moment when the cameras snapped and the fantasyland of the studio seemed real. Indeed, this idea of the studio as a site where fantasies could be realized -- and where the pin-up could be performed as a theatrical construct of a woman's sexuality -- would be recognized and exploited by the second wave of feminism that followed in the 1960s.
Q: You've probably looked at more pin-up images than anyone. At some point in the research, I assume you learned to look at them in an abstracting and historicizing way. Leaving that aside for a moment: Of the various images you've inspected, which ones really fascinate or appeal to you? And why?
A: Well I naturally have favorites in every period, and from day to day those favorites rearrange themselves. However, I suppose the images to which I keep returning are those created by women with a real sense of ambivalence -- clearly feminist images that are sometimes resigned to the inevitability of complexity and contradiction when it comes to women's sexuality.
My discussion of Frances Benjamin Johnston in Pin-up Grrrls immediately comes to mind; she's known today primarily as either an early photojournalist or society portraitist in the late 19th and early 20th centuries, but she created some really compelling pin-up-style images of the "New Women" of her day -- like Ida Tarbell and Alice Roosevelt -- where we find all kinds of contradictory messages about what a woman is, and what a feminist could be, battling one another.
She also created some practically unknown self-portraiture; not just traditional pin-ups, but also drag self-portraits in a pin-up style, where we find this bisexual, independent, unconventional woman -- who refused to identify with either the period's burgeoning lesbian or suffrage communities -- working all the contradictions of her own identity into this range of images.
Jumping forward to the present, I think that Cindy Sherman is extremely good at this tension, which I discuss at length in the book as well. One of my favorite contemporary photographers, Collier Schorr -- who, by the way, creates some of the best male pin-ups around -- recently wrote about how interesting it is that, while Sherman's feminism was challenged, sometimes angrily, in the 1980s -- when Barbara Kruger's work was held up as "correct" -- today Sherman's often mournfully ambivalent self-portraits seem so political, while Kruger's imagery is licensed out to the very commercial culture it was supposed to disdain.
Overall, I am fascinated by pin-ups that acknowledge how hard it is to fight to make "the personal the political" when the personal is so fraught with contradictions -- yet demand that feminism take our personal contradictions into account.
Thursday was a long day -- one spent with my brain marinating in historiography. I passed the morning with a stack of JSTOR printouts about Richard Hofstadter, whose The American Political Tradition (1948) still sells about 10,000 copies a year. Hofstadter died in 1970. Enough time has passed for his reputation to have been overthrown, restored, and overthrown again. (As someone who grew up listening to theories about the JFK assassination on talk radio in Texas, I can take anti-Hofstadter revisionism seriously only up to a certain point. The man who wrote a book diagnosing The Paranoid Style in American Politics seems like a strong candidate for immortality.)
Only now is there a full-length treatment of his life, Richard Hofstadter: An Intellectual Biography by David S. Brown, just published by the University of Chicago Press. Asked by a magazine to review it, I have been going over the footnotes and making a long march through the secondary literature. Which is easier than writing, of course, and a lot more fun -- the kind of serious-minded procrastination that requires hours. It sure ate up the morning.
Then, after lunch, I headed over to the Washington Hilton to pick up press credentials for the annual convention of the Organization of American Historians. Tens of thousands of bloodthirsty jihadist-commie professors are infesting the nation’s campuses, as you have probably been reading of late -- with the historians being a particularly vile lot, turning almost the entire discipline into one big Orwellian indoctrination camp. “Now this,” I thought, “I gotta see.”
Going over the program, I was particularly interested to notice a session called “The Creation of the Christian Right.” If the rumors were even half true, it would be one long rant against the Bush administration. Each paper would (to renew the Orwell bit) provide the standard Fifteen Minutes Hate, right?
Maybe that should be Twenty Minutes. Who ever keeps within time limits?
Actually, no. Everybody was calm and nobody ran over. The first paper looked at how Protestant and Roman Catholic social conservatives overcame their mutual distrust during the 1950s. Another analyzed the relationship between Billy Graham and Richard Nixon. The third and final presentation argued that the anti-abortion movement played a very minor role in defining the conservative agenda until it got a plank in the GOP’s platform in 1976. (That same year, when Betty Ford told a New York Times reporter that she considered the Roe v. Wade decision a fine thing, her comment appeared in the 20th paragraph of a story on page 16. A First Lady from the GOP making that statement anytime since then would have gotten a little more attention.)
Each presentation was the work of someone who had done substantial work among primary sources, including archival material. The researchers were alert to how the different factions and constituencies of the conservative movement interacted with one another.
But fervor, condemnation, editorializing by proxy? Not a bit of it.
For that matter, you couldn’t even hear the sort of ironic disdain that Hofstadter, writing decades ago, brought to analyzing McCarthyism or the Goldwater campaign. That tone had reflected the Mencken-esque judgment that American conservatism was just another manifestation of boobery and yahooism.
It was puzzling. If ever a session seemed likely to provide a concentrated dose of jihadist-commie propaganda, it would be one called “The Creation of the Christian Right.” Chances are, the young scholars giving papers did have political opinions. But they did not use the podium as a soapbox.
I guess they had been brainwashed by the OAH into practicing the most disinterested, rigorous sort of professional historical inquiry. Apart from being dangerous, those professors sure are sneaky. You’d almost think they were trying to make somebody look like a boob and a yahoo.
Later, another panel discussed the history of the idea of "the liberal establishment." Once again, I went expecting a strident call to destroy the Great Satan of the American Empire. And once again, it was all careful research and calm reason -- despite the fact that the scholar invited to respond to the papers was Michael Kazin, who had even made Horowitz’s list.
Between sessions, there was time to visit the exhibit hall. It was a chance to gaze upon recent offerings from the university presses. All the while, a small but very persistent voice whispered in my ear. “You don’t need more books,” the voice said. “Where would you put them?”
It sounded a lot like my wife.
Other conference-goers were wandering the aisles, men and women of all ages; and some bore expressions suggesting that they, too, were receiving similar wireless transmissions from significant others back home. Yet those people picked up the new books even so. I took courage from their example.
That evening, at a Chinese restaurant a few blocks downhill, I joined a group of convention-goers, most of them associated with Cliopatria, the group blog published by the History News Network. The gathering was all "off the record" -- an occasion for conviviality, rather than for news-gathering. But the relaxed flow of the proceedings took an odd turn around the time the main course arrived.
That was when someone indicated that it might be time for historians to work on a topic that I know rather well -- that, indeed, I had witnessed and to some degree participated in. And that was the late and much-lamented magazine Lingua Franca, the subtitle of which called it “The Review of Academic Life.”
That day, on the Web site of The New York Observer, there had appeared an essay on LF by Ron Rosenbaum -- the author of, among other things, a brilliant and unnerving book called Explaining Hitler.
In his piece, Rosenbaum lauded the magazine as a place that did not merely report on university life, but encouraged "thinking about the nature of human nature and human society, the nature of the cosmos, the nature of the mind itself (thinking about factors that underlie all politics)." Similar tributes were being offered around the table as the dishes were delivered. Somebody compared LF to Partisan Review. One historian suggested that it was time for a monograph.
Meanwhile I chewed my tongue quietly. Between 1995 and 2001, I had been a regular contributor to the magazine. Not that many publications with large audiences would let you write about the literary criticism of Northrop Frye, the philosophical architectonics of Richard McKeon, or the strange little pamphlet that Immanuel Kant wrote about the mystical visions of Emmanuel Swedenborg. Even fewer would then pay you. Now it molders in “the elephants’ graveyard of dead magazines.”
Elephants are supposed to have powerful memories, of course. Now it seems to be time for the historians of journalism to do the remembering. But when I look back at that period, it’s not to recall the glory days. There are too many recollections of botched opportunities and missed deadlines, and the occasional wince-inducing editorial decision. A few droplets of bad blood are sprayed across the sepia-toned mental snapshots. If I tried to write about LF, the result would probably be a satirical novel instead of a eulogy.
It might sound vaguely flattering to imagine that part of one’s own experience will probably, sooner or later, be studied by intelligent people. But in fact it is a little disconcerting.
Scholars will notice aspects of the past that you did not. There will be things charged with indelible personal significance for you that nobody else will recognize. It is hard not to cling to those nuances. To assume that you have a privileged relationship to the past, simply by virtue of having been there. But that’s not how history works.
No, the right attitude is probably the one cultivated by Richard Hofstadter. He was a master at grasping the paradoxes defining his discipline. Few writers have better captured the gap between what people in the past *thought* they were doing, on the one hand, and what their actions actually meant, on the other.
Hofstadter once cited a passage from Nietzsche that summed up his own outlook. “Objection, evasion, joyous distrust, and love of irony are signs of health,” the quotation ran. “Everything absolute belongs to pathology.” It’s worth keeping in mind when thinking about the private history called memory -- not to mention the yet-unwritten history whizzing past, every hour of every day.
A young Web designer named Aaron Swartz has now created a mirror of the long-defunct Lingua Franca Web site.
For a considerably less impressionistic account of the convention, check out Rick Shenkman’s fine roundup of OAH.
I am a digitally enabled, network-ready scholar. I check e-mail and browse the Web. I read RSS feeds. I leverage Web 2.0's ambient findability to implement AJAX-based tagsonomy-focused long-tail wiki content alerting via preprint open-access e-archives with social networking services. I am so enthusiastic about digital scholarship that about a year ago I published a piece in my scholarly association's newsletter advocating that we incorporate it into our publications program. The piece was pretty widely read. At annual meetings colleagues would tell me that they really liked it and were interested in digital scholarship, but that they still (and presumably unlike me) enjoyed reading actual physical books. This always surprised me, because I love books too, and it never occurred to me that an interest in digital scholarship meant turning your back on paper. So just to set the record straight, I would like to state in this (admittedly Web-only) public forum that I have a deep and abiding passion for paper: I love it. Love it.
It's true that there is a lot of stuff you can do with PDFs and the Web that you can’t do with paper, but too often people take this to mean that digital resources "have features" or "are usable" while paper is just, you know, paper. But this is not correct -- paper (like any information technology) has its own unique form of usability, just as digital resources have theirs. Our current students are unused to paper and attribute the frustration they feel when they use it to a mere lack of usability, when in fact they simply haven't figured out how it works. Older scholars, meanwhile, tend to forget about paper’s unique utility because using it has simply become second nature to them.
Some of the features of paper are well known: Reading more than three pages of text on a screen makes your eyes bleed, but I can read paper for hours. You can underline, highlight, and annotate paper in a way that is still impossible with Web pages. And, of course, in the anarchy after The Big Electromagnetic Pulse the PDFs will be wiped clean off my hard drive but I will still be able to barter my hard copy of Durkheim's Elementary Forms of the Religious Life for food and bullets.
But my passion for paper is about more than preserving the sociological canon in a post-apocalyptic future. Using paper is embodied in a way that using digital resources is not. Paper has a corporeality that digital texts do not. For instance, have you ever tried to find a quote in a book and been unable to remember whether it was on the left- or right-hand side of the page? This is just a trivial example of the way in which paper’s physicality is the origin of its utility.
And of course professors have bodies too. This is another way that scholarship is embodied -- we often do it while in libraries. Here our bodies are literally in a vast assemblage of paper with its own unique form of usability. And as scholars achieve total communion with the stacks, they find books based not just on catalog number, but on all of their senses. The fourth floor of the library where I wrote my Ph.D. sounded and smelled different from the second. How many of us with Ph.D.'s -- even the lab scientists -- will ever be able to forget the physical layout of the libraries where we wrote our dissertations? Or our undergraduate libraries? I find books in my current library by comparing its floorplan with the layout of the college library where I first studied.
And catalog systems! I am a DU740.42 man myself, although I freelance in B2430 at times and of course retain a broader competence in G and GN. I was visiting a colleague at Duke once and went into its library to see what sort of GN treasures it might have stored away, only to find that the library used Dewey Decimal -- a fact I experienced with a surprisingly raw sense of betrayal.
The very fact that libraries can’t buy every book is a form of utility, not a disadvantage. True, there is tons of hubbub about Web sites that provide users "personalized recommendations" based on their preferences and the preferences of people in their social networks. But in practice all this has boiled down to the fact that after years of using Amazon.com, it has finally figured out that since I enjoyed reading Plato's Republic, I might also be interested in Homer's Iliad. But every book in my library has been "filtered" by my librarian, and browsing through stacks arranged by subject allows "discovery" of "resources" in a non-metaphorical, pre-Internet way.
At Reed, where I went to college, the library had a disused, musty room dubbed the "multiple copy room." Not surprisingly, it was where all the multiple copies of books were stored. The librarians at a small liberal arts college like mine did not buy 10 copies of a book unless they were sure that it was a keeper, worthy of being taught for eons, its wisdom instilled into countless generations of students who would value it so much that they would weep when bartering their own copies of it for food and bullets after The Big Electromagnetic Pulse. Browsing through and reading from those shelves was the best "filter" for "content" that I ever had. So much for "the long tail."
And of course browsing doesn't just happen in libraries. Amazon may have a bintillion books for sale out in the ether of the ethernet, but there is no better place to take the pulse of academic publishing than a good used book store near a university. Bookstores mark the life cycle and disposition of the community where they are physically located -- the end-of-the-year glut of books dumped by students eager to rid themselves of dead weight like Anna Karenina in order to spend more time tinkering with their MySpace page is itself a good indicator of what a university has been assigning.
Bookstores also connect us to the larger scholarly community. Remainders -- books that are being sold at discount prices because publishers want them out of their warehouses -- are a remarkable measure of what fads have just passed in scholarly publishing or what is about to come out in paperback. And of course just being in a good bookshop can be therapeutic. A good friend of mine worked his way through college at a Walden Books. After work he would spend a half hour in the aisles of our local used book store, staring at the covers of Calvino novels until he had recovered from eight hours of selling people copies of The Celestine Prophecy.
The used book store is the horizon at which our human finitude and our books intersect. I have actually been turned on to the work of scholars based solely on the fact that I've purchased so many books from their collections. One book store I frequent actually put a picture of one recently deceased professor in the window to advertise that his library was on sale. Some find the practice morbid, but for me this sort of thing is the academic equivalent of the life-affirming musical number in The Lion King about how we are all part of the circle of life. Roscher and Knies costs $180 off the Internet and is scarcer than hen's teeth, but in that magical, electric moment that I found it used for 20 bucks I knew that in cherishing and loving it I would not only be honoring the memory of the previous owner, but perpetuating the hopelessly over-specialized intellectual lineage which we both cared about so deeply.
What I am trying to say is that owning and reading books is about our lives as scholars in a way that e-journals are not. Our libraries are furniture. They are decoration. They threaten the breathable-air-to-paper ratio in our apartments and offices. Books spill over my shelves. They crowd my kitchen table. We are what we read. On my bedside I currently have one Hawaiian language textbook, Dan Simmons's science fiction novel Hyperion, Jonathan Lamb's Preserving the Self in the South Seas: 1680-1840, Eugene Genovese's Roll, Jordan, Roll and Jean-Luc Nancy's The Inoperative Community. In this combination I find elemental solace.
Our collections of physical, paper texts not only help explain who we are to ourselves; they signal this to our visitors. When my guests first enter my apartment and make a beeline to my shelves, they are actually learning more about me. When they admire my copy of Roscher and Knies, I am learning something about them. When they spot my first edition of Ricky Jay's Cards as Weapons or Scatological Rites of All Nations, I know that I have found a true soul mate. I am convinced that this is somehow more important than finding out that the professor in the office next to me reads the same cat blogs that I do.
It is easy to see that paper will continue to be used by academics for a long time to come purely on the basis of its utility as an information technology. But we are not passionate about paper because it is a good research tool. We are passionate about it because of the way that it smells and feels. Our love of paper springs from the way it insinuates itself into not only our careers, but our souls. This is why, after The Big Electromagnetic Pulse, I won't be working desperately on some computer somewhere trying to resurrect my metadata. I’ll be fortifying the multiple copy room and trying to figure out how few copies of The Andaman Islanders I’ll have to part with to keep alive until someone manages to turn the power back on.
Alex Golub finished his dissertation in anthropology at the University of Chicago in 2005 and is now an adjunct professor at the University of Hawaii at Manoa. He blogs at Savage Minds, a group blog about cultural anthropology.
The hurried patron spying Why Truth Matters (Continuum) on the new arrivals shelf of a library may assume that it is yet another denunciation of the Republicans. New books defending the “reality-based community” are already thick on the ground -- and the publishers' fall catalogs swarm with fresh contributions to the cause. Last month, at BookExpo America (the annual trade show for the publishing industry), I saw an especially economical new contribution to the genre: a volume attributed to G.W. Bush under the title Whoops, I Was Wrong. The pages were completely blank.
Such books change nobody’s mind, of course. The market for them is purely a function of how much enthusiasm the choir feels for the sermon being addressed to it. As it turns out, Why Truth Matters has nothing to do with the G.O.P., and everything to do with what is sometimes called the postmodern academic left -- home to cross-dressing Nietzscheans and dupes of the Sokal hoax.
Or so one gathers from the muttering of various shell-shocked Culture Warriors. Like screeds against the neocons, the diatribes contra pomo now tend to be light on data, and heavy on the indignation. (The choir does love indignation.)
Fortunately, Why Truth Matters, by Ophelia Benson and Jeremy Stangroom, is something different. As polemics go, it is short and adequately pugnacious. Yet the authors do not paint their target with too broad a brush. At heart, they are old-fashioned logical empiricists -- or, perhaps, followers of Samuel Johnson, who, upon hearing of Bishop Berkeley’s contention that the objective world does not exist, refuted the argument by kicking a rock. Still, Benson and Stangroom do recognize that there are numerous varieties of contemporary suspicion regarding the concept of truth.
They bend over backwards in search of every plausible good intention behind postmodern epistemic skepticism. And then they kick the rock.
The authors run a Web site of news and commentary, Butterflies and Wheels. And both are editors of The Philosophers’ Magazine, a quarterly journal. In the spirit of full disclosure, it bears mentioning that I write a column for the latter publication.
A fact in no way disposing me, however, to overlook a striking gap in the book’s otherwise excellent index: the lack of any entry for “truth, definition of.” When I contacted Ophelia Benson recently for an e-mail interview, that seemed like the place to start.
Q: What is truth? Is there more than one kind? If not, why not?
A: I'll just refer you to jesting Pilate, and let it go at that!
Q: Well, the gripe about jesting Pilate is that "he would not stay for the answer." Whereas I am actually going to stick around and press the point. Your book pays tribute to the human capacity for finding truth, and warns against cultural forces tending to undermine or destroy it. So what's the bottom-line criterion you have in mind for defining truth?
A: It all depends, as pedants always say, on what you mean by "truth." Sure, in a sense, there is more than one kind. There is emotional truth, for instance, which is ungainsayable and rather pointless to dispute. It is also possible and not necessarily silly to talk about somewhat fuzzy-bordered kinds such as literary truth, aesthetic truth, the truth of experience, and the like.
The kind of truth we are concerned with in the book is the fairly workaday, empirical variety that is (or should be) the goal of academic disciplines such as history and the sciences. We are concerned with pretty routine sorts of factual claim that can be either supported or rejected on the basis of evidence, and with arguments that cast doubt on that very way of proceeding.
Q: Is anybody really making a serious dent in this notion of truth? You hear all the time that the universities are full of postmodernists who think that scientific knowledge is just a Eurocentric fad, and therefore people could flap their wings and fly to the moon if they wanted. And yet you never actually see anyone doing that. At least I haven't, and I go to MLA every year.
A: Of course, there is no shortage of wild claims about what people get up to in universities. Such things make good column fodder, good talk show fodder, good gossip fodder, not to mention another round of the ever-popular game of "Who's Most Anti-Intellectual?" But there are people making some serious dents in this notion of truth and of scientific knowledge, yes. That's essentially the subject matter of Why Truth Matters: the specifics of what claims are being made, in what disciplines, using what arguments.
There are people who argue seriously that, as Sandra Harding puts it, the idea that scientific "knowers" are in principle interchangeable means that "white, Western, economically privileged, heterosexual men can produce knowledge at least as good as anyone else can" and that this appears to be an antidemocratic consequence. Harding's books are still, despite much criticism, widely assigned. There are social constructionists in sociology and philosophy of science who view social context as fully explanatory of the formation of scientific belief and knowledge, while excluding the role of evidence.
There are Afrocentric historians who make factual claims that contradict existing historical evidence, such as the claim that Aristotle stole his philosophy from the library at Alexandria when, as Mary Lefkowitz points out, that library was not built until after Aristotle's death. Lefkowitz was shocked to get no support from her colleagues when she pointed out factual errors of this kind, and even more shocked when the dean of her college (Wellesley) told her that "each of us had a different but equally valid view of history." And so on (there's a lot of the "so on" in the book).
That sort of thing of course filters out into the rest of the world, not surprisingly: People go to university and emerge having picked up the kind of thought Lefkowitz's dean had picked up; such thoughts get into newspaper columns and magazine articles; and the rest of us munch them down with our cornflakes.
We don't quite think we could fly to the moon if we tried hard enough, but we may well think there's something a bit sinister and elitist about scientific knowledge, we may well think that oppressed and marginalized groups should be allowed their own "equally valid" view of history by way of compensation, we may well think "there's no such thing as truth, really."
Q: Your book describes and responds to a considerable range of forms of thought: old-fashioned Pyrrhonian skepticism, "standpoint" epistemology, sociology of knowledge, neopragmatism, pomo, etc. Presumably not all questions about the possibility of a bedrock notion of truth are created equal. What kinds have a strong claim to serious consideration?
A: Actually, much of the range of thought we look at doesn't necessarily ask meta-questions about truth. A lot of it is more like second level or borrowed skepticism or relativism about truth, not argued so much as referenced, or simply assumed; waved at rather than defended. The truth relativism is not itself the point, it's rather a tool for the purpose of making truth-claims that are not supported by evidence or that contradict the evidence. Skepticism and relativism about truth in this context function as a kind of veil or smokescreen to obscure the way ideology shapes the truth-claims that are being made.
As a result much of the activity on the subject takes place on this more humdrum quotidian level, in between metaquestions and bedrock notions of truth, where one can ask if this map is accurate or not, if this bus schedule tells us where and when the bus really does go, if this history text contains falsifications or not, if the charges against this scholar or that tobacco company are based on sound evidence or not.
Meta-questions about truth of course do have a strong claim to serious consideration. Maybe we are brains in vats; maybe we all are, without realizing it, Keanu Reeves; there is no way to establish with certainty that we're not; thus questions on the subject do have a claim to consideration, however unresolvable they are. (At the same time, however unresolvable they are, it is noticeable that on the mundane level of this particular possible world, no one really does take them seriously; no one really does seriously doubt that fire burns or that axes chop.)
Intermediate level questions can also be serious, searching, and worth exploring. Standpoint epistemology is reasonable enough in fields where standpoints are part of the subject matter: histories of experience, of subjective views and mentalities, of oppression, for example, surely need at least to consider the subjective stance of the inquirer. Sociology of knowledge is an essential tool of inquiry into the way interests and institutions can shape research programs and findings, provided it doesn't, as a matter of principle, exclude the causative role of evidence. In short there are, to borrow a distinction of Susan Haack's, sober and inebriated versions of questions about the possibility of truth.
Q: Arguably even the most extremist forms of skepticism can have some beneficial effects -- if only indirectly, by raising the bar for what counts as a true or valid statement. (That's one thumbnail version of intellectual history, anyway: no Sextus Empiricus would mean no Descartes.) Is there any sense in which "epistemic relativism" might have some positive effect, after all?
A: Oh, sure. In fact I think it would be extremely hard to argue the opposite. And the ways in which it could have positive effects seem obvious enough. There's Mill's point about the need for contrary arguments in order to know the grounds of one's own views, for one. Our most warranted beliefs, as he says, have no safeguard to rest on other than a standing invitation to everyone to refute them.
If we know only our own side of the case, we don't know much. Matt Ridley made a related point in a comment on the Kitzmiller Intelligent Design trial for Butterflies and Wheels: "My concern ... is about scientific consensus. In this case I find it absolutely right that the overwhelming nature of the consensus should count against creationism. But there have been plenty of other times when I have been on the other side of the argument and seen what Madison called the despotism of the majority as a bad argument.... I agree with the scientific consensus sometimes but not always, but I do not do so because it is a consensus. Science does not work that way or Newton, Harvey, Darwin and Wegener would all have been voted into oblivion."
Another way epistemic relativism may be of value is that it is one source (one of many) of insight into what it is that some people dislike and distrust about science and reason. In a way it's a silly argument to say that science is elitist or undemocratic, since it is of course the paradigmatic case of the career open to talent. But in another way it isn't silly at all, because as Michael Young pointed out in the '50s, meritocracy has some harsh side-effects, such as erosion of the sense of self-worth of the putative less talented. Epistemic relativism may function partly as a reminder of that.
The arguments of epistemic relativism may be unconvincing, but some of the unhappiness that prompts the arguments may be more worth taking seriously. However one then has to weigh those effects against the effects of pervasive suspicion of science and reason, and one grows pale with fear. At a time when there are so many theocrats and refugees from the reality-based community on the loose, epistemic relativism doesn't seem to need more encouragement than it already has.
Whether ‘tis nobler to plunge in and write a few Wikipedia entries on subjects regarding which one has some expertise; and also, p'raps, to revise some of the weaker articles already available there...
Or rather, taking arms against a sea of mediocrity, to mock the whole concept of an open-source, online encyclopedia -- that bastard spawn of “American Idol” and a sixth grader’s report copied word-for-word from the World Book....
Hamlet, of course, was nothing if not ambivalent -- and my attitude towards how to deal with Wikipedia is comparably indecisive. Six years into its existence, Wikipedia contains something in the neighborhood of 2 million entries, in various languages, ranging in length from one sentence to thousands of words.
They are prepared and edited by an ad hoc community of contributors. There is no definitive iteration of a Wikipedia article: It can be added to, revised, or completely rewritten by anyone who cares to take the time.
Strictly speaking, not all wiki pages are Wikipedia entries. As this useful item explains, a wiki is a generic term applying to a Web page format that is more or less open to interaction and revision. In some cases, access to the page is limited to the members of a wiki community. With Wikipedia, only a very modest level of control is exercised by administrators. The result is a wiki-based reference tool that is open to writers putting forward truth, falsehood, and all the shades of gray in between.
In other words, each entry is just as trustworthy as whoever last worked on it. And because items are unsigned, the very notion of accountability is digitized out of existence.
Yet Wikipedia now seems even more unavoidable than it is unreliable. Do a search for any given subject, and chances are good that one or more Wikipedia articles will be among the top results you get back.
Nor is use of Wikipedia limited to people who lack other information resources. My own experience is probably more common than anyone would care to admit. I have a personal library of several thousand volumes (including a range of both generalist and specialist reference books) and live in a city that is home to at least three universities with open-stack collections. And that’s not counting access to the Library of Congress.
The expression “data out the wazoo” may apply. Still, rare is the week when I don’t glance over at least half a dozen articles from Wikipedia. (As someone once said about the comic strip “Nancy,” reading it usually takes less time than deciding not to do so.)
Basic cognitive literacy includes the ability to evaluate the strengths and the limitations of any source of information. Wikipedia is usually worth consulting simply for the references at the end of an article -- often with links to other online resources. Wikipedia is by no means a definitive reference work, but it’s not necessarily the worst place to start.
Not that everyone uses it that way, of course. Consider a recent discussion between a reference librarian and a staff member working for an important policy-making arm of the U.S. government. The librarian asked what information sources the staffer relied on most often for her work. Without hesitation, she answered: “Google and Wikipedia.” In fact, she seldom used anything else.
Coming from a junior-high student, this would be disappointing. From someone in a position of power, it is well beyond worrisome. But what is there to do about it? Apart, that is, from indulging in Menckenesque ruminations about the mule-like stupidity of the American booboisie?
Sure, we want our students, readers, and fellow citizens to become more astute in their use of the available tools for learning about the world. (Hope springs eternal!) But what is to be done in the meantime?
Given the situation at hand, what is the responsibility of people who do have some level of competence? Is there some obligation to prepare adequate Wikipedia entries?
Or is that a waste of time and effort? If so, what’s the alternative? Or is there one? Luddism is sometimes a temptation – but, as solutions go, not so practical.
I throw these questions out without having yet formulated a cohesive (let alone cogent) answer to any of them. At one level, it is a matter for personal judgment. An economic matter, even. You have to decide whether improving this one element of public life is a good use of your resources.
At the same time, it’s worth keeping in mind that Wikipedia is not just one more new gizmo arriving on the scene. It is not just another way to shrink the American attention span that much closer to the duration of a subatomic particle. How you relate to it (whether you chip in, or rail against it) is even, arguably, a matter of long-term historical consequence. For in a way, Wikipedia is now 70 years old.
It was in 1936 that H.G. Wells, during a lecture in London, began presenting the case for what he called a “world encyclopedia” – an international project to synthesize and make readily available the latest scientific and scholarly work in all fields. Copies would be made available all over the planet. To keep pace with the constant growth of knowledge, it would be revised and updated constantly. (An essay on the same theme that Wells published the following year is available online.)
A project on this scale would be too vast for publication in the old-fashioned format of the printed book. Besides, whole sections of the work would be rewritten frequently. And so Wells came up with an elegant solution. The world encyclopedia would be published and distributed using a technological development little-known to his readers: microfilm.
Okay, so there was that slight gap between the Wellsian conception and the Wikipedian consummation. But the ambition is quite similar -- the creation of “the largest encyclopedia in history, both in terms of breadth and depth” (as the FAQ describes Wikipedia’s goal).
Yet there are differences that go beyond the delivery system. Wells believed in expertise. He had a firm faith in the value of exact knowledge, and saw an important role for the highly educated in creating the future. Indeed, that is something of an understatement: Wells had a penchant for creating utopian scenarios in which the best and the brightest organized themselves to take the reins of progress and guide human evolution to a new level.
Sometimes that vision took more or less salutary forms. After the First World War, he coined a once-famous saying that our future was a race between education and disaster. In other moods, he was prone to imagining the benefits of quasi-dictatorial rule by the gifted. What makes Wells a fascinating writer, rather than just a somewhat scary one, is that he also had a streak of fierce pessimism about whether his projections would work out. His final book, published a few months before his death in 1946, was a depressing little volume called The Mind at the End of Its Tether, which was a study in pure worry.
The title Wells gave to his encyclopedia project is revealing: when he pulled his various essays on the topic together into a book, he called it World Brain. The researchers and writers he imagined pooling their resources would be the faculty of a kind of super-university, with the globe as its campus. But it would do even more than that. The cooperative effort would effectively mean that humanity became a single gigantic organism -- with a brain to match.
You don’t find any of Wells’s meritocracy at work in Wikipedia. There is no benchmark for quality. It is an intellectual equivalent of the Wild West, without the cows or the gold.
And yet, strangely enough, you find imagery very similar to that of Wells’s “world brain” emerging in some of the more enthusiastic claims for Wikipedia. As the computer scientist Jaron Lanier noted in a recent essay, there is now an emergent sensibility he calls “a new online collectivism” – one for which “something like a distinct kin to human consciousness is either about to appear any minute, or has already appeared.” (Lanier offers a sharp criticism of this outlook. See also the thoughtful responses to his essay assembled by John Brockman.)
From the “online collectivist” perspective, the failings of any given Wikipedia entry are insignificant. “A core belief in the wiki world,” writes Lanier, “is that whatever problems exist in the wiki will be incrementally corrected as the process unfolds.”
The problem being, of course, that it does not always work out that way. In 2004, Robert McHenry, the former editor-in-chief of the Encyclopedia Britannica, pointed out that, even after 150 edits, the Wikipedia entry on Alexander Hamilton would earn a high school student a C at best.
“The earlier versions of the article,” he noted, “are better written over all, with fewer murky passages and sophomoric summaries.... The article has, in fact, been edited into mediocrity.”
It is not simply proof of the old adage that too many cooks will spoil the broth. “However closely a Wikipedia article may at some point in its life attain to reliability,” as McHenry puts it, “it is forever open to the uninformed or semiliterate meddler.”
The advantage of Wikipedia’s extreme openness is that people are able to produce fantastically thorough entries on topics far off the beaten path. The wiki format creates the necessary conditions for nerd utopia. As a fan of the new “reimagined” "Battlestar Galactica," I cannot overstate my awe at the fan-generated Web site devoted to the show. Participants have created a sort of mini-encyclopedia covering all aspects of the program, with a degree of thoroughness and attention to accuracy matched by few entries at Wikipedia proper.
At the same time, Wikipedia is not necessarily less reliable than more prestigious reference works. A study appearing in the journal Nature found that Wikipedia entries on scientific topics were about as accurate as corresponding articles in the Encyclopedia Britannica.
And in any case, the preparation of reference works often resembles a sausage factory more than it does a research facility. As the British writer Joseph McCabe pointed out more than 50 years ago in a critique of the Columbia Encyclopedia, the usual procedure is less meritocratic than one might suppose. “A number of real experts are paid handsomely to write and sign lengthy articles on subjects of which they are masters,” noted McCabe, “and the bulk of the work is copied from earlier encyclopedias by a large number of penny-a-liners.”
Nobody writing for Wikipedia is “paid handsomely,” of course. For that matter, nobody is making a penny a line. The problems with it are admitted even by fans like David Shariatmadari, whose recent article on Wikipedia ended with an appeal to potential encyclopedists “to get your ideas together, get registered, and contribute.”
Well, okay ... maybe. I’ll think about it at least. There’s still something appealing about Wells’s vision of bringing people together “into a more and more conscious co-operating unity and a growing sense of their own dignity” – through a “common medium of expression” capable of “informing without pressure or propaganda, directing without tyranny.”
If only we could do this without all the semi-mystical globaloney (then and now) about the World Brain. It would also be encouraging if there were a way around certain problems -- if, say, one could be sure that different dates wouldn’t be given for the year that Alexander Hamilton ended his term as Secretary of the Treasury.
Late last week the Association of American University Presses held its annual meeting in New Orleans, or in what was left of it. Attendance is usually around 700 when the conference is held in an East Coast city. This time, just over 500 people attended, representing more than 80 presses -- a respectable turnout, in other words, justifying the organizers’ difficult decision last fall not to change the location.
Inside the Sheraton Hotel itself, each day felt like a normal visit to Conference Land -- that well-appointed and smoothly functioning world where academic or business people (or both, in this case) can focus on the issues that bring them together. Stepping just outside, you were in the French Quarter. It wasn’t hit especially hard by last year’s catastrophic weather event. But there were empty buildings and boarded-up windows; the tourist-trap souvenir outlets offered a range of Katrina- and FEMA-themed apparel, with “Fixed Everything My Ass” being perhaps the most genteel message on sale.
The streets were not empty, but the place felt devitalized, even so. Only when you went outside the Quarter did the full extent of the remaining damage to the city really begin to sink in. On Thursday morning -- as the first wave of conference goers began to register -- a bus chartered by the association took a couple dozen of us around for a tour led by Michael Mizell-Nelson and Greta Gladney (a professor and a graduate student, respectively, at the University of New Orleans). If the Quarter was bruised, the Ninth Ward was mangled.
It was overwhelming -- too much to take in. More imagery and testimony are available from the Hurricane Digital Memory Bank, a project sponsored by the University of New Orleans and the Center for History and New Media at George Mason University.
So you came back a little unsettled at the prospect of discussing business as usual. Then again, the prevailing idea at this year’s AAUP was that business has changed, and that university presses are rushing to catch up. The announced theme of the year’s program was “Transformational Publishing” -- with that titular buzzword covering the myriad ways that digital technologies affect the way we read now.
It was a far cry from the dismal slogan making the rounds at the AAUP meeting three years ago: “Flat is the new ‘up.’ ” In other words: If sales haven’t actually gone down, you are doing as well as can be expected. The cumulative effect of increasing production costs, budget cuts, and reduced library sales was a crisis in scholarly publishing. The lists of new titles got shorter, and staffs grew leaner; in a few cases, presses closed up shop.
I asked Peter Givler, the association’s executive director, if anyone was still using the old catch phrase. “Right now it looks like up is the new up,” he said. “It’s been a modest improvement, and we’re hearing from our members that there’s been a large return of books this spring. But it’s not like the slump that started in 2001.”
Cautious optimism, then, not irrational exuberance. While the word “digital” and its variants appeared in the title of many a session, it is clear that new media can be both a blessing and a curse. On the one hand, the association has been able to increase the visibility of its members’ output through the Books for Understanding Web site, which offers a convenient and reliable guide to academic titles on topics of public interest. (See, for example, this page on New Orleans.) At the same time, the market for university-press titles used in courses has been undercut by the ready availability of secondhand books online.
And then there’s Google Book Search. The AAUP has not joined the Authors Guild’s class action suit against Google for digitizing copyrighted materials. But university presses belong to the class of those with an interest in the case -- so the organization has incurred legal expenses while monitoring developments on behalf of its members. One got the definite impression that the other shoe may yet drop in this matter. During the business meeting, Givler indicated that the association would be undertaking a major action soon that would place additional demands on the organization's resources. I tried to find out more, but evidently its Board of Directors is playing its cards close to the vest for now.
With new obligations to meet, the board requested a 4 percent increase in membership dues. This was approved during the business meeting on Thursday. (Three members voting by proxy were opposed to it, but no criticism was expressed from the floor during the meeting itself.)
Proposals for longer-term changes in the organization’s structure and mission were codified in its new Strategic Plan (the first updating of the document since 1999). A working draft was distributed for discussion at the conference; the final version will be approved by the board in October.
This document -- not now available online -- conveys a very clear sense of the opportunities now open before university presses. (For “opportunities,” read also “stresses and strains.”)
It’s not just that technological developments are shaping how books get printed, publicized, and sold -- or even how we do research. A variety of new forms of scholarly publishing are emerging -- some of which make an end run around traditional university presses. “Societies, libraries, and other scholarly groups are now more likely to undertake publishing ventures themselves,” the proposal notes, “although they often lack the editing, marketing, and business skills found in abundance in university presses.”
Full membership in AAUP is restricted to presses that meet certain criteria, including “a faculty board that certifies the scholarly quality of the publications; a minimum number of publications per year; a minimum number of university-employed staff including a full-time director; and a statement of support from the parent organization.” But an ever larger number of learned publications -- print, digital, or whatever -- are issued by academic or professional enterprises that don’t follow this well-established model.
Indeed, if you hang around younger scholars long enough, it is only a matter of time before someone begins pointing out that the old model might be jettisoned entirely. Why spend two years waiting for your monograph to appear from Miskatonic University Press when it might be made available in a fraction of that time through some combination of new media, peer review, and print-on-demand? No one broached such utopian ideas at AAUP (where, of course, they would be viewed as dystopian). But they certainly do get mooted. Sometimes synergy is not your friend.
The organization’s new strategic plan calls for reaching out to “nonprofit scholarly publishers and organizations whose interests and goals are compatible with AAUP” -- in part, by revising the membership categories and increasing the range of benefits. New members would be recruited through an introductory membership offer “open to small nonprofit publishers.”
These changes, if approved, will go into effect in July 2007. Apart from increasing the size of the association, they would bring in revenue -- thereby funding publicity, outreach, and professional-education programs. (One of the projects listed as “contemplated” is creation of “a ‘basic book camp’ to orient new and junior staff to working at a scholarly press.” I do like the sound of that.)
For the longer term, the intent is clearly to shore up the role of the university press’s established standards in an environment that seems increasingly prone to blowing them away.
“University presses,” the AAUP plan stresses, “are well positioned to be among the leaders in the academic community who help universities through a confusing and expensive new world. They can enhance the ability of scholars to research, add value to, and share their work with the broadest possible audiences, and they can help to develop intellectual property policies and behaviors sensible to all.”
Of course, not every discussion at the meeting was geared to the huge challenges of the not-too-distant future. Late Friday afternoon, I went to an interesting session called “Smoke, Mirrors, and Duct Tape: Nurturing a Small Press at a Major University.” It was a chance to discuss the problems that go with being a retro-style academic imprint at an institution where, say, people assume you are the campus print shop. (Or, worse, that you have some moral obligation to publish the memoirs of emeritus faculty.)
It was the rare case of an hour I spent in New Orleans without hearing any variation on the word “digital.” After getting home, I contacted one of the participants, Joseph Parsons, an acquisitions editor for the University of Iowa Press, to ask if that was just an oversight. Had academic digitality hit Iowa?
"We routinely deal with electronic files, of course," he responded, "but the books we produce have been of the old-fashioned paper and ink variety.... When we contract with authors, we typically include digital rights as part of the standard agreement, but we haven't published anything suitable for an electronic book reader."
He went on to mention, however, that print-on-demand was a ubiquitous and very reasonable option for small press runs. It was surprising that he made the point -- for a couple of reasons. POD now seems like an almost antique form of "new media" in the age of Web 2.0. I don't recall hearing it discussed in New Orleans, for example, except in passing. At the same time, it clearly fit into the plans of an old-school university press with a catalog emphasizing literature and some of the less trend-obsessed quadrants of the humanities. It seemed like a reasonable compromise between sticking with what you already know and making a leap into the digital divide.
Anyway, I'm just glad to think there will continue to be books, at least for a while. As a matter of fact, while in New Orleans, I even bought a few. They were second-hand, admittedly, but it seemed as if the shop owner needed the business more than any of the university presses did.