Killing Peer Review

Can the social Web produce a "killer app" that would do away with the traditional editorial process at scholarly journals?

When a Journal Says No

Wendy Laura Belcher reviews your options on how to proceed.

Perishing Without Publishing

Rob Weir on what not to do if you want to see your paper published.

30 Writing Tips

Curtis J. Bonk offers advice for the start of an academic career.

Egghead Headshots

In the early 1970s, a French publisher issued a sort of photo album devoted to Jean-Paul Sartre, then the most famous philosopher in the world. He had been for some while, so the photojournalistic dossier on him was quite full. The book shows him alongside equally famous figures from the world stage -- Camus and Castro, for example, and Simone de Beauvoir, of course. You also see him in the midst of dramatic events, as when he addressed an assembly of revolutionary students during May ’68. A few images catch the philosopher in a less public capacity. As I recall, there is a baby portrait or two. There are also pictures of the Sartrean babes, who seemed to get younger as he got older.

The man was a philosophical action figure, to be sure. But my favorite pages in the book show him at his desk, with manuscripts piled up precariously nearby, or at a café table, scribbling away. Sartre once said that he felt like a machine while working on The Critique of Dialectical Reason, grinding out each day’s quota of concepts. And that’s what’s happening in those photographs of him with pen in hand and tobacco pipe in jaw -- tuning out everything else but the hard work of philosophizing. But who knows? A photograph cannot document thought. It’s entirely possible that Sartre was updating his schedule to accommodate a new girlfriend, rather than analyzing Stalinism.

The same brain did both -- a fact that lends itself to philosophical inquiry. Just where do you draw the line between task-oriented thinking and whatever it is philosophers do while they are “doing philosophy”? It is a conundrum.

In his new book Philosophers, from Oxford University Press, the New Yorker photographer Steve Pyke assembles a portrait gallery of contemporary thinkers. It embodies a conundrum or two of its own -- beginning with the title. In 1995, the British publisher Zelda Cheatle Press issued a collection of Pyke’s photographs that was also called Philosophers, which now fetches a high price from secondhand dealers. These are, it bears stressing, completely distinct books. All but one of the pictures in the new collection were taken over the past decade. Only two images from the earlier volume appear in the new one -- in the introductory pages, separate from the hundred portraits making up the main body of the book.

So we have, in other words, two volumes of the same kind, on the same subject, by the same author. They bear the same title. And yet they are not identical. A teachable moment in metaphysics? Yes, but one with practical implications for the used-book trade: a certain percentage of people trying to buy the older volume online will end up getting really, really irritated.

The book from Oxford is quite handsome. And its status as an aesthetic object is not a minor consideration. (For that matter, its aesthetics as a status object are also pretty demanding. It feels like you should get a nicer coffee table, just to have someplace to put it.) Without going so far as to say that Pyke represents philosophers as a subcategory of the beautiful people, he certainly renders them in beautiful black and white.

Ethnography forms no part of what he has in mind: his photographs do not show subjects going about their daily routines or occupying their usual niches. It’s difficult to think of Sartre without picturing him in certain settings -- bars, cafés, lecture halls, etc. Furthermore, these places aren’t just elements of his biography; they figure into his work (the waiter in Being and Nothingness is an obvious example). Pyke’s philosophers, by contrast, hang in the void. Usually they are set against a solid black backdrop. The one conspicuous exception is the portrait of Michael Friedman, with an unreadable chalkboard diagram behind him. Their heads loom like planets in the depths of space. The camera registers the texture of skin and hair, the expression on the lips and in the eyes. Scarcely anything else enters the frame -- an earring, perhaps, or the neck of a sweater. Most of the subjects look right into the camera, or just to the side.

With Pyke, the thinker becomes, simply, a face. The effect is intimate, but also strangely abstract. The place and date of each photo session are indicated, but the book provides no biographical information about the subjects. I recognized about a quarter of them off the top of my head -- Robert Brandom, David Chalmers, Patricia Churchland, Arthur Danto, Sydney Morgenbesser, Richard Rorty. A couple are even on TV from time to time; both Harry Frankfurt and Bernard-Henri Levy have been on "The Daily Show." Most of the portraits are accompanied by very brief remarks by the subjects on the nature or motivation of their work, but it is puzzling that two or three pages could not be found to list a couple of books by each figure.

“Philosophy is the way we have of reinventing ourselves,” says Sydney Morgenbesser. Ruth Millikan quotes Wilfrid Sellars from Science, Perception, and Reality: “The aim of philosophy, abstractly formulated, is to understand how things in the broadest possible sense of the term hang together in the broadest possible sense of the term.” Fortunately not everyone is so gnomic. The comments by Jerry Fodor seem the funniest: “To the best of my recollection, I became a philosopher because my parents wanted me to become a lawyer. It seems to me, in retrospect, that there was much to be said for their suggestion. On the other hand, many philosophers are quite good company; the arguments they use are generally better than the ones that lawyers use; and we do get to go to as many faculty meetings as we like at no extra charge.”

The ambivalence in Sally Haslanger’s statement felt more than vaguely familiar: “Given the amount of suffering and injustice in the world, I flip-flop between thinking that doing philosophy is a complete luxury and that it is an absolute necessity. The idea that it is something in between strikes me as a dodge. So I do it in the hope that it is a contribution, and with the fear that I’m just being self-indulgent. I suppose these are the moral risks life is made of.” That sounds quite a bit like Sartre, actually.

In the interview prefacing the collection, Pyke says that his intention is to make philosophers “seem more human, less of a mystery.” And that is where the true conundrum lies. Some philosophers look dyspeptic, while others have goofy smiles, but that isn’t what makes them human -- let alone philosophers. Making something “more human” precludes rendering it “less of a mystery,” since the human capacity for thought is itself an ever-deepening mystery.

Pyke thinks visually. A more interesting commentary on the figures in his portrait gallery might come indirectly, from the late Gilbert Ryle. An Oxford don and the author of The Concept of Mind, he gave a lecture that tried to sort out the relationship between deep cogitation and various other sorts of mental activity. To that end, he focused on the question of what that naked guy in Rodin's sculpture was doing -- and how it presumably differed from, say, a professor preparing to teach a class.

“The teacher has already mastered what he wants his students to master,” said Ryle. “He can guide them because he is on his own ground. But le Penseur is on ground unexplored by himself, and perhaps unexplored by anyone. He cannot guide himself through this jungle. He has to find his way without guidance from anyone who already knows it, if anyone does know it…. The teacher is a sighted leader of the blind, where le Penseur is a blind leader of the blind -- if indeed the very idea of his being or having a leader fits at all.”

That seems like a good description of what the subjects of Pyke's photographs spend their time doing. Not, of course, while the camera is turned on them. To judge by the expressions of some, their thoughts may have been something closer to, "Wow, I'm being photographed by someone from The New Yorker. How did that happen?"

Scott McLemee

Cafeteria Style

"You're too young to know about the cafeterias," said Julius Jacobson.

"The cafeterias were wonderful," said Phyllis Jacobson. "There's nothing like them today."

"The cafeterias and the automats were the center of New York intellectual life back then," they continued. Each one finishing the other's thought, as old couples often will. "You'd buy a sandwich or a piece of pie, both if you could afford it, but what you really went there to do was talk."

They talked. And I listened, hoping, as ever, to be transported into their past, at least for a while.

Phyllis and Julius had met as teenagers in the Young People's Socialist League during the late 1930s. They married after the war. From the late 1940s onward, they worked on one small Marxist theoretical publication or another. They were public intellectuals long before anyone thought to coin that phrase, embodying a tradition of working-class self-education that was both non-academic and passionate about high culture. (Their devotion to the New York Review of Books bordered on the idolatrous, despite that publication's constant failure to adopt a suitably Jacobsonite political line.)

An old comrade of theirs once told me that, as a merchant seaman during World War II, he had been attracted to the Jacobsons' group -- a small organization known as the Workers Party -- because its members read better novels than the Communists did. Being a revolutionary didn't mean you should wallow in mass culture. About 10 years ago, when I published some articles about recent television programs, Phyllis gave me a stern talking-to by telephone.

"Don't waste your time on popular culture," she said. "You need to write about serious things, philosophy and literature, not this trash." (Memory may be playing tricks, but I'd swear I could hear a Benny Goodman album playing in the background, on her end of the telephone line. Evidently not all pop culture was junk.)

The second anniversary of Julie's death is coming soon, and almost five years have passed since Phyllis had a stroke that left her unable to speak. New Politics -- the journal they edited in the 1960s and ’70s, then revived in 1986 -- still struggles along, even without the two of them at the helm. It is probably a matter of time before some academic publisher takes over its production. That outcome is preferable to oblivion, of course, but it does seem at odds with the ethos of its founders.

We met in 1990. By coincidence, that was just about the time I started attending scholarly conferences. The contrast in demeanor and sensibility between the conversations in their living room and what I saw at those other gatherings was remarkable.

P&J (as one came to think of them) were argumentative, plain-spoken, and averse to the gestures meant to announce that one is (ahem!) a qualified expert. That hardly meant condoning intellectual sloppiness. They loved expertise, but not rigamarole. A manuscript by an academic on an interesting topic was always a source of pleasure to them. Above all else, P&J believed in the educated general public. That notion was essential to their version of left-wing politics. The thought that you could be both “subversive” and incomprehensible to 90 percent of the audience made them laugh, not quite with joy.

It was P&J who explained an odd Yiddish idiom that I had come across: “to chop a tea kettle.” The image was puzzling. Why would anyone take an axe to a tea kettle? It seemed like a pointless thing to do. Which was exactly the point. “It means,” they said, “that a person makes a lot of noise without accomplishing anything.” (Perhaps it would be discreet not to mention just what examples we then discussed.)

How often that expression came to mind, in later years, as I sat in the audience for panels on “Postmodern This,” “Decentering That,” and “The Transgressive Potential of the Other Thing.” So many edgy theoretical axes! So many kettles, dented beyond all use.

At conferences, scholars would stand up and read their papers, one by one. Then the audience would “ask questions,” as the exercise is formally called. What that often meant, in practice, was people standing up to deliver short lectures on the papers they would have liked to have heard, instead -- and presumably would have delivered, had they been invited.

Hypothetically, if the members of a panel read one another’s papers beforehand, they might be able to get some lively cross-talk going. This does happen in some of the social sciences, but it seems never to occur among humanities scholars. The whole process seems curiously formal, and utterly divorced from any intent to communicate. A routine exercise -- or rather, perhaps, an exercise in routinism. A process streamlined into grim efficiency, yielding one more line on the scholar’s vita.

Is this unfair? No doubt it is. Over the years, I have heard some excellent and exciting papers at conferences. There have been whole sessions when everyone in the room was awake, and not just in the technical sense. But such occasions are the happy exceptions to the norm.

The inner dynamic of these gatherings is peculiar, but not especially difficult to understand. They are extremely well-regulated versions of what Erving Goffman called “face work” -- an “interaction ritual” through which people lay claim to a given social identity. Thanks to the steady and perhaps irreversible drive to “professionalization,” the obligation to perform that ritual now comes very early in a scholar’s career.

And so the implicit content of many a conference paper is not, as one might think, “Here is my research.” Rather, it is: “Here am I, qualified and capable, performing this role, which all of us here share, and none of us want to question too closely. So let’s get it over with, then go out for a drink afterwards.”

With Phyllis and Julius, as with others of their generation and cohort, the ebb and flow of discourse was very different. It is not that they had no Goffmanian interaction rituals, but the rituals were different. The cafeteria had imposed its own style. The frantic pace of defining one’s area of specialization, acquiring the proper credentials, and passing through an obligatory series of disciplinary enactments of competence (aka “conferences”) -- and doing all this, preferably, in one’s 20s -- would have been utterly out of place, over pie.

Instead, the cafeteria fostered a style in which the tone of authority had to be assumed with some care. There was always someone nearby, waiting to ambush you with an unfamiliar fact, a sarcastic characterization of your argument, a book he had just carried over from the library with the purpose of shutting you up for good, or at least for the rest of the afternoon. (“Now where is it you say Lenin wrote that? It sure isn’t here!”) You had to think on your feet, to see around the corner of your own argument. And if you were smart, you knew to make a point quickly, cleanly, punching it home. The Jacobsons introduced me to a valuable expression, one that neatly characterizes the opening moves of many an academic text: “throat-clearing noises.”

Now, it’s best not to sentimentalize the cafeteria and its circumstances, at least not too much. In the 1930s and ’40s, smart people didn’t loiter with intent to argue just because they enjoyed the prospect of constituting a “free floating intelligentsia.” They were there for economic reasons. The food was cheap, the jobs were scarce. Academe was nothing like the factor in the nation’s economic life that it is today, and few saw a career there as an option. The hiring of Lionel Trilling in the English department at Columbia in 1932 had provoked concern among the faculty; he was, after all, as someone put it, “a Marxist, a Freudian, and a Jew.” If you had a name like Jacobson, you knew the cards were stacked against you.

Nor was the discursive style of the cafeteria intelligentsia all brilliant rhetorical fireworks and dialectical knife-juggling. I suspect that, after a while, the arguments and positions began to congeal and harden, becoming all too familiar. And the familiar gambit of “you lack the theoretical sophistication to follow my argument” seems to have had its place in cafeteria combat.

One faction in the Jacobsons’ circle insisted that you had to study German philosophy to understand anything at all about Marx’s economics. Fifty years later, P&J still sounded exasperated at the memory. “These kids could barely read,” Phyllis said, “and they’d be lugging Hegel around.”

So maybe a paradise of the unfettered mind it wasn’t. Still, in reading academic blogs over the past couple of years, I’ve often wondered if something like the old style might not be rousing itself from the dustbin of history.

For one thing, important preconditions have reemerged -- namely, the oversupply of intellectual labor relative to adequate employment opportunities. The number of people possessing extremely sophisticated tools in the creation, analysis, and use of knowledge far exceeds the academic channels able to absorb them.

Furthermore, the self-sustaining (indeed, self-reinforcing) regime of scholarly professionalization may be just a little too successful to survive. Any highly developed bureaucracy imposes its own rules and outlook on those who operate within it. But people long subjected to that system are bound to crave release from its strictures.

For every scholar wondering how to make blogging an institutionally accredited form of professional activity, there must be several entertaining the vague hopes that it never will.

The deeper problem, perhaps, is the one summed up very precisely in a note from a friend that arrived just as I finished writing this: “Do you think there’s any way that intellectual life in America could become less lonely?”

I jot these thoughts down, wondering what Phyllis and Julius would make of them -- a question that darkens many an otherwise happy moment, nowadays. One thing seems certain: P&J would want to argue.

“Blogs are nothing like the cafeteria,” they might say. “Well, maybe a little, but not that much. Go ahead though, Scott. Go ahead and chop that kettle.”

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Free Refills

Any day now, I should get some business cards from the Inside Higher Ed headquarters, announcing to the world -- or at least to anyone who asks for one -- my new position, which is "Essayist at Large."

It is a title with a certain pleasing vagueness of mandate. I feel a bit like Diogenes Teufelsdrockh, the (imaginary) German philosopher portrayed in Thomas Carlyle's satirical book Sartor Resartus, who held the rank of Professor of Things in General.

The plan is for this column to push intellectual generalism as hard as it will go. Intellectual Affairs will be a forum for discussing academic books (old and new) and scholarly journals (ditto). I'll track down dissertations and conference papers that deserve a larger audience, and report on what's happening in the world of think tanks, humanities centers, literary quarterlies, and online e-zines. Nor, of course, will we neglect the terrain known as the blogosphere -- that agonistic realm routinely combining the best qualities of the academic seminar with the worst traits of talk radio.

The sudden shift from "I" to "we" in that last sentence was no accident. I am counting on eagle-eyed readers to point out things meriting attention. The essay form is at its most interesting when it becomes "polyphonic," as the Soviet-era cultural theorist Mikhail Bakhtin put it -- a space in which a number of voices coincide and enter dialogue.

To be sure, this column will provide its share of what people at newspapers sometimes call "thumbsucking." (Journalism, like scholarship, has its own jargon.) As the novelist and critic Wilfred Sheed once defined it, a thumbsucker is an essay "presenting no new information, but merely revealing the beauty of the writer's mind."

Well, ouch. But fair enough. A little of that sort of thing goes a long way, though. So this space will remain as open as possible to the "blooming, buzzing confusion" of contemporary intellectual life. For one thing, there will be interviews with scholars, professors, and/or thoughtful writers. (Those categories overlap, but are not quite identical.) And your thoughts on what is afoot in your field are also welcome. I promise to read e-mail between trips to the Library of Congress and occasional bouts of navel-gazing.

As Carlyle recounts it, the city fathers of Weissnichtwo invited Teufelsdrockh to join the faculty of their newly opened university because they felt that "in times like ours -- the topsy-turvy early 19th century -- all things are, rapidly or slowly, resolving themselves into Chaos." The interdisciplinary field of Allerley-Wissenschaft (the Science of Things in General) might help set the world straight.

Unfortunately, "they had only established the Professorship, [not] endowed it." And so students didn't see much of him -- except at the coffeehouse, where "he sat reading Journals; sometimes contemplatively looking into the clouds of his tobacco-pipe, without other visible employment."

When, at long last, Teufelsdrockh published his great philosophical treatise, it was "a mixture of insight [and] inspiration, with dullness, double-vision, and even utter blindness."

The previous owner of my copy of Sartor Resartus underlined this passage, and scribbled a note in the margin wondering if it might have been a source for the title of Paul de Man's seminal 1971 volume of essays on literary theory, Blindness and Insight -- published nearly 140 years after Carlyle's satire first appeared.

An interesting conjecture, hereby commended to the attention of experts.

Rereading that passage just now, however, I faced a more pressing question. Will this column provide the right ratio of insight and inspiration to "dullness, double-vision, and even utter blindness"?

Well, ouch again. But then, such are the risks one takes in practicing Allerley-Wissenschaft. See you again on Thursday. In the meantime, I'll be at the coffeehouse, either thinking deep thoughts or just staring off into space. Fortunately the refills are free.

Scott McLemee

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Defending Derrida

On Sunday, about 200 people crowded into the Jacob Burns Moot Court of the Cardozo School of Law in New York City to speak of Jacques Derrida -- a.k.a. "Jackie" and "JD" -- at a conference called "Derrida/America: The Present State of America's Europe." The throng was down to a third of that size by Monday morning. Maybe everyone else went off to see "The Gates," Christo's installation of saffron banners running around Central Park. The installation wouldn't last forever, while the job of sorting out the legacy of deconstruction might take a while.

Certainly the dominant note of the event (a gathering "in celebration and mourning," as a few speakers put it) was to insist that Derrida's work deserved more serious notice than it had received in the American press following his death in September. In welcoming the audience, Peter Goodrich, a professor of law at Cardozo, noted that people who were "unimpeded by any knowledge of what they're talking about" evidently felt an especially passionate urge to denounce Derrida. Although no speaker mentioned it as such, the most egregious example was undoubtedly the obituary in The New York Times -- a tour de force of malice and intellectual laziness, by someone whose entire knowledge of Derrida's work appeared to have been gleaned from reading the back of a video box for the Woody Allen film Deconstructing Harry.

But the problem is not simply with the American public at large. "There is something I've wanted to say in public for some time," announced Simon Critchley, a professor of philosophy at New School University. "The treatment of Derrida by philosophers in the Anglophone world was shameful. They weren't stupid. They knew better. They hadn't read Derrida, and they knew they hadn't. But philistinism -- combined with envy at Derrida for being smart, charismatic, good looking, and a snappy dresser -- made them behave in a way that was, there is no other word for it, shameful."

The crowd applauded. "Now I feel better," he said.

Posthumous compliments for Derrida, and cathartic insults for his enemies, were only a small part of the program. Speakers came back repeatedly to "Force of Law: The 'Mystical Foundation of Authority' " -- a lecture on the complex and contradictory relationship between law and justice that Derrida gave in 1989, at a colloquium called "Deconstruction and the Possibility of Justice," held at Cardozo, the law school of Yeshiva University. Derrida's paper is now most readily available in Acts of Religion, a collection of his writings published by Routledge.

Among the scores of books and essays that Derrida published over the final 15 years of his life, "Force of Law" looms as one of the most important. In 2003, not long before he was diagnosed with pancreatic cancer, Derrida published a book on the possibility of global democracy called Rogues, just released in an English translation from Stanford University Press. Many of its themes were anticipated in the Cardozo lecture, making "Force of Law" almost a prerequisite to understanding Derrida's final book. (Or so I figured out the hard way, a few months ago, by reading Rogues first.)

"What is currently called deconstruction," said Derrida in 1989, "would not at all correspond (though certain people have an interest in spreading this confusion) to a quasi-nihilistic abdication before the ethico-politico-juridical question of justice and before the opposition between just and unjust...."

His goal, in effect, is to point to a notion of justice that would be higher than any given code of laws. Likewise, in other late writings, Derrida seeks to define a notion of forgiveness that would be able to grapple with the unforgivable. And, he asks, might it be the case that Levantine traditions of hospitality (of welcoming the Other into one's home) transcend more modern conceptions of ethics?

For someone constantly accused of relativism, Derrida often sounds in these late works like a man haunted by the absolute. There is a sense in which, although he was an atheist, he practiced what a medieval scholar might have recognized as "negative theology" -- an effort to define the nature of God by cutting away all the misleading conceptions imposed by the limits of human understanding.

The implications were political, at least in some very abstract sense. In his keynote talk at the American Academy of Religion in 2002, Derrida proposed a notion of God that, in effect, utterly capsized the familiar world of monotheism by stripping it of all our usual understandings of divine authority. Suppose God were not the all powerful king of the universe (the image that even an atheist is prone to imagine upon hearing the name "God"). Suppose, rather, that God were infinitely weak, utterly vulnerable. What then? What would it mean that human beings are made in His image?

Such moments in Derrida's work could be very moving. Or they could be very irritating. At the Cardozo conference, it sounded at times as if the jury were still out on "Force of Law." Some speakers indicated that the lecture had radically transformed the way they understood legal theory, while a couple of dissenters suggested that Derrida had at most made a very late contribution to the school known as critical legal studies -- or even served up "warmed over legal realism" with a rich French sauce.

The oddest and most contentious turn in the discussion may have been the remarks of Jack Balkin, a professor of constitutional law at Yale, who, in a sardonic way, implied that there might be a hotbed of deconstructionist legal thought in the Bush administration. He sketched an outline of Derrida's formulation of three "aporias" (that is, impassable points or double binds) in the relationship between justice and law.

For example, there is the aporia that Derrida calls "singularity." The law consists of general rules, and to be just, those rules must be equally binding on everyone. Yet while it is illegal to kill another person, it would be unjust to impose the same penalty on an assassin and someone defending herself from attack. Thus, justice exceeds even a just law.

Likewise, Derrida pointed to the aporia of "undecidability" -- the law guides the judge's decision, but the judge must decide which particular laws apply in a given case. And there is an aporia of "urgency" -- for while the legal process unfolds in time, "justice," as Derrida put it, "is that which must not wait." In each case, justice requires the law, but exceeds it.

"The Justice Department," said Balkin, "has invoked all three aporias of law" in the "war on terror." He ran through the list quickly: The suspension of due process in some cases (singularity). The government must have the discretion to apply the law as it sees fit, given its knowledge of circumstances (undecidability). And justice demands swift, even immediate action (urgency). "I am afraid that Bush has hoisted Derrida by his own aporias," said Balkin.

Of course this formulation did not go unchallenged by members of the audience during the discussion afterward. But it did call to mind something that Peter Goodrich had said earlier, in recalling Derrida's first visit to Cardozo. "Law school depressed him," as Goodrich put it, "both the environment and the inhabitants." Perhaps it was, at best, a distraction from the philosophical pursuit of pure justice, in all its impossible beauty.

On Thursday, Derrida, the university, global democracy, and some flashbacks to the 1980s, when the abyss was just a seminar away.

Scott McLemee

Academic Freedom, Then and Now

This year marks the 50th anniversary of The Development of Academic Freedom in the United States by Richard Hofstadter and Walter P. Metzger, published by Columbia University Press. It has long been out of print. But circumstances have had the unfortunate effect of making it timely again. Locating a copy is worth the trouble, and once you do, the book proves just about impossible to put down.

For one thing, reading it is a relief from the mingled stridencies of l'affaire Ward Churchill and of David Horowitz's latest stunt, the so-called "Academic Bill of Rights." (By the way, is it just me, or don't their media performances suggest that Churchill and Horowitz are identical twins whom ideology has separated at birth? Both have glint-eyed zealotry down pat.)

At the same time, the book is a reminder of how incredibly prolonged, complicated, and perilous the emergence of academic freedom has been. The book was commissioned in 1951 by the American Academic Freedom Project, which had a panel of advisers from numerous distinguished universities and seminaries (plus one from the Detroit Public Library), and it was published alongside a companion volume, Academic Freedom in Our Time, by the director of the project, R. M. MacIver, an emeritus professor of political philosophy and sociology at Columbia University.

It was, in brief, the closest thing to an official scholarly response to the danger of McCarthyism from the university world. The authors must have finished correcting proofs for the book around the time Joseph McCarthy lost his committee chairmanship and was censured by his colleagues in the Senate. The darkness of the time is particularly evident in MacIver's volume, with its conclusion that "the weight of authority in the United States is now adverse to the principle of intellectual freedom."

Hofstadter and Metzger, by contrast, make only a few direct references to the then-recent challenges to academic freedom. Senator McCarthy's name never appears in the book. Hofstadter traces the history of American academic life up to the Civil War, and Metzger continues it through the early 20th century -- a panoramic survey recounting scores of controversies, firings, and pamphlet wars. But recording only "the outstanding violations of freedom" would mean reducing history to "nothing but the story of academic suppression."

Condensing 500 pages into five paragraphs is a fool's errand, but here goes anyway.

The belief that only the community of scholars has the final authority to determine what counts as valid research or permissible speech has deep roots in the history of the university, going all the way back to its origins in medieval Europe. But it was extremely slow to develop in colonial and antebellum America, which had few institutions of higher learning that were anything but outgrowths of religious denominations.

In 1856, George Templeton Strong suggested to his fellow trustees of what was then Columbia College that the only way to create a great university was "to employ professors of great repute and ability to teach" and "confiding everything, at the outset, to the control of the teachers." It was an anomalous idea -- one that rested, Hofstadter indicates, on the notion that scholarship might confer social prestige on those who practice it.

As the later chapters by Walter Metzger argue, it was only with the rapid increase in endowments (and the growing economic role of scientific research and advanced training) that academics began to have the social status necessary to make strong claims for their own autonomy as professionals.

At least some of what followed sounds curiously familiar. "Between 1890 and 1900," writes Metzger, "the number of college and university teachers in the United States increased by fully 90 percent. Though the academic market continually expanded, a point of saturation, at least in the more attractive university positions, was close to being reached.... Under these competitive conditions, the demand for academic tenure became urgent, and those who urged it became vociferous." It was the academic equivalent of the demand for civil-service examinations in government employment and for rules of seniority in other jobs.

Academic freedom was not so much the goal for the creation of tenure as one of its desirable side effects. The establishment of the American Association of University Professors in 1915 "was the culmination of tendencies toward professorial self-consciousness that had been operating for many decades." And it was the beginning of the codification of rules ensuring at least some degree of security (however often honored only in the breach) for those with unpopular opinions.

Speaking of unpopular opinions, I must admit to feeling some uneasiness in recommending The Development of Academic Freedom in the United States to you.

It is a commonplace today that Richard Hofstadter was a Cold War liberal -- and a certain smug knowingness about the limitations and failures of Cold War liberalism is the birthright of every contemporary academic, of whatever ideological coloration. Furthermore, Hofstadter stands accused of indulging in "the consensus view of history," which sees the American political tradition as endorsing (as one scholar puts it) "the rights of property, the philosophies of economic individualism, [and] the value of competition."

I don't know anything about Walter Metzger, but he seems to share much of Hofstadter's outlook. So it is safe to dismiss their book as a mere happy pill designed to induce the unthinking celebration of the American way of life. No one will think the worse of you for this. Besides, we're all so busy nowadays.

But if you do venture to read The Development of Academic Freedom, you might find its analysis considerably more combative than it first appears. Its claim is not that academic freedom is a deeply rooted part of our glorious American heritage of nearly perfect liberty. The whole logic of its argument runs very much to the contrary.

Someone once said that the most impressive thing about Hofstadter's Anti-Intellectualism in American Life (1963) was that he managed to keep it to just one volume. The deepest implication of his work is that academic freedom does not, in fact, have very deep roots even in the history of American higher education -- let alone in the wider culture.

On the final page of The Development of Academic Freedom in the United States, his collaborator writes, "One cannot but be appalled at the slender thread by which it hangs.... one cannot but be disheartened by the cowardice and self-deception that frail men use who want to be both safe and free." It is a book worth re-reading now -- not as a celebration, but as a warning.

Scott McLemee

Substitute Teachers

With college enrollments growing, tuition soaring, and administrators reaching for their chain saws to cut costs, the role of the tenured professor is under fire as never before.

"The days of the royal professorship are, like, so over," proclaims Aventa Clew, chief of human resources at Awed State. "Who can even afford a tenured faculty member when we're outsourcing jobs to Cuba, or wherever I'm thinking of?" The Awed response, anxiously watched by schools across the country, is to create "a more intermediate pedagogy," as Awed Dean of Conservative Arts Iona Bentley put it, "somewhere between instructors and serfs."

Starting next fall, the bulk of incoming students will be taught by a new cadre of professionals called assistant assistants, teachers whose sole responsibility will be in the classroom: no office, no bathroom privileges, and most important, no benefits. A sub-category of assistant assistants may be recruited from the ranks of the new never-graduate students, a guild of craft-persons, particularly medieval studies types in the history department, dedicated to staying within the walls of the academy.

At lower-cost institutions, those who can't quite teach but merely impart information to students will be hired as drones, moonlighting from their regular jobs as greeters at Wal-Mart. The new ranks may take hold soonest in Texas, where the Leave No Teacher Behind initiative has been implemented in a chain of retraining camps. Of course, drones can be prerecorded, an idea that has not gone unnoticed in education departments across the country.

At C.I.T.M.T., the California Institute of Too Much Technology, employing the same animation techniques that made The Polar Express such an enhanced miracle of sound and motion, the computer labs have started to produce virtual professors. The v.p.'s, as they're known in the trade, can perform functions that traditional pedagogues can only dream of: executing a brutal savate kick to emphasize a point about physics, or morphing into Grendel while reciting Beowulf. The newest version, Prof5000, can execute 100 pedagogical decisions a second while also composing an abstract and serving on a committee for academic freedom. In the works are plans to produce virtual students, as well, and miniaturize entire endowed buildings to the dimensions of a computer chip.

When polled, C.I.T.M.T. students said they didn't think it would affect their learning experience. "If I'd wanted a human teacher," scoffs one sophomore engineering student who did in fact ask to be named, "I'd have gone to a community college."

Will the C.I.T.M.T. administration ever be replaced by a machine? "Of course not," said one C.I.T.M.T. official. "Our work is far too important for that. In fact, we're hiring 17 new deans next year."

David Galef

David Galef is a professor of English and administrator of the M.F.A. program in creative writing at the University of Mississippi. His latest book is the short story collection Laugh Track (2002).

