Egghead Headshots

In the early 1970s, a French publisher issued a sort of photo album devoted to Jean-Paul Sartre, then the most famous philosopher in the world. He had been for quite a while, so the photojournalistic dossier on him was thick. The book is full of pictures of him alongside equally famous figures from the world stage -- Camus and Castro, for example, and Simone de Beauvoir, of course. You also see him in the midst of dramatic events, as when he addressed an assembly of revolutionary students during May ’68. There are a few images of the philosopher in a less public capacity. As I recall, there is a baby portrait or two. Plus there are pictures of the Sartrean babes, who seemed to get younger as he got older.

The man was a philosophical action figure, to be sure. But my favorite pages in the book show him at his desk, with manuscripts piled up precariously nearby, or at a café table, scribbling away. Sartre once said that he felt like a machine while working on The Critique of Dialectical Reason, grinding out each day’s quota of concepts. And that’s what’s happening in those photographs of him with pen in hand and tobacco pipe in jaw -- tuning out everything but the hard work of philosophizing. But who knows? A photograph cannot document thought. It’s entirely possible that Sartre was updating his schedule to accommodate a new girlfriend, rather than analyzing Stalinism.

The same brain did both -- a fact that lends itself to philosophical inquiry. Just where do you draw the line between task-oriented thinking and whatever it is philosophers do while they are “doing philosophy”? It is a conundrum.

In his new book Philosophers, from Oxford University Press, the New Yorker photographer Steve Pyke assembles a portrait gallery of contemporary thinkers. It embodies a conundrum or two of its own -- beginning with the title. In 1995, the British press Zelda Cheatle issued a collection of Pyke’s photographs that was also called Philosophers, which now fetches a high price from secondhand dealers. These are, it bears stressing, completely distinct books. All but one of the pictures in the new collection were taken over the past decade. Only two images from the earlier volume appear in the new one -- in the introductory pages, separate from the hundred portraits making up the main body of the book.

So we have, in other words, two volumes of the same kind, on the same subject, by the same author. They bear the same title. And yet they are not identical. A teachable moment in metaphysics? Yes, but one with practical implications for the used-book trade: a certain percentage of people trying to buy the older volume online will end up getting really, really irritated.

The book from Oxford is quite handsome. And its status as an aesthetic object is not a minor consideration. (For that matter, its aesthetics as a status object are also pretty demanding. It feels like you should get a nicer coffee table, just to have someplace to put it.) Without going so far as to say that Pyke represents philosophers as a subcategory of the beautiful people, he certainly renders them in beautiful black and white.

Ethnography forms no part of what he has in mind: his photographs do not show subjects going about their daily routines or occupying their usual niches. It’s difficult to think of Sartre without picturing him in certain settings – bars, cafés, lecture halls, etc. Furthermore, these places aren’t just elements of his biography; they figure into his work (the waiter in Being and Nothingness is an obvious example). Pyke’s philosophers, by contrast, hang in the void. Usually they are set against a solid black backdrop. The one conspicuous exception is the portrait of Michael Friedman, with an unreadable chalkboard diagram behind him. Their heads loom like planets in the depths of space. The camera registers the texture of skin and hair, the expression on the lips and in the eyes. Scarcely anything else enters the frame -- an earring, perhaps, or the neck of a sweater. Most of the subjects look right into the camera, or just to the side.

With Pyke, the thinker becomes, simply, a face. The effect is intimate, but also strangely abstract. The place and date of each photo session are indicated, but the book provides no biographical information about the subjects. I recognized about a quarter of them off the top of my head, among them Robert Brandom, David Chalmers, Patricia Churchland, Arthur Danto, Sydney Morgenbesser, and Richard Rorty. A couple are even on TV from time to time. Both Harry Frankfurt and Bernard-Henri Levy have been on "The Daily Show." That two or three pages could not be found to list a couple of books by each figure is puzzling, although most of the portraits are accompanied by very brief remarks by the subjects on the nature or motivation of their work.

“Philosophy is the way we have of reinventing ourselves,” says Sydney Morgenbesser. Ruth Millikan quotes Wilfrid Sellars from Science, Perception, and Reality: “The aim of philosophy, abstractly formulated, is to understand how things in the broadest possible sense of the term hang together in the broadest possible sense of the term.” Fortunately not everyone is so gnomic. The comments by Jerry Fodor seem the funniest: “To the best of my recollection, I became a philosopher because my parents wanted me to become a lawyer. It seems to me, in retrospect, that there was much to be said for their suggestion. On the other hand, many philosophers are quite good company; the arguments they use are generally better than the ones that lawyers use; and we do get to go to as many faculty meetings as we like at no extra charge.”

The ambivalence in Sally Haslanger’s statement felt more than vaguely familiar: “Given the amount of suffering and injustice in the world, I flip-flop between thinking that doing philosophy is a complete luxury and that it is an absolute necessity. The idea that it is something in between strikes me as a dodge. So I do it in the hope that it is a contribution, and with the fear that I’m just being self-indulgent. I suppose these are the moral risks life is made of.” That sounds quite a bit like Sartre, actually.

In the interview prefacing the collection, Pyke says that his intention is to make philosophers “seem more human, less of a mystery.” And that is where the true conundrum lies. Some philosophers look dyspeptic, while others have goofy smiles, but that isn’t what makes them human -- let alone philosophers. Making something “more human” precludes rendering it “less of a mystery,” since the human capacity for thought is itself an ever-deepening mystery.

Pyke thinks visually. A more interesting commentary on the figures in his portrait gallery might come indirectly, from the late Gilbert Ryle. An Oxford don and the author of The Concept of Mind, he gave a lecture that tried to sort out the relationship between deep cogitation and various other sorts of mental activity. To that end, he focused on the question of what that naked guy in Rodin's sculpture was doing -- and how it presumably differed from, say, a professor preparing to teach a class.

“The teacher has already mastered what he wants his students to master,” said Ryle. “He can guide them because he is on his own ground. But le Penseur is on ground unexplored by himself, and perhaps unexplored by anyone. He cannot guide himself through this jungle. He has to find his way without guidance from anyone who already knows it, if anyone does know it…. The teacher is a sighted leader of the blind, where le Penseur is a blind leader of the blind -- if indeed the very idea of his being or having a leader fits at all.”

That seems like a good description of what the subjects of Pyke's photographs spend their time doing. Not, of course, while the camera is turned on them. To judge by the expressions of some, their thoughts may have been something closer to, "Wow, I'm being photographed by someone from The New Yorker. How did that happen?"

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Cafeteria Style

"You're too young to know about the cafeterias," said Julius Jacobson.

"The cafeterias were wonderful," said Phyllis Jacobson. "There's nothing like them today."

"The cafeterias and the automats were the center of New York intellectual life back then," they continued. Each one finishing the other's thought, as old couples often will. "You'd buy a sandwich or a piece of pie, both if you could afford it, but what you really went there to do was talk."

They talked. And I listened, hoping, as ever, to be transported into their past, at least for a while.

Phyllis and Julius had met as teenagers in the Young People's Socialist League during the late 1930s. They married after the war. From the late 1940s onward, they worked on one small Marxist theoretical publication or another. They were public intellectuals long before anyone thought to coin that phrase, embodying a tradition of working-class self-education that was both non-academic and passionate about high culture. (Their devotion to the New York Review of Books bordered on the idolatrous, despite that publication's constant failure to adopt a suitably Jacobsonite political line.)

An old comrade of theirs once told me that, as a merchant seaman during World War II, he had been attracted to the Jacobsons' group -- a small organization known as the Workers Party -- because its members read better novels than the Communists did. Being a revolutionary didn't mean you should wallow in mass culture. About 10 years ago, when I published some articles about recent television programs, Phyllis gave me a stern talking-to by telephone.

"Don't waste your time on popular culture," she said. "You need to write about serious things, philosophy and literature, not this trash." (Memory may be playing tricks, but I'd swear I could hear a Benny Goodman album playing in the background, on her end of the telephone line. Evidently not all pop culture was junk.)

The second anniversary of Julie's death is coming soon, and almost five years have passed since Phyllis had a stroke that left her unable to speak. New Politics, the journal they edited in the 1960s and ’70s and then revived in 1986, still struggles along, even without the two of them at the helm. It is probably a matter of time before some academic publisher takes over its production. That outcome is preferable to oblivion, of course, but it does seem at odds with the ethos of its founders.

We met in 1990. By coincidence, that was just about the time I started attending scholarly conferences. The contrast in demeanor and sensibility between the conversations in their living room and what I saw at those other gatherings was remarkable.

P&J (as one came to think of them) were argumentative, plain-spoken, and averse to the gestures meant to announce that one is (ahem!) a qualified expert. That hardly meant condoning intellectual sloppiness. They loved expertise, but not rigamarole. A manuscript by an academic on an interesting topic was always a source of pleasure to them. Above all else, P&J believed in the educated general public. That notion was essential to their version of left-wing politics. The thought that you could be both “subversive” and incomprehensible to 90 percent of the audience made them laugh, not quite with joy.

It was P&J who explained an odd Yiddish idiom that I had come across: “to chop a tea kettle.” The image was puzzling. Why would anyone take an axe to a tea kettle? It seemed like a pointless thing to do. Which was exactly the point. “It means,” they said, “that a person makes a lot of noise without accomplishing anything.” (Perhaps it would be discreet not to mention just what examples we then discussed.)

How often that expression came to mind, in later years, as I sat in the audience for panels on “Postmodern This,” “Decentering That,” and “The Transgressive Potential of the Other Thing.” So many edgy theoretical axes! So many kettles, dented beyond all use.

At conferences, scholars would stand up and read their papers, one by one. Then the audience would “ask questions,” as the exercise is formally called. What that often meant, in practice, was people standing up to deliver short lectures on the papers they would have liked to have heard, instead -- and presumably would have delivered, had they been invited.

Hypothetically, if everyone on a panel read one another’s papers beforehand, they might be able to get some lively cross-talk going. This does happen in some of the social sciences, but it seems never to occur among humanities scholars. The whole process seems curiously formal, and utterly divorced from any intent to communicate. A routine exercise -- or rather, an exercise in routinism. A process streamlined into grim efficiency, yielding one more line on the scholar’s vita.

Is this unfair? No doubt it is. Over the years, I have heard some excellent and exciting papers at conferences. There have been whole sessions when everyone in the room was awake, and not just in the technical sense. But such occasions are the happy exceptions to the norm.

The inner dynamic of these gatherings is peculiar, but not especially difficult to understand. They are extremely well-regulated versions of what Erving Goffman called “face work” -- an “interaction ritual” through which people lay claim to a given social identity. Thanks to the steady and perhaps irreversible drive to “professionalization,” the obligation to perform that ritual now comes very early in a scholar’s career.

And so the implicit content of many a conference paper is not, as one might think, “Here is my research.” Rather, it is: “Here am I, qualified and capable, performing this role, which all of us here share, and none of us want to question too closely. So let’s get it over with, then go out for a drink afterwards.”

With Phyllis and Julius, as with others of their generation and cohort, the ebb and flow of discourse was very different. It is not that they had no Goffmanian interaction rituals, but the rituals were different. The cafeteria had imposed its own style. The frantic pace of defining one’s area of specialization, acquiring the proper credentials, and passing through an obligatory series of disciplinary enactments of competence (aka “conferences”) -- and doing all this, preferably, in one’s 20s -- would have been utterly out of place, over pie.

Instead, the cafeteria fostered a style in which the tone of authority had to be assumed with some care. There was always someone nearby, waiting to ambush you with an unfamiliar fact, a sarcastic characterization of your argument, a book he had just carried over from the library with the purpose of shutting you up for good, or at least for the rest of the afternoon. (“Now where is it you say Lenin wrote that? It sure isn’t here!”) You had to think on your feet, to see around the corner of your own argument. And if you were smart, you knew to make a point quickly, cleanly, punching it home. The Jacobsons introduced me to a valuable expression, one that neatly characterizes the opening moves of many an academic text: “throat-clearing noises.”

Now, it’s best not to sentimentalize the cafeteria and its circumstances, at least not too much. In the 1930s and ’40s, smart people didn’t loiter with intent to argue just because they enjoyed the prospect of constituting a “free floating intelligentsia.” They were there for economic reasons. The food was cheap, the jobs were scarce. Academe was nothing like the factor in the nation’s economic life that it is today, and few saw a career there as an option. The hiring of Lionel Trilling in the English department at Columbia in 1932 had provoked concern among the faculty; he was, after all, as someone put it, “a Marxist, a Freudian, and a Jew.” If you had a name like Jacobson, you knew the cards were stacked against you.

Nor was the discursive style of the cafeteria intelligentsia all brilliant rhetorical fireworks and dialectical knife-juggling. I suspect that, after a while, the arguments and positions began to congeal and harden, becoming all too familiar. And the familiar gambit of “you lack the theoretical sophistication to follow my argument” seems to have had its place in cafeteria combat.

One faction in the Jacobsons’ circle insisted that you had to study German philosophy to understand anything at all about Marx’s economics. Fifty years later, P&J still sounded exasperated at the memory. “These kids could barely read,” Phyllis said, “and they’d be lugging Hegel around.”

So maybe a paradise of the unfettered mind it wasn’t. Still, in reading academic blogs over the past couple of years, I’ve often wondered if something like the old style might not be rousing itself from the dustbin of history.

For one thing, important preconditions have reemerged -- namely, the oversupply of intellectual labor relative to adequate employment opportunities. The number of people possessing extremely sophisticated tools in the creation, analysis, and use of knowledge far exceeds the academic channels able to absorb them.

Furthermore, the self-sustaining (indeed, self-reinforcing) regime of scholarly professionalization may be just a little too successful to survive. Any highly developed bureaucracy imposes its own rules and outlook on those who operate within it. But people long subjected to that system are bound to crave release from its strictures.

For every scholar wondering how to make blogging an institutionally accredited form of professional activity, there must be several entertaining the vague hopes that it never will.

The deeper problem, perhaps, is the one summed up very precisely in a note from a friend that arrived just as I finished writing this: “Do you think there’s any way that intellectual life in America could become less lonely?”

I jot these thoughts down, wondering what Phyllis and Julius would make of them -- a question that darkens many an otherwise happy moment, nowadays. One thing seems certain: P&J would want to argue.

“Blogs are nothing like the cafeteria,” they might say. “Well, maybe a little, but not that much. Go ahead though, Scott. Go ahead and chop that kettle.”

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Free Refills

Any day now, I should get some business cards from the Inside Higher Ed headquarters, announcing to the world -- or at least to anyone who asks for one -- my new position, which is "Essayist at Large."

It is a title with a certain pleasing vagueness of mandate. I feel a bit like Diogenes Teufelsdrockh, the (imaginary) German philosopher portrayed in Thomas Carlyle's satirical book Sartor Resartus, who held the rank of Professor of Things in General.

The plan is for this column to push intellectual generalism as hard as it will go. Intellectual Affairs will be a forum for discussing academic books (old and new) and scholarly journals (ditto). I'll track down dissertations and conference papers that deserve a larger audience, and report on what's happening in the world of think tanks, humanities centers, literary quarterlies, and online e-zines. Nor, of course, will we neglect the terrain known as the blogosphere -- that agonistic realm routinely combining the best qualities of the academic seminar with the worst traits of talk radio.

The sudden shift from "I" to "we" in that last sentence was no accident. I am counting on eagle-eyed readers to point out things meriting attention. The essay form is at its most interesting when it becomes "polyphonic," as the Soviet-era cultural theorist Mikhail Bakhtin put it -- a space in which a number of voices coincide and enter dialogue.

To be sure, this column will provide its share of what people at newspapers sometimes call "thumbsucking." (Journalism, like scholarship, has its own jargon.) As the novelist and critic Wilfred Sheed once defined it, a thumbsucker is an essay "presenting no new information, but merely revealing the beauty of the writer's mind."

Well, ouch. But fair enough. A little of that sort of thing goes a long way, though. So this space will remain as open as possible to the "blooming, buzzing confusion" of contemporary intellectual life. For one thing, there will be interviews with scholars, professors, and/or thoughtful writers. (Those categories overlap, but are not quite identical.) And your thoughts on what is afoot in your field are also welcome. I promise to read e-mail between trips to the Library of Congress and occasional bouts of navel-gazing.

As Carlyle recounts it, the city fathers of Weissnichtwo invited Teufelsdrockh to join the faculty of their newly opened university because they felt that "in times like ours" -- the topsy-turvy early 19th century -- "all things are, rapidly or slowly, resolving themselves into Chaos." The interdisciplinary field of Allerley-Wissenschaft (the Science of Things in General) might help set the world straight.

Unfortunately, "they had only established the Professorship, [not] endowed it." And so students didn't see much of him -- except at the coffeehouse, where "he sat reading Journals; sometimes contemplatively looking into the clouds of his tobacco-pipe, without other visible employment."

When, at long last, Teufelsdrockh published his great philosophical treatise, it was "a mixture of insight [and] inspiration, with dullness, double-vision, and even utter blindness."

The previous owner of my copy of Sartor Resartus underlined this passage, and scribbled a note in the margin wondering if it might have been a source for the title of Paul de Man's seminal volume of essays on literary theory from 1971, Blindness and Insight, published almost 140 years after Carlyle's satire first appeared.

An interesting conjecture, hereby commended to the attention of experts.

Rereading that passage just now, however, I faced a more pressing question. Will this column provide the right ratio of insight and inspiration to "dullness, double-vision, and even utter blindness?"

Well, ouch again. But then, such are the risks one takes in practicing Allerley-Wissenschaft. See you again on Thursday. In the meantime, I'll be at the coffeehouse, either thinking deep thoughts or just staring off into space. Fortunately the refills are free.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Defending Derrida

On Sunday, about 200 people crowded into the Jacob Burns Moot Court of the Cardozo School of Law in New York City to speak of Jacques Derrida -- a.k.a. "Jackie" and "JD" -- at a conference called "Derrida/America: The Present State of America's Europe." The throng was down to a third of that size by Monday morning. Maybe everyone else went off to see "The Gates," Christo's installation of saffron banners running around Central Park. The installation wouldn't last forever, while the job of sorting out the legacy of deconstruction might take a while.

Certainly the dominant note of the event (a gathering "in celebration and mourning," as a few speakers put it) was to insist that Derrida's work deserved more serious notice than it had received in the American press following his death in September. In welcoming the audience, Peter Goodrich, a professor of law at Cardozo, noted that people who were "unimpeded by any knowledge of what they're talking about" evidently felt an especially passionate urge to denounce Derrida. Although no speaker mentioned it as such, the most egregious example was undoubtedly the obituary in The New York Times -- a tour de force of malice and intellectual laziness, by someone whose entire knowledge of Derrida's work appeared to have been gleaned from reading the back of a video box for the Woody Allen film Deconstructing Harry.

But the problem is not simply with the American public at large. "There is something I've wanted to say in public for some time," announced Simon Critchley, a professor of philosophy at New School University. "The treatment of Derrida by philosophers in the Anglophone world was shameful. They weren't stupid. They knew better. They hadn't read Derrida, and they knew they hadn't. But philistinism -- combined with envy at Derrida for being smart, charismatic, good looking, and a snappy dresser -- made them behave in a way that was, there is no other word for it, shameful."

The crowd applauded. "Now I feel better," he said.

Posthumous compliments for Derrida, and cathartic insults for his enemies, were only a small part of the program. Speakers came back repeatedly to "Force of Law: The 'Mystical Foundation of Authority' " -- a lecture on the complex and contradictory relationship between law and justice that Derrida gave in 1989, at a colloquium called "Deconstruction and the Possibility of Justice," held at Cardozo, the law school of Yeshiva University. Derrida's paper is now most readily available in Acts of Religion, a collection of his writings published by Routledge.

Among the scores of books and essays that Derrida published over the final 15 years of his life, "Force of Law" looms as one of the most important. In 2003, not long before he was diagnosed with pancreatic cancer, Derrida published a book on the possibility of global democracy called Rogues, just released in an English translation from Stanford University Press. Many of its themes were anticipated in the Cardozo lecture, making "Force of Law" almost a prerequisite to understanding Derrida's final book. (Or so I figured out the hard way, a few months ago, by reading Rogues first.)

"What is currently called deconstruction," said Derrida in 1989, "would not at all correspond (though certain people have an interest in spreading this confusion) to a quasi-nihilistic abdication before the ethico-politico-juridical question of justice and before the opposition between just and unjust...."

His goal, in effect, is to point to a notion of justice that would be higher than any given code of laws. Likewise, in other late writings, Derrida seeks to define a notion of forgiveness that would be able to grapple with the unforgivable. And, he asks, might it be the case that Levantine traditions of hospitality (of welcoming the Other into one's home) transcend more modern conceptions of ethics?

For someone constantly accused of relativism, Derrida often sounds in these late works like a man haunted by the absolute. There is a sense in which, although he was an atheist, he practiced what a medieval scholar might have recognized as "negative theology" -- an effort to define the nature of God by cutting away all the misleading conceptions imposed by the limits of human understanding.

The implications were political, at least in some very abstract sense. In his keynote talk at the American Academy of Religion in 2002, Derrida proposed a notion of God that, in effect, utterly capsized the familiar world of monotheism by stripping it of all our usual understandings of divine authority. Suppose God were not the all powerful king of the universe (the image that even an atheist is prone to imagine upon hearing the name "God"). Suppose, rather, that God were infinitely weak, utterly vulnerable. What then? What would it mean that human beings are made in His image?

Such moments in Derrida's work could be very moving. Or they could be very irritating. At the Cardozo conference, it sounded at times as if the jury were still out on "Force of Law." Some speakers indicated that the lecture had radically transformed the way they understood legal theory, while a couple of dissenters suggested that Derrida had at most made a very late contribution to the school known as critical legal studies -- or even served up "warmed over legal realism" with a rich French sauce.

The oddest and most contentious turn in the discussion may have been the remarks of Jack Balkin, a professor of constitutional law at Yale, who, in a sardonic way, implied that there might be a hotbed of deconstructionist legal thought in the Bush administration. He sketched an outline of Derrida's formulation of three "aporias" (that is, impassable points or double binds) in the relationship between justice and law.

For example, there is the aporia that Derrida calls "singularity." The law consists of general rules, and to be just, those rules must be equally binding on everyone. Yet while it is illegal to kill another person, it would be unjust to impose the same penalty on an assassin and someone defending herself from attack. Thus, justice exceeds even a just law.

Likewise, Derrida pointed to the aporia of "undecidability" -- the law guides the judge's decision, but the judge must decide which particular laws apply in a given case. And there is an aporia of "urgency" -- for while the legal process unfolds in time, "justice," as Derrida put it, "is that which must not wait." In each case, justice requires the law, but exceeds it.

"The Justice Department," said Balkin, "has invoked all three aporias of law" in the "war on terror." He ran through the list quickly: The suspension of due process in some cases (singularity). The government must have the discretion to apply the law as it sees fit, given its knowledge of circumstances (undecidability). And justice demands swift, even immediate action (urgency). "I am afraid that Bush has hoisted Derrida by his own aporias," said Balkin.

Of course this formulation did not go unchallenged by members of the audience during the discussion afterward. But it did call to mind something that Peter Goodrich had said earlier, in recalling Derrida's first visit to Cardozo. "Law school depressed him," as Goodrich put it, "both the environment and the inhabitants." Perhaps it was, at best, a distraction from the philosophical pursuit of pure justice, in all its impossible beauty.

On Thursday, Derrida, the university, global democracy, and some flashbacks to the 1980s, when the abyss was just a seminar away.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Academic Freedom, Then and Now

This year marks the 50th anniversary of The Development of Academic Freedom in the United States by Richard Hofstadter and Walter P. Metzger, published by Columbia University Press. It has long been out of print. But circumstances have had the unfortunate effect of making it timely again. Locating a copy is worth the trouble, and once you do, the book proves just about impossible to put down.

For one thing, reading it is a relief from the mingled stridencies of l'affaire Ward Churchill and of David Horowitz's latest stunt, the so-called "Academic Bill of Rights." (By the way, is it just me, or don't their media performances suggest that Churchill and Horowitz are identical twins whom ideology has separated at birth? Both have glint-eyed zealotry down pat.)

At the same time, the book is a reminder of how incredibly prolonged, complicated, and perilous the emergence of academic freedom has been. The book was commissioned in 1951 by the American Academic Freedom Project, which had a panel of advisers from numerous distinguished universities and seminaries (plus one from the Detroit Public Library), and it was published alongside a companion volume, Academic Freedom in Our Time, by the director of the project, R. M. MacIver, an emeritus professor of political philosophy and sociology at Columbia University.

It was, in brief, the closest thing to an official scholarly response to the danger of McCarthyism from the university world. The authors must have finished correcting proofs for the book around the time Joseph McCarthy lost his committee chairmanship and was censured by his colleagues in the Senate. The darkness of the time is particularly evident in MacIver's volume, with its conclusion that "the weight of authority in the United States is now adverse to the principle of intellectual freedom."

Hofstadter and Metzger, by contrast, make only a few direct references to the then-recent challenges to academic freedom. Senator McCarthy's name never appears in the book. Hofstadter traces the history of American academic life up to the Civil War, and Metzger continues it through the early 20th century -- a panoramic survey recounting scores of controversies, firings, and pamphlet wars. But recording only "the outstanding violations of freedom" would mean reducing history to "nothing but the story of academic suppression."

Condensing 500 pages into five paragraphs is a fool's errand, but here goes anyway.

The belief that only the community of scholars has the final authority to determine what counts as valid research or permissible speech has deep roots in the history of the university, going all the way back to its origins in medieval Europe. But it was extremely slow to develop in colonial and antebellum America, which had few institutions of higher learning that were anything but outgrowths of religious denominations.

In 1856, George Templeton Strong suggested to his fellow trustees of what was then Columbia College that the only way to create a great university was "to employ professors of great repute and ability to teach," while "confiding everything, at the outset, to the control of the teachers." It was an anomalous idea -- one that rested, Hofstadter indicates, on the notion that scholarship might confer social prestige on those who practice it.

As the later chapters by Walter Metzger argue, it was only with the rapid increase in endowments (and the growing economic role of scientific research and advanced training) that academics began to have the social status necessary to make strong claims for their own autonomy as professionals.

At least some of what followed sounds curiously familiar. "Between 1890 and 1900," writes Metzger, "the number of college and university teachers in the United States increased by fully 90 percent. Though the academic market continually expanded, a point of saturation, at least in the more attractive university positions, was close to being reached.... Under these competitive conditions, the demand for academic tenure became urgent, and those who urged it became vociferous." It was the academic equivalent of the demand for civil-service examinations in government employment and for rules of seniority in other jobs.

Academic freedom was not so much the goal for the creation of tenure as one of its desirable side effects. The establishment of the American Association of University Professors in 1915 "was the culmination of tendencies toward professorial self-consciousness that had been operating for many decades." And it was the beginning of the codification of rules ensuring at least some degree of security (however often honored only in the breach) for those with unpopular opinions.

Speaking of unpopular opinions, I must admit to feeling some uneasiness in recommending The Development of Academic Freedom in the United States to you.

It is a commonplace today that Richard Hofstadter was a Cold War liberal -- and a certain smug knowingness about the limitations and failures of Cold War liberalism is the birthright of every contemporary academic, of whatever ideological coloration. Furthermore, Hofstadter stands accused of indulging in "the consensus view of history," which sees the American political tradition as endorsing (as one scholar puts it) "the rights of property, the philosophies of economic individualism, [and] the value of competition."

I don't know anything about Walter Metzger, but he seems to share much of Hofstadter's outlook. So it is safe to dismiss their book as a mere happy pill designed to induce the unthinking celebration of the American way of life. No one will think the worse of you for this. Besides, we're all so busy nowadays.

But if you do venture to read The Development of Academic Freedom, you might find its analysis considerably more combative than it at first appears. Its claim is not that academic freedom is a deeply rooted part of our glorious American heritage of nearly perfect liberty. The whole logic of its argument runs very much to the contrary.

Someone once said that the most impressive thing about Hofstadter's Anti-Intellectualism in American Life (1963) was that he managed to keep it to just one volume. The deepest implication of his work is that academic freedom does not, in fact, have very deep roots even in the history of American higher education -- let alone in the wider culture.

On the final page of The Development of Academic Freedom in the United States, his collaborator writes, "One cannot but be appalled at the slender thread by which it hangs.... one cannot but be disheartened by the cowardice and self-deception that frail men use who want to be both safe and free." It is a book worth re-reading now -- not as a celebration, but as a warning.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Substitute Teachers

With college enrollments growing, tuition soaring, and administrators reaching for their chain saws to cut costs, the role of the tenured professor is under fire as never before.

"The days of the royal professorship are, like, so over," proclaims Aventa Clew, chief of human resources at Awed State. "Who can even afford a tenured faculty member when we're outsourcing jobs to Cuba, or wherever I'm thinking of?" The Awed response, anxiously watched by schools across the country, is to create "a more intermediate pedagogy," as Awed Dean of Conservative Arts Iona Bentley put it, "somewhere between instructors and serfs."

Starting next fall, the bulk of incoming students will be taught by a new cadre of professionals called assistant assistants, teachers whose sole responsibility will be in the classroom: no office, no bathroom privileges, and most important, no benefits. A sub-category to assistant assistants may be recruited from the ranks of the new never-graduate students, a guild of craft-persons, particularly medieval studies types in the history department, dedicated to staying within the walls of the academy.

At lower-cost institutions, those who can't quite teach but merely impart information to students will be hired as drones, moonlighting from their regular jobs as greeters at Wal-Mart. The new ranks may take hold soonest in Texas, where the Leave No Teacher Behind initiative has been implemented in a chain of retraining camps. Of course, drones can be prerecorded, an idea that has not gone unnoticed in education departments across the country.

At C.I.T.M.T., the California Institute of Too Much Technology, employing the same animation techniques that made The Polar Express such an enhanced miracle of sound and motion, the computer labs have started to produce virtual professors. The v.p.'s, as they're known in the trade, can perform functions that traditional pedagogues can only dream of: executing a brutal savate kick to emphasize a point about physics, or morphing into Grendel while reciting Beowulf. The newest version, Prof5000, can execute 100 pedagogical decisions a second while also composing an abstract and serving on a committee for academic freedom. In the works are plans to produce virtual students, as well, and miniaturize entire endowed buildings to the dimensions of a computer chip.

When polled, C.I.T.M.T. students said they didn't think it would affect their learning experience. "If I'd wanted a human teacher," scoffs one sophomore engineering student who did in fact ask to be named, "I'd have gone to a community college."

Will the C.I.T.M.T. administration ever be replaced by a machine? "Of course not," said one C.I.T.M.T. official. "Our work is far too important for that. In fact, we're hiring 17 new deans next year."

Author: David Galef (dgalef@olemiss.edu)

David Galef is a professor of English and administrator of the M.F.A. program in creative writing at the University of Mississippi. His latest book is the short story collection Laugh Track (2002).

This, That and the Other Thing

Intellectual Affairs has been running for just over a month now. It might be a good moment for a bit of housecleaning.

Readers have contacted me about some interesting developments apropos Ayn Rand, Jacques Derrida, and the history of academic freedom -- so today's column will have the element of variety going for it. Consider it a roundup of faits divers. After all, that sounds a lot more sophisticated than "news in brief."

Referring to the followers of Ayn Rand as "Randroids" was probably not the nicest way to celebrate the 100th anniversary of the author's birth. But it was positively kind by contrast with the really strange honor soon to be paid to her by her devotees. They are all set to publish a volume that will document, at great length, how Rand coped with a private, and fairly humiliating, part of her life.

First, a little Objectivist history:

In 1968, the world of Rand's followers -- which included quite a few academics, as well as a young economist by the name of Alan Greenspan -- was shaken by the news of a split between Rand and her most famous disciple, Nathaniel Branden, the psychologist best known for giving the expression "self-esteem" its current inescapable popularity.

There had been a romantic liaison between the author of The Fountainhead and the psychologist, who was 25 years younger. When Branden declined to continue the relationship, he and his wife Barbara were read out of the movement.

Many of the details later became available in The Passion of Ayn Rand, a biography by Barbara Branden. And they were confirmed by Branden's memoir, Judgment Day. (Long before their books appeared in the late 1980s, the couple had divorced.) In 1999, the complicated Objectivist menage was dramatized in a steamy (yet also not-so-hot) docudrama for Showtime also called  The Passion of Ayn Rand. A better title might have been Atlas Shagged.

In any case, the story will now be told again in the pages of a new book drawing on Rand's notes. For years after the split, Rand sought to dissect the "psycho-epistemology" of the Brandens -- in short, hundreds of pages of brooding over a failed love affair. The book is authorized by the Ayn Rand Institute, which holds her papers, the official and "orthodox" wing of her Objectivist movement.

Perhaps the most incisive comment on the volume comes from Chris Sciabarra, author of Ayn Rand: The Russian Radical and other studies. "Reading Rand's personal journal entries makes me feel a bit uneasy," he recently wrote in an online forum. "As valuable as they are to me from an historical perspective, I suspect there might be an earthquake in Valhalla caused by the spinning of Ayn Rand's body."

The Passion of Ayn Rand's Critics (Durban House Publishing) was originally scheduled to appear in time for the centennial of her birth, in early February. Its appearance has been bumped back. Expect an earthquake in Valhalla sometime this spring or early summer.

The editing and translation of posthumous works by Jacques Derrida will be a cottage industry. And pity the fool who takes on the job of preparing a definitive bibliography.

It was with a sense of tempting fate that, in a recent column, I described Rogues, now available in English, as the last book Derrida saw through the press during his lifetime. Anthony Smith, a sharp-eyed undergraduate at DePaul University, points out that a few months later Derrida published a volume with a better claim to that distinction.

It is called Béliers: Le dialogue ininterrompu entre deux infinis, le poème. Through a little digging, I've learned that it has been translated as "Rams: Uninterrupted Dialogue -- between Two Infinities, the Poem," and will appear in a forthcoming volume of Derrida's essays on Paul Celan, the great Romanian-Jewish poet (and concentration camp survivor) who wrote in German. (Let me tempt fate again by guessing that the "rams" in Derrida's title is an allusion to the shofar.)

Derrida first presented Béliers in Heidelberg in February 2003, as a memorial tribute to Hans-Georg Gadamer. The German philosopher, author of Truth and Method, had died the previous year at the age of 102. "Will I be able to testify, in a way that is right and faithful, to my admiration for Hans-Georg Gadamer?" asks Derrida.

Good question! I can't wait to find out. For one thing, it's news to hear that Derrida admired Gadamer. In 1981, when colleagues arranged for them to meet and discuss one another's work at the Goethe Institute in Paris, their exchange left Gadamer feeling (if one may translate freely from a more refined philosophical idiom) "pissed, dissed, and dismissed."

By 1992, Gadamer was still complaining that Derrida was "not capable of dialogue, only monologue." But perhaps that made the eulogy all the more eloquent. After all, Derrida did get to have the last word.

Finally, a correction to the recent column celebrating the anniversary of Richard Hofstadter and Walter Metzger's  The Development of Academic Freedom in the United States, first published in 1955. The work was, I wrote, "long since out of print."

Well, that was at least half right. In 1996, Transaction Publishers reissued the first part of the book as Academic Freedom in the Age of the College, by Richard Hofstadter, with an introduction by Roger L. Geiger, who is distinguished professor of higher education at the Pennsylvania State University.

In his introduction, Geiger pays tribute to Hofstadter's gifts as both a historian and a writer, while pointing to some elements of his research that haven't held up too well. For example, Hofstadter overestimated how many colleges founded before the Civil War ended up failing. The problem was that his data set included institutions that never opened, or just served as secondary schools.

Until the late 19th century, very few American institutions of higher learning bore much resemblance to the contemporary research university. By e-mail, I asked Geiger how much contemporary relevance Hofstadter's study might have.

The project, Geiger wrote back, "was commissioned and written with the conviction that it was very relevant to contemporary America, circa 1955. Hofstadter, in particular, seemed to equate the ante-bellum evangelical colleges with the kind of xenophobic populism that seemed to support McCarthyism."

Actually, my question was about our contemporary situation, not Hofstadter's. But sometimes you have to wonder if that's too fine a distinction.

And a plea to you, dear reader. Please drop a line if you hear an interesting conference paper, or read an impressive (or, for that matter, atrocious) article in a scholarly journal. Should there be some earth-shattering, or at least profession-rocking, discussion taking place on a listserv, please consider passing that news along.

That's not really a call for gossip -- though, of course, if you have any, I'm all ears (as Ross Perot once put it, in a different context).

The simple fact is that the audience for Inside Higher Ed is self-selecting for intelligence. And you aren't reading this site because you have to, but because you want to. So you've got good taste as well. The really interesting developments in scholarly life tend to occur well below the radar of the academic presses, let alone the administration. I'd rather hear from one graduate student or adjunct with a finger on the pulse of her discipline than go to dinner with a provost who has an expense account and no clue.

Well, that probably sounded ruder than it should have. But you get the idea. We're not standing on ceremony here. This column is run "cafeteria style," or in the spirit of the coffee houses of Vienna from a century ago. If I'm missing something important, please don't hesitate to say so.

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Derrida's Wake

When last weekend's conference at Cardozo Law was first announced, the title was given simply as "Derrida/America." Only while standing in the lobby did I learn the subtitle, "The Present State of America's Europe," from the official brochure containing the final schedule. The original title had been nagging away at my memory for several days. It conjured up echoes from (roughly) the first Reagan administration -- the time when, as a somewhat pushy adolescent culture vulture, I began auditing graduate courses in English and comp lit.

It wasn't just a punctuation mark, used oddly. (Your sense of appropriate punctuation was the first thing to change, in theory boot camp.) That slash between "Derrida" and "America" was a borderline .... an unreadable signifier of connection and separation ... marking the difference yet also erasing it .... And so on. We learned to prize such moments, when the text deconstructed itself.

At least in his work from the late 1960s and early '70s, Derrida had referred to the "hinge" between terms in a metaphysical opposition -- between, say, appearance and essence. By the time it was assimilated into American literary study, the philosophical nuances had been cut back in the interests of classroom exposition. You'd track down some plausible equivalent of a metaphysical opposition in a literary text. In a pinch, the distinction between literal and metaphorical meaning would do. Then you'd find the border or hinge between them. (I can't quite describe how this was done; it's like riding a bike, you just know when you've done it.)

Taking that "hinge" firmly in hand, you would flip the opposition, unleashing complexities aplenty. And then -- whoa! The implications spiraled out of control. It could leave you breathless. (I have some dim recollection of writing a paper on Hawthorne's preface to The Scarlet Letter that left it sounding slightly more experimental than Finnegans Wake.) Afterwards, it sometimes felt as if you had successfully overthrown the entire history of Western thought from Plato to NATO, even if your knowledge of that history were not so profound.

After a while, though, the entire enterprise began to prove all too predictable. It felt like a technique for building your very own abyss from a prepackaged kit, manufactured in New Haven. Sometime around the start of the second Reagan administration, I found better things to do -- for example, studying the thinkers Derrida himself had been reading, but also getting arrested in protests against American foreign policy. (The latter seemed more urgent and worrisome than Hawthorne's aesthetic ideology.) Today, no journal publishes the sort of deconstructive literary analyses that the Yale critics once produced. It is hard to imagine why anyone would, except as an exercise in nostalgia.

As it happens, Derrida himself became somewhat put out with the initial reception (and domestication) of his work by literature departments. As early as 1980, he referred to deconstruction as "a word that I have never liked, and whose fortune has disagreeably surprised me." He insisted that his work had consequences not only for the reading of literary or philosophical texts, but for understanding and changing institutions -- in particular, scholarly institutions.

Anyone curious about the implications of deconstructive thought for academic administration might take a look at Derrida's lectures and memos in Eyes of the University, published last year by Stanford University Press. "I believe," he announces, "in the indestructibility of the ordered procedures of legitimation, of the production of titles and diplomas, and of the authorization of competence." (I do believe some conservatives owe Derrida an apology.)

Derrida's effort to push his thinking beyond the university -- and past the boundary lines of contemporary politics -- reached its end in a book called Voyous, the last major work to appear in his lifetime. It has just been published in translation as Rogues, also from Stanford. Simplifying somewhat, you could call Rogues a book about democratic globalization. Or rather, about what Derrida calls "the democracy to come" -- a notion both infinitely hopeful and endlessly problematic.

Certainly there is more to it than a faith that democracy will steadily spread across the globe, deepening and strengthening itself as it goes. Derrida was never interested in futurology. And if he is a prophet, it is only in the most ironic of religious senses. In speaking of "democracy to come," he was posing a subtle but powerful question -- asking, in effect, "What will democracy have meant, when we can begin to think about it, one day, in a democratic world?"

The problem, first of all, is that the philosophical tradition is by no means an abundant source of concepts for thinking about democracy. Down the ages, it was often understood in nightmarish terms. A democracy would be a state run by the lowest common denominator. At best, "rule by the people" is conceived as a high ideal. "We do not yet know what we have inherited," writes Derrida. "We are the legatees of this Greek word and of what it assigns to us, enjoins us, bequeaths or leaves us." Yet, he also writes, "we ourselves do not know the meaning of this legacy." We make haste to pass the notion of democracy on, without looking too closely at its demands.

The root difficulty, according to Derrida, is that we cannot think about democracy without dragging in another concept, sovereignty. "These two principles," he writes, "are at the same time, but also by turns, inseparable from one another."

Why is that a problem?

Well, the concept of sovereignty (that is, authority and domination over a discrete territory) has survived from the era of monarchy. Under democracy, "the people" replace the king as sovereign. But the structure remains at least potentially authoritarian. For one thing, defining "the people" is anything but a semantic issue: Even a multiethnic democratic state can be gripped by the passions of xenophobic exclusion.

At the same time, the very notion of sovereignty implies the use of force. The borders of a sovereign state are ultimately backed up by the power to wage war in their defense.

The internal contradictions create what Derrida calls political "autoimmunity" -- the tendency of sovereign power to turn on democratic rights, in the name of democratic principles.

These tendencies go into overdrive with the emergence of even the most rudimentary forms of an international democratic order. Derrida looks at the role played by the concept of the "rogue state" since the fall of the Soviet Union. Regimes have been so designated, almost always by the United States, on the grounds of "supposed failings with regard to either the spirit or the letter of international law, a law that claims to be fundamentally democratic." While calling on the United Nations to respond to such regimes, the United States has been willing to ignore international law and agreements endorsed by most countries in the world.

Derrida does not hesitate to call the U.S. one of "the most perverse, the most violent, the most destructive of rogue states." He is also pretty harsh on the United Nations Security Council. And as if all that were not bad enough, a new species of political agent has appeared on the world stage: the transnational network, making no bid to establish traditional forms of sovereignty, yet possessing (or seeking) the power to kill on a scale equivalent to that of any state, "rogue" or otherwise.

A book of questions, then, and not of answers. Derrida was swift to open parenthetical arguments, nestling them one inside the other -- and ending them, far too often, with an ellipsis.... Rogues does not feel like his last word on anything; rather, it seems to have been the opening stage of a project that remains unfinished.

A few people cited Rogues during the gathering at Cardozo, but usually in passing. It will take time to assimilate. And for that matter, there will be more from Derrida. Besides thousands of pages of unpublished seminars, there are stray texts, such as a chapter that he added to the manuscript of his book The Gift of Death after it appeared in English ten years ago. Adam Kotsko, a graduate student at the Chicago Theological Seminary, is now completing a translation of the chapter and writing an essay on it.

Kotsko wasn't at the Cardozo event, but a few days beforehand did attend a symposium on Derrida at Northwestern. He took time out to tell me, by e-mail, about the material he's translating. It moves, he says, between Kierkegaard's Fear and Trembling, the Book of Genesis, and Kafka's "Letter to His Father." "Derrida argues that pardon and literature are intrinsically linked and that the modern Western institution of literature has Abrahamic roots," Kotsko told me. "He concludes by connecting both literature and pardon to the democracy to come."

Reading that, I felt a little bit like Jean Hyppolite, Derrida's first thesis advisor (a role cut short by Hyppolite's death in 1968). After a conference at Johns Hopkins in 1966, where Derrida first presented his work to an American audience, Hyppolite told him, "I really do not see where you are going."

But then, the younger philosopher had a perfectly good reply. "If I clearly saw ahead of me where I was going," Derrida said, "I really don't believe that I would take another step to get there."

Author: Scott McLemee (scott.mclemee@insidehighered.com)

Keeping It Real

What did Jacques Lacan mean by "the Real"? I found out, sort of, while walking across my apartment in search of a copy of the recent re-translation of his Écrits -- a volume replacing another (somewhat notoriously unreliable) translation released by the same publisher more than 20 years earlier.

When a manufacturer of toasters finds out that its toasters are defective, it will issue a recall. About halfway to the bookshelf, the light bulb went off: Time for a class action suit!

Suddenly, a rogue housecat interposed himself between my feet -- causing immediate "walk failure" and consequent wrenching of lower back.

Now, the Imaginary is for Lacan the dimension of the human psyche that permits us to feel more or less cohesive. It is the raw material of ego identity. By contrast, the Symbolic includes all the systems we use for communication and exchange with others. It is "language," very broadly defined. But what about Lacan's third term?

Just to back up a little.... I'd been reading Slavoj Zizek, the wild and woolly cultural theorist, who is about as Lacanian as they come. He slings the lingo like a pro. But every so often, my reading comprehension disappears, like the steam from a bowl of cooling soup.

Zizek refers to the Real "escaping" the Imaginary and "erupting into" the Symbolic. Which is good to know, but not that helpful. It left me wondering: "OK, the Real -- what is it? And where?"

And then, out of nowhere, I got an answer. The Real is a silent but (potentially) deadly housecat. The realm of the ego's Imaginary dignity is violated. The order of the Symbolic is reduced to groans and obscenities. The Real is what leaves you on the floor.

Fredric Jameson, the lefty lit-crit guru maximus, once equated Lacan's concept with the Marxist notion of History -- a word that Jameson always capitalizes, like the name of a god. History, and hence the Real, he explained, "is what hurts."

OK, but does that mean my cat embodies History? (I've just founded a new school of thought. Either that, or the painkillers are finally kicking in.)

Zizek is known for illuminating Lacan's work with examples from daily life and popular culture. But Astra Taylor, who is now putting the finishing touches on a documentary about Zizek, figured that the film would work better if some of those illustrations were themselves illustrated. So the exposition will include animated sequences -- brief psychoanalytic cartoons, in short.

People who have spent time puzzling over Lacan's quasi-mathematical diagrams can only greet this news with both curiosity and the sense that, after seeing the film, they are probably going to have some really weird dreams.

In any case, Zizek: The Movie will premiere at the Roxie Cinema in San Francisco on April 21, with the subject of the film himself in attendance. And the filmmaker is preparing to tour college campuses with the documentary this spring, with screenings now scheduled for Emory University, the University of Georgia, the University of California at Berkeley, and the University of Illinois at Urbana-Champaign.

Taylor is still putting her travel plans together, so anyone interested in arranging a campus showing should contact her. For more information on the film itself, check out its Web site. Zizek: The Movie goes into general release this fall.

Also on the world-premiere front.... Revolution Books, the largest chain of Maoist bookstores in the United States (not that it has had any competition in quite a while), is holding parties to celebrate the publication of From Ike to Mao and Beyond, a memoir by Bob Avakian, whose full and rather awesome title is Chairman of the Central Committee of the Revolutionary Communist Party, USA.

The book sports blurbs by Cornel West (who says that Avakian's "voice and witness are indispensable") and Howard Zinn (who calls the memoir "a humanizing portrait of someone who is often seen only as a hard-line revolutionary"). The reader learns of the Maoist leader's love of doo-wop music, his passion for basketball, and his skill in the kitchen as a maker of waffles.

There is much to disagree with in the book. Avakian, for example, refers to Stalin's "errors." It is hard to think of his lethal purges as some kind of epistemological blunder. The difference between "committing mistakes" and "committing atrocities" is not just semantic.

And yet the memoir itself is -- ideology aside -- incredibly interesting. The author is the son of a federal judge (now deceased) in the San Francisco Bay area. The book paints a fascinating picture of Berkeley during the '50s and '60s. The campus upsurge of the Free Speech Movement is just the start of a long march, with stops in China (during the Cultural Revolution), Chicago (where "Chairman Bob" becomes the maximum leader of a small party), and Paris (to which he relocates around the time Reagan comes into office).

Suffice it to say that the author will not be attending any book parties or news shows. When I asked a representative of the publisher, Insight Press, about the author's availability, she indicated that preserving the security of the Chairman is a high priority, while an appearance on Good Morning America is not.

Meanwhile, another volume by Avakian is due this month from Open Court, an academic publisher in Chicago. Marxism and the Call of the Future: Conversations on Ethics, History, and Politics is a collaboration with Bill Martin, a professor of philosophy at DePaul University. Portions of it are available online here, here, and here.

At one point, they note that the slogan "Serve the People," made famous by the little red book, could be used -- with very different intentions, of course -- at a McDonald's training institute. This is, on reflection, something like Hegel's critique of the formalism of Kant's ethics. Only, you know, different.

A footnote to history: In an article a couple of years ago, Avakian recalled taking a course on Paradise Lost when he was a student in the honors program at Berkeley. The professor teaching that course was one Stanley Fish.

Proof that higher education in America is in the hands of wild-eyed radicals? Is Fish's academic empire-building just a way to create a Shining Path to postmodern communism? And what about this "John Milton" character? Is it just a coincidence that the leader of America's Maoists once studied the poetry of a man who was the minister of propaganda for a revolutionary movement (the Puritans) that seized state power and executed the rightful king?

I report, you decide.

Author: Scott McLemee (scott.mclemee@insidehighered.com)
