When I was just a few years out of graduate school I wrote a “treatment” for a television series to be called “The Young Professors.” The show tracked the adventures of three new assistant professors as they negotiated the ins and outs of life at Soybean State College, a medium-sized, teaching-intensive public institution somewhere in the Midwest.
This was in the mid-1970s, when series about young doctors and lawyers were big. So, knowing that nothing succeeds in commercial TV like a knock-off, and hoping for a source of income other than teaching summer school, I took a crack at putting college on the small screen. While I billed the show as “The Halls of Ivy” meets “The Mod Squad,” my project had no legs. My plots were not exactly ripped from the headlines, and although any resemblance between the characters that I depicted and actual persons living or dead was purely coincidental, or so I claimed, those characters weren’t going to keep viewers from changing the channel. A small flurry of interest from a local public broadcaster led nowhere, and now the yellowing, typed pages of “The Young Professors” sit in a folder with the rest of my juvenilia.
My efforts to “write what you know” notwithstanding, the classroom remains an occasional backdrop for television, and while the successful shows from the medium’s golden age, like “Our Miss Brooks,” “Mr. Peepers,” and “Room 222,” portray a strikingly unrealistic version of the high school experience, occasionally we’re treated to a media distortion of college life as well.
To be fair, we may pretend that the media tells it like it is, but we know very well that even reality television is far from real. Cop shows are notorious for misrepresenting life on the streets, and lawyer series fail to capture the highly nuanced world of torts and contracts. Life in the E.R. is not all high concept relieved by short commercial breaks. And Ralph Kramden was no ordinary bus driver. But college professors don't even get the kind of attention we lavish on Cosmo Kramer or Archie Bunker. TV hasn't brought us the Dead Deconstructionists' Society, or anything that looks at college from a faculty perspective. Woody Allen occasionally reduces Brandeis to a cultural stereotype in his movies, and there was a popular TV series about the life of a pre-med at the fictional University of New York, but don’t hold your breath for a dramatization of Harvard MBA’s-in-training, not to mention the Sarah Lawrence experience.
When it does notice higher education, television can treat it as comic. For example, there’s Ross, the museum paleontologist who once taught an evening class on “Friends,” and who does wind up on the NYU faculty. But more often than not, TV prefers a lurid look at campus life. Hudson University, a recurring venue on “Law and Order” and its spin-offs, is more a source of L&O’s killers and corpses than it is a provider of expert witnesses to testify in court.
The hijinks at Hudson are a far cry from swallowing goldfish or stuffing frat boys into a Volkswagen. At Hudson’s labs, researchers are killed by animal rights activists when they’re not too busy subjecting students to trials of dangerous experimental drugs without informed consent. Hudson undergrads regularly lose roommates to murder, and grad students occasionally kill their advisors, or are killed by them, often after having sex without informed consent.
In fact, many of L&O’s higher ed plots revolve around “Sex and the City University.” In one episode a grisly murder leads the police to a Hudson anthropologist who’s desperately trying to hide from his wife his unhealthy appetite for young boys. In another, the president of Hudson is bludgeoned to death by a serial murderer from Australia masquerading as an English don. Instead of acknowledging her misdeeds and copping to “man 2,” the fake Englit specialist must face nonrenewal of her contract and the disappointment of her lesbian lover, who also happens to be her dean.
In contrast to the gritty reality of the L&O classroom-cum-crime-scene, the 1950s half-hour sitcom “The Halls of Ivy” presents a bucolic Ivy College that is nothing like Hudson University. “The Halls of Ivy” starred Ronald Colman as William Todhunter Hall, the genial, urbane president of Ivy College, in the town of Ivy, somewhere in the Midwest, and Colman’s wife Benita Hume as Victoria Cromwell Hall, the president’s equally genial and urbane wife. After a successful radio run from 1949 to 1952, “The Halls of Ivy,” following the earlier lead of “Our Miss Brooks,” migrated to television in 1954.
“The Halls of Ivy” had great promise and strong backing (it was one of the most expensive TV series of its day). But while Eve Arden’s portrayal of Constance Brooks, everyone’s favorite high school English teacher, captivated viewers for four years, America wasn’t ready for a show about college foibles, and “The Halls of Ivy” lasted only one season. Although the show was created by one of the lead writers for the populist radio series “Fibber McGee and Molly,” its plots were too highbrow for the television audience. Indeed, many of the 38 episodes that CBS aired (seasons were longer then) could have been ripped from the headlines, provided they were the headlines of Inside Higher Ed. One TV history says the show flopped because it was too literate and lacked action.
On the other hand, the fact that "The Halls of Ivy" drew a national audience at all was itself a cultural phenom. America was undergoing one of its most intellectually deadly moments at the time, with universities the target of rabid red baiters. On top of that, “The Halls of Ivy” dealt with issues that were surely sensitive in 1954 and are still pressing and controversial in the academy today: racism (in one episode, a Chinese student is ostracized by classmates and runs away); athletics vs. education (in another, the cross-country star quits the team because track is taking too much time away from his studies; in a third, a top pre-med student wants to give up dreams of the O.R. to become a professional boxer). The show even dealt indirectly with gender stereotyping: while Vicky often plays the role of ditzy sidekick to her husband’s competent-administrator pose, a minute later she’ll put on her "Murphy Brown" persona and trade literate barbs with Toddy like any self-assured and hip chronicler of human foibles.
To be fair, whole episodes of “The Halls of Ivy” were devoted to issues over which the American public both then and now might be expected to yawn: the eroding faculty/student ratio; professors who aren’t publishing; a candidate for a named chair who might be a fraud; a department in danger of being closed because of low enrollments. There was even a half hour devoted not to drug trafficking, a problem that’s endemic at L&O’s Hudson University, but to traffic congestion on Ivy’s no-longer-sleepy campus, and Dr. Hall’s attempts to dodge a ticket he got from the college police. I doubt that even the readers of Inside Higher Ed would have the patience to sit through those segments today, compelling as they seemed to the show’s producers at the time.
But “The Halls of Ivy” did deal unashamedly with the facts of college life. While Ivy’s president is in no danger of being bludgeoned to death, in the pilot episode of “The Halls of Ivy,” President Hall nervously waits while the board of Ivy College votes on renewing his contract. Hall does get the nod, but a short three episodes later we see that the president is still insecure: Toddy and Vicky hurriedly throw together a meal without letting on that their guest -- the fussbudget chairman of the governing board -- has come to dinner one night too early. In other episodes, Hall sweet-talks an eccentric donor who demands that the college display one of her sculptures in exchange for a new gymnasium, and he must find a way to avert a mandatory faculty retirement that will be disastrous for the college. In two episodes Hall deals with the problems of what we now call returning or nontraditional students, but what in the blunt language of the 50s were simply old folks going back to school. Later in the season he intervenes to quell rumors that the new Latin professor is a sexual predator. And in another segment, Hall deftly confronts the problem of an honor student who is about to be expelled because she never finished high school. Toward the end of the show’s run, Hall puts on kid gloves to handle a gangster come to campus to find out why his nephew was kicked off the football team.
It wasn’t the limited-interest plots or the controversial issues that kept the audience of “The Halls of Ivy” coming back week after week. It was the finely tuned scripts and the ensemble acting, and the show’s theme song, a Whiffenpoof-style chorale that managed to enjoy some commercial success independent of the series. But while Dr. Hall had no trouble convincing the board to renew his contract, it soon became clear that Ronald Colman had done far, far better things than portray a college president, and the audience eventually dwindled to a point where it was too small to warrant CBS picking up the show’s option for a second season. As the redeemed Latin prof might have put it, “De gustibus non disputandum est.”
I was 10 when “The Halls of Ivy” aired, and though I knew nothing of academia I looked forward to the show each week. It still seems to me that academic life should generate at least as much public interest as infomercials for food processors or exercise equipment, but perhaps it was best that “The Halls of Ivy” bowed out gracefully. And while it was nice for me to dream that “The Young Professors” would one day generate as many spin-offs as “Law and Order,” which seems to be on one cable channel or another every hour of the day, I know that it’s best that my show never got off the ground. Even public-access cable channels aren’t ready for a series focusing on tenure, struggles over who gets the nice office, or endless committee meetings. “The Young Professors” would have been, in effect, a show about nothing, and as such it was no doubt far ahead of its time.
While I don’t expect television to portray college faculty with docudrama accuracy, I still believe there’s a role for higher education on TV beyond “Law and Order,” the “College Bowl”-style quiz, the “Book World” interview, or the CNN talking head. For some reason, the movies are more likely to get it right, with films that show professors as just like everybody else, only more so, like "The Blue Angel," "Good Will Hunting" and "A Beautiful Mind." Most academics are not pretentious boors or stuffed shirts like the one in “Annie Hall” who expounds vacuously on Marshall McLuhan’s theories while standing in a movie line, prompting an exasperated Woody Allen to bring out the real Marshall McLuhan to chide him. McLuhan was also an academic, by the way, though clearly not much of an actor. And few of my colleagues have the get-up-and-go of Indiana Jones, a movie professor who can turn into a villain-bashing superhero just by taking off his glasses and putting on a pith helmet.
Instead of reducing us all to cultural stereotypes, it would be nice to see shows in which professors contribute to the solution of crimes rather than their commission, have sex lives which are dramatically compelling without being criminally dysfunctional, or participate in witty sitcoms like “The Halls of Ivy,” or what I hoped “The Young Professors” might become. While there’s no “Law and Order” network, at least not yet, there are entire networks devoted to animal antics, do-it-yourself projects, and city council meetings. Surely a show about professors could find a niche on one of the 500 cable channels, even while being both too literate and lacking in action.
Dennis Baron is a professor of English and linguistics at the University of Illinois at Urbana-Champaign.
"We only wake you up for the important meetings." --N.Y. Yankee co-workers to George, on an episode of "Seinfeld"
In a recent New Yorker cartoon, a group of people is seated together at one end of a table with upraised hands. The caption reads: "It's unanimous: effective immediately, we spread out around the table."
One of the things that has always most fascinated me about meetings is the agreement that must already be in place before the meeting takes place. Surely, whatever else, agreement on the arrangement of seating itself! And yet another of the things that has always fascinated me about meetings is that absolutely nothing can be taken for granted about them. Not even seating, as confirmed by meetings that begin -- much like classes -- with everyone present bid either to spread out or to form themselves into a circle.
Who does the bidding? Not only the chair. Indeed, one could make a case that academic meetings are distinctive either because authority is regularly delegated (in departments, to the heads of other committees) or else always open to decentralized procedures of various kinds (often the establishment of sub-meetings). To whom is the bidding to be seated made? Not only to departments -- just to continue with this organizational "unit." Or rather not only to departments whose membership is fixed; for many years I was part of a department that fudged the question of whether the secretary could attend meetings and fumbled the question of whether adjuncts were part of the department by requiring them to leave before voting on anything began.
What about the meeting's agenda? Surely at least this is agreed upon? In theory, agreement is secured by publishing or circulating an agenda beforehand. In practice, though, consideration of anything during a particular meeting is often not limited to the agenda. Just as often, the meeting really heats up when something additional is added or something unforeseen erupts.
I still recall my very first department meeting. I don't remember whether it had an agenda. I do remember the moment when a senior member jumped up from his seat and began cursing the chair. The subject wasn't some new disciplinary perspective. (I had assumed this was what departmental meetings were about.) It was a private quarrel between the two men, involving the fact that a student had fired a gun into the living room window of one of them.
Later I found out that the senior man was a retired CIA agent. The person who told me this was himself a former CIA agent. What? How could I find myself in a department, two of whose members were CIA? I thought this was the sort of circumstance that happened in academic novels! These were the same novels, of course, in which meetings were mentioned, but not described.
If a department is not reducible to its meetings, are its meetings reducible to the department -- or is the department, in turn, reducible to its members? For many years in my own former department, I used to feel that we would have better meetings if we had a better department, and we would not have a different department until we had different members. In time, we did. But the members were arguably worse. However, the department meetings were occasionally better.
Now I'm not sure what to think about such meetings, except that when all is said and done, on the part of just about any group, meetings are inherently boring, forever driven by a few people who like to hold forth about curriculum planning or the latest Vision Statement from the administration. Everyone else -- especially the untenured -- feigns polite interest, unless something of personal consequence appears. If it doesn't appear, well, there is always the next meeting.
Once I knew a woman new to American academic life who professed herself stunned at the sheer tedium of so many meetings during which so little was accomplished. One day she was near tears. "Most of what's discussed is completely superfluous!" I blurted out in response: "Don't forget: the purpose of the meeting is to have another meeting." It was suddenly as if somebody else had uttered these words. Maybe somebody else once did to me -- after a meeting.
After a meeting: ah, this is a golden time, when frank talk can ensue with intimates about what really happened, how predictable it was that so-and-so said such-and-such, and whether -- given the administration, the chair, the union, the alignment of the planets -- the final vote would ultimately mean anything. Meanwhile, too bad there had to be a meeting at all, that exquisitely formal affair in which much was considered and little decided.
I once had a colleague who told of a friend who had counseled him thus: the best way to endure meetings was to smoke a pipe. People saw the pipe, not you. For better or worse, these days are now gone. We who must continue to meet today have fewer weapons at our disposal to do battle against the inevitable fatigue. Idle scribbling on a print-out of the agenda or the last minutes: Is this the promised end?
Of course I appear too cynical. Some issues genuinely demand meetings. Just don't ask me to give examples. Some meetings prove to be absolutely necessary. Blame me if it seems these particular ones are usually the most boring. Lastly, we must agree at least that a department simply cannot conduct itself without meetings. Yet is there no better, more efficient way for it to do its business?
I've heard of departments that try to do so exclusively through e-mail. This might work, especially in excessively factionalized departments. But then the department deprives itself of a chance to be visibly recreated as a collective whole. Such deprivation is not accomplished without peril. Another way to put the issue: the purpose of meetings is to have a department.
Members may teach alone. They usually research alone and they certainly write alone. But each belongs to a department (and through it, to an institution). Meetings are crucial in assuring members of their own common cause, ranging from curricular change to tenure votes.
We can bemoan meetings. We can't easily give them up. Consider the situation of adjuncts. Most departments are virtually forced to dream up occasions for adjuncts to meet, under the auspices of "professional development" or institution-specific "strategies."
Here the purpose of the adjunct-only meeting is not so much to have another meeting. (Many in attendance could be gone by next semester.) The purpose is to have the meeting (and therefore a "department" of sorts) in the first place! Are adjuncts thereby constituted as a group? Of course not. Not only do such matters of high moment as curricular change fail to concern them.
Adjuncts are excluded from even such lowly questions as the selection of new textbooks. Indeed, consolidating ideals of any sort -- apart from the scandal of there being adjuncts at all -- are not available to them; adjuncts are paid to teach, not to attend meetings about teaching -- or anything else. And yet, there must be meetings for them to be "encouraged" to attend, lest their professionalism itself be endangered. Of course, once they do, just once, another meeting is theoretically possible, and then all seems well.
No matter, somewhat paradoxically, that freedom from meetings, in fact, is the usual virtue of their lot regularly invoked by adjuncts themselves! Everyone is expected to smile knowingly. (Unless full-timers suspect sour grapes.) Nobody, it seems, is expected actually to like meetings. Just so, though, all are expected to acknowledge their abiding necessity, therefore to attend the next meeting.
In sum, one cheer for meetings. Readers will recognize my allusion to E.M. Forster's famous essay, "What I Believe," wherein he gives democracy a grudging two cheers. One is because it admits variety. The other is because it permits criticism. The departments of my experience admit variety, but far more grudgingly than in Forster's democracy. Worse, they permit little real criticism. Nothing is harder at a meeting than to raise some fundamental objection to an item or an issue, and then expect to have it thoroughly treated.
Forster's democratic model is Parliament, whose deliberations, I suspect, would put most academic departments to shame. Not only because Parliament abides the individual "nuisance" intent on exposing some abuse. Not only because Parliament is virtually mandated to "chatter and talk." But also because Parliament's "chatter," claims Forster, is "widely reported." In comparison, a department's deliberations are of course impeccably -- not to say, preciously -- private.
One cheer for meetings seems to me quite enough. There had better be one because, academically, we're all in it together, and we somehow manage to remain so (unless we're adjuncts) even through our mostly dreary, ill-starred meetings. Also, one cheer gestures at the existence of more departments than an individual can easily imagine, where variety actually speaks on a regular basis (even without tenure!) or where criticism remains an animating voice. Meetings, finally, are just one of those fateful things about academic life that most of us have to tolerate, when all is said and done (though preferably not at another meeting), like non-committal deans, rude office staff, and students who won't turn off their cell phones. Meetings we will always have with us. But please God, not next week, and not too late in the afternoon.
Terry Caesar's last column was about college presidents.
It's only October, but already you can feel the nip of holiday commercialism in the air. That's especially true at the big chain stores for cultural goods, where the public-domain Dickens books and the discount CDs of Bing Crosby are now on display, priming the pump for seasonal cheer.
And making your way to the checkout counter, you might notice a new title positioned for maximum impulse-buying convenience: a small book called Festivus: The Holiday for the Rest of Us, by Allen Salkin, published by Warner Books.
Late last year, Salkin wrote an article for The New York Times about how some people now celebrate the "Seinfeld"-spawned faux tradition. More precisely, they (or rather, we) invite friends over to Festivus gatherings in early December -- in lieu of the regular Christmas, Hanukkah, or Kwanzaa parties. In the course of his reporting, Salkin learned about the Festivus party my wife and I have held in early December for some years now. He gave me a ring to discuss it.
Evidently this interview took place not long after I had downed a large cup of strong coffee -- for I distinctly recall doing a prolonged riff on how Festivus was a postmodern variant of the British social historian Eric Hobsbawm's concept of "invented tradition." This is an exercise known as "bullshitting." You could read a book about it.
None of my improvisation, alas, ended up in Salkin's article. (Nor was there any reference to my effort to add to the Festivus traditions by making the song "Now I Wanna Be Your Dog" by Iggy and the Stooges into a carol.)
Anyway, a couple of months after the piece ran, Salkin was back in touch. He had just gotten a contract to do a book on Festivus, and wondered if I might write up certain aspects of my rant for inclusion as a short essay.
Well, the book is now out. And the essay is in there ... but now in a form much abbreviated. The reference to Eric Hobsbawm, for example, has been removed. (A grievous omission, though it's possible that the great man would prefer it that way.) Some degree of cutting is to be expected. But what did come as a surprise was, rather, the addendum: a sarcastic little item running alongside the piece, scoring easy points off its "overintellectualization" of the holiday. (As though that were not a tendency the essay itself is mocking.)
It seems, in short, like a very curious way to repay someone who contributed his work for free. Then again, free-floating rancor was always the dominant tone on "Seinfeld."
In any case, I've retained rights to the essay, and am running the full text of it here, in the hope that this version be considered definitive by scholars in the field of Festivus studies. If any...
Each year, my wife and I invite friends to gather around the aluminum pole -- or at least the place it would be, if we ever got around to buying one -- and discuss the True Meaning of Festivus. Of course it's gotten so commercialized now. But Festivus is here to stay. After long cogitation (too long, probably) I've concluded that there is more to it than an excuse for a non-religious seasonal holiday. Festivus is the postmodern "invented tradition" par excellence.
Admittedly, the phrase "postmodern 'invented tradition' " is something of a mouthful, but there is a more or less serious historical argument behind it. Let's see if I can make it with a straight face.
Once upon a time -- let's call this "the premodern era" and not get too picky about dates -- people lived in what we now think of as "traditional societies." Imagine being in a village where few people are literate, everybody knows your name, and not many people leave. A place with tradition, and plenty of it, right?
Well, yes and no. There are holidays and rituals and whatnot. As spring draws near, everybody thinks, "Time for the big party where we all eat and drink a lot and pretend for a few days not to notice each other humping like bunnies." (That one was a big hit even before New Orleans was on the map.) And yet people don't say, "We do X because it is our tradition." You do X because everybody else around here does it -- and as far as you know, they always have. Not doing it would be weird, almost unimaginable.
But then, starting maybe 300 years ago, things got modern. We tend to imagine that profound cultural dislocations (from war, industrialization, the global marketplace, yadda yadda yadda) only kicked in within recent decades. That's just because our attention spans are so short. Well before Queen Victoria planted her starchy skirt upon the throne, people were nostalgic for the old days.
And so, according to the British historian Eric Hobsbawm, they started inventing traditions from bits and pieces of the past. In the 19th century, for example, folks started singing "traditional Christmas carols" -- even though, for a couple of hundred years, they had celebrated the holiday with pretty much the same hymns they sang in church the rest of the year.
In short, if you say, "We do X because it's traditional," that is actually a pretty good sign that you are modern. It means you have enjoyed (and/or endured) a certain amount of progress. What you are really saying, in effect, is, "We ought to do X, even though we sort of don't actually have to." There is a world you have lost. Tradition is a way of imagining what it must have been like.
Postmodernism is what happens after you've been modern so long that "being modern" doesn't seem all that special -- but at the same time, you don't feel like "being traditional" is all it's cracked up to be, either. And you start putting things in quotation marks all the time.
Does that sound familiar? I could cite a bunch of stuff here about "the decline of metanarratives" and "the simulacrum." But if you're a "Seinfeld" fan, you've had a pretty good taste of pomo without the theory.
What makes Festivus a postmodern invented tradition is that it comes straight out of the mass media, without any moorings in a vague sense of reviving something lost or forgotten. Nobody ever felt a yearning to celebrate it. Frank Costanza just makes the holiday up, and all the "traditions" that go with it. It's hyper-individualistic -- the perfect holiday for the culture of narcissism. The beauty of the Festivus celebration is that it lays bare all the stuff that you have to squelch just to get through the holiday season.
We gather with family at Christmas or Hanukkah in order to recapture the toasty warmth of community and family. And because, well, we have to. So you'd best bite your tongue.
During Festivus, by contrast, all the vague hostility of enforced togetherness gets an outlet. You have a chance to air your grievances -- and to pin the head of the household to the floor, if you can. It's hard to get sentimental about an aluminum pole. But as long as there are midwinter holidays, the spirit of Festivus will fill the air.
Campus Sexpot is the title of a moderately sleazy potboiler from 1961 -- a tale of faculty-student relations at a small-town junior college where, it seems, the graduates of Peyton Place High School continued their educations. The author was one Dale Koby. A search of online bookdealers reveals that he went on to a fairly prolific career as a pulp author, turning out such memorable titles as Sex by Appointment, Lust on Wheels and Perverted Wife. Koby also edited some not-quite-scholarly editions of classic (or at least old) erotica. But most of his oeuvre is missing from the catalog of the Library of Congress. It lists him only as the author of A Teacher Confesses to Sex in the Classroom, a work of nonfiction from 1965 revisiting certain themes from his first novel.
Koby was a terrible writer. (Sample: "She thrust her breasts up at him with a pert sauciness.") But by 1962, he had an attentive readership in the California mountain town of Sonora, where he had worked, for about three semesters, as a high-school teacher. David Carkeet, who was for many years director of the MFA program at the University of Missouri in St. Louis, was a student at the school at the time. As he recalls in Campus Sexpot: A Memoir, published by the University of Georgia Press, it was not hard to figure out the real-life identities of Koby’s characters.
Chances are, Carkeet would have studied the novel closely in any case, even apart from the interesting questions it raised about the relationship between life and art. He was 15 when the book appeared, and glad for whatever information on sex he could find -- such as that information was, in a book that carefully avoided descriptions of everything below the waist. "For all the genital detail we’re going to get," notes Carkeet about one of Koby’s characters, "Linda might as well be a mermaid."
Carkeet is the winner of the award for creative nonfiction from the Association of Writers and Writing Programs. That information is announced on the cover, just above a high-school newspaper photo of the author standing on a chair, kissing a much taller girl under the mistletoe. Any writer looking back at adolescence must, of course, face the complications of embarrassment. (It is not just one part of the memories, but part of the writer’s present toolkit: There is a skill involved in handling embarrassment, in using it to carve a shape out of the past.) Inspired by the insight that the trash that once fascinated us gives the quickest access to the identities we've shed, Carkeet uses Campus Sexpot as a way to excavate memories otherwise too disobliging to recall.
"When I read the book now," he says, "its verbal avoidance of body parts with which I am actually familiar returns them to a thrilling condition of mystery. I don’t have to make an effort to enter this frame of mind. Instead, the words in Campus Sexpot that lead up to a saucy scene fire ancient neurons, and before I know it, I am transported into a state of salacious ignorance."
Any work of pulp-era smut consists of two sorts of writing. There are "the good parts," which the reader revisits until they become very familiar, and the rest, which is just barely tolerable the first time. "The chief device for advancing the story," writes Carkeet, "is not action but constant banal dialogue; the reader of a Koby novel longs to enter it not in order to have sex but in order to tell everyone to shut up."
Umberto Eco once made a similar distinction regarding the semiotics of pre-video porn films -- which were, as he put it, "full of people who climb into cars and drive for miles and miles, couples who waste incredible amounts of time signing in at hotel desks, gentlemen who spend many minutes in elevators before reaching their rooms.... To put it simply, crudely, in porn movies, before you can see a healthy screw, you have to put up with a documentary that could be sponsored by the Traffic Bureau."
Carkeet reproduces the "good bits" from the original Campus Sexpot in bold. This is not just a typographical device, or a convenience to the reader, but an index of how much they had burned themselves into his memory. One passage in particular seems like a key to understanding the effect of the novel on him. (It is also a good example of Koby’s prose at its most fine-honed.) In it, a professor named Paul Skell comments on a student, Linda Franklin, who is the titular campus sexpot:
"Hips made for the act of love," Paul muttered, "and ideally designed to accommodate a pair of hot pants. If she's a virgin, I'll donate half of my salary this year to a home for wayward girls. I've spent my life being interested in girls with hot pants. I've studied them from every angle. I believe I know all the symptoms, and Linda Franklin has them."
This came as a revelation. The original of "Paul Skell" -- easily recognizable from his description to those who attended his school -- was the dull and high-minded pedant who taught Carkeet’s freshman English class, and a leader in the local DeMolay assembly. For those not in the know, the DeMolay order is the male youth auxiliary of the Freemasons. It provides "a regimen of enforced dignity for boys at an undignified age," as Carkeet writes, "and the primary engine of uplift is a vast body of ornate ritual that reads like the Boy Scout oath as revised and expanded by Samuel Johnson." To imagine that a severe and proper adult might have "spent [his] long career" studying hot pants "from every angle" was a decisive moment in Carkeet’s sentimental education.
My hope, as a reader, was that Carkeet would track down the author of Campus Sexpot and find out what he was doing now. Koby wrote and edited for the pulp-porn industry up through the late 1960s, and Carkeet tracks down some of these subsequent efforts. ("Appointment by Sex treats a phenomenon I was unaware of when I was growing up -- supermarket cashiers doubling as lesbian prostitutes who meet the needs of shopping housewives neglected by their husbands.") But his creative output declined after 1968, for reasons that are anybody’s guess, and he died sometime in the 1980s.
Carkeet finds that, before arriving in Sonora, Koby had been a college instructor in San Jose, and also had affairs with two students at another high school. "There is no reason to doubt the report of A Teacher Confesses to Sex in the Classroom," writes Carkeet, "for it is hardly a self-aggrandizing story.... The author portrays himself as predatory and manipulative but shows no more contrition than one finds in the Roger Miller song of the period, 'Dang Me.' He plays mind games with his young charges, his favorite being exaggerated devotion even as the affair is ending, just to see what reaction he can get."
Insofar as the original Campus Sexpot may be said to have had a plot, its denouement occurs at the courthouse, where Koby has his characters gather for a final melodramatic reckoning. And in real life, too, Sonora had a courthouse, where Carkeet’s father served as the town’s judge. The final chapter of Campus Sexpot: A Memoir is a portrait of the author’s old man -- a recognition of his failings, but also a tribute to him as someone with the moral center that the pulp novelist lacked.
It's not for readers to determine the justice of that conclusion, of course: We know only as much as Carkeet tells us. But as an ending, it certainly follows from the book's effort to work out the parallels and the divergences between fact and fiction.
It also flows from the memoiristic logic of deriving insight from embarrassment. The 15-year-old reader of the novel grew up to be a novelist and a writing teacher himself. No doubt he had a somewhat romanticized version of Koby at the back of his mind -- remembering him as a freewheeling beatnik living outside respectable society, and so on. How humiliating to discover that, all along, the more complex and interesting figure may have been your real father, not the surrogate.
In a recent entry at his blog Unlocked Wordhord, Richard Scott Nokes, an assistant professor of English at Troy University, recalls how he and some friends let off steam in graduate school a few years ago by making up an imaginary theorist, Pierre Mourier, to discuss on a lit-student listserv. (My thanks to Ralph Luker for pointing this item out.) Nokes says that a few people on the list who weren’t in on the joke began to pontificate on Mourier’s work -- even correcting the title of a translation of one of his papers.
It’s a good story. An edifying one, even: a cautionary tale about the danger of craving the au courant, even at the cost of making yourself ridiculous. But if you go to the archives of the departmental listserv in question, a slightly different picture emerges. Searching "Mourier," you find no messages by unwary poseurs dropping Mourier’s name. One or two puzzled souls do confess that they’ve never heard of the author of Murmurs in the Cabaret: Finding Language through Noise (1951). Everybody else, however, is plainly goofing.
On a more sober note, we should perhaps consider the case of Henri Mensonge, that oft-neglected Franco-Bulgarian genius. He can most accurately (if also most confusingly) be labeled a proto-post-structuralist. Mensonge is no mere online ghost. His work was the subject of a compact book by the late Malcolm Bradbury.
The Library of Congress has assigned a call number to My Strange Quest for Mensonge: Structuralism’s Hidden Hero (Penguin, 1987) that places it on the same shelf as Bradbury’s comic novels about British university life. But the Dewey system treats it as a work of philosophy. (I came across it in a public library, by chance, while looking for something about Jacques Maritain.) The confusion is exemplary. I suspect that Mensonge, and certainly Bradbury, would be pleased.
Bradbury notes that a translation of Mensonge’s treatise La Fornication comme acte culturel (which had originally appeared in either 1965 or '66) would be forthcoming "from the West Coast Marxist-Feminist Gay Collective Press, under the title Sex and Culture, with a lovely cover, in their ‘His and Her-Meneutics’ series."
Sadly, this edition never appeared. The Anglophone reader has no choice but to consult Bradbury’s volume. It comes with an afterword by Michel Tardieu, the professor of structural narratology at the Sorbonne best known for his work reducing the plots of both Pride and Prejudice and War and Peace to the same quadratic equation. A note indicates that Tardieu’s essay was translated by Bradbury’s close friend David Lodge. (Readers of Lodge's novel Small World may recall that Tardieu makes a brief cameo.)
Little is known about Mensonge himself. Early in his career, he served as a teaching assistant to Roland Barthes, but certain passages in Mensonge’s work suggest that it must have been a difficult relationship. In 1968, Barthes published his famous essay announcing, as its title had it, "The Death of the Author." But by that point, Mensonge had already anticipated Barthes’s argument and carried it one step further -- erasing nearly all traces of his own existence, to a degree that other reclusive writers might envy. We have, for example, a portrait of Thomas Pynchon from his high school yearbook, while the only surviving photograph of Henri Mensonge shows the back of his head. As he once put it: "I have sought a level of absence that is so complete it cannot be mistaken for anything other than it is."
He even sought to keep his name off the spines of books by or about him. In this age of the theorist as "academostar," such avoidance of celebrity is refreshing. And yet Mensonge is the man who, in Bradbury’s words, "out-Barthesed Barthes, out-Foucaulted Foucault, out-Derridaed Derrida, and out-Deleuzed-and-Guattaried Deleuze and Guattari." He opened (in Mensonge’s own words) "a new field of desacralizing inquiry." His influence (if that is the word for it) rests upon a single major work, though scholars writing in The Mensonge Newsletter have identified a number of anonymous texts that he may have published, including several unusually thoughtful restaurant reviews.
La Fornication comme acte culturel appeared in advance of Jacques Derrida’s first wave of publications (the three cornerstone works of 1967). But he had the misfortune to publish it in Luxembourg, rather than Paris -- and with an undistinguished publishing house that, as Bradbury mentions, "subsequently proved to be a very ineffective cover for the international drug trade." The book was printed on "porous paper of a kind conventionally used for purposes quite other than literary and philosophical dissemination." Copies of the first edition vary from 39 to 115 pages, reflecting a certain lack of attention on the part of the binder.
Nor was it well distributed. "Perhaps the title was misleading," suggests Bradbury. "Certainly it ended up in the kind of bookstore specializing in erotica and in genital technology of the more complicated kind." Even so, the book’s radical argument found a warm reception. Walking past any given café, one heard French intellectuals enraptured by "a constant intense discussion of La Fornication."
Bradbury writes about La Fornication in much the same way Francis Fukuyama might discuss the Home Shopping Network. Mensonge had written "the last book, the book that completes and concludes the shelf of modern thought."
He began, of course, with Saussure’s model of the sign. Then Mensonge pushed the arbitrary nature of the link between signifier and signified harder than the structuralists ever had. The absolute distance between them (between signifier and signified, that is, not between the structuralists; but them too, probably) made communication impossible.
This was not theoretical recklessness on Mensonge’s part, for the implications clearly troubled him. Either "we have everything to say, and nothing to say it with," as he wrote, "or alternatively the opposite. Most of my philosophical contemporaries choose the one or the other, but quite frankly this looks like trouble to me."
But he refused half measures. Mensonge went further -- challenging what he called "the coital cogito," that implicit structure underlying "the non-existent ego of the scholar him/her non-self." For is it not commonplace to refer to intellectual "excitement," to speak of the need to "lay bare the thing as it is" -- so that thought might "penetrate" reality? (As Mensonge sums it up with admirable lucidity: "Alas, the 'thing' as it is in actuality is not, for there is no thing, and actually no actuality for it actually not to be not in.") Nor is it sufficient to deconstruct what Mensonge called Lacan’s "great phallusy." One must also question "the metaphysical vagina."
All of this, mind you, in work published around the time Judith Butler was a kid watching "Batman."
As it happens, Mensonge’s book appeared just as Susan Sontag was publishing her famous essay "Against Interpretation," with its memorable closing line: "In place of a hermeneutics we need an erotics of art." (I noticed this coincidence, not Bradbury. Please footnote accordingly.) And Sontag’s aphorism was itself a tribute to Mensonge’s former mentor, Roland Barthes, who would later go on to write S/Z and The Pleasure of the Text.
It is remarkable how effectively Mensonge distanced himself from B/S, even before they had articulated their own ideas. Clearly he had thought things through with a certain obsessional rigor. "Sex is difficult enough in bed," he wrote, "as my philosophic contemporaries should know. To try to perform it in the bookcase is hubristic beyond belief. In any case it is no use pretending we are at a whorehouse when we know we are at a funeral. Try the book any way you like. It will show no sign of enjoying it, and will certainly not give a squeak back."
The occasional reference to Henri Mensonge now appears, tucked away in the bibliographies of those literary scholars who practice annotation with tongue, as it were, in chic. And the index that Bradbury prepared for the book (embedding a few additional jokes in his cross-references) is cited in the library-science literature. But My Strange Quest remains the one study of Mensonge available -- hence, by default, definitive. (If the author neglects to mention that the theorist’s last name means "lie" in French, that seems a trivial oversight.)
Bradbury notes that other scholars tried to dissuade him from writing about La Fornication -- "a work of such profound intellectual subtlety, linguistic density, and textual disorder that there is no way even of translating it, never mind understanding it, and that only a person of the most limited imagination, and probably the most unmitigated stupidity as well, would even dream of undertaking the task," he writes.
"Fortunately for the common reader," he adds, "there are just one or two of us who possess exactly those qualifications and are prepared to use them. Or maybe I am too modest. Just casting one quick eye around the worlds of journalism and scholarship, I realize there could be hundreds."
If intelligent design gets taught in the college classroom, here are some other propositions we can look forward to:
Was Shakespeare the author of all those plays? Competing theories suggest that the Earl of Oxford, Francis Bacon, or even Queen Elizabeth herself penned those immortal lines. You be the judge. Henceforth, the prefaces to all those editions by “William Shakespeare” should be rewritten to give equal time to the alternate-authorship idea.
Does oxygen actually support that flickering candle flame, or is an invisible, weightless substance called phlogiston at work? First suggested by J. J. Becher near the end of the 17th century, the existence of phlogiston was eventually pooh-poohed by supporters of the oxygen hypothesis, but, as they say in the legal profession, the jury’s still out on this one.
Drop a candy bar on the sidewalk, and come back to find ants swarming all over it. Or put a piece of rotten meat in a cup and later find maggots in it, having come out of nowhere! This is called spontaneous generation. Biologists eventually decided that airborne spores, like little men from parachutes, wafted onto the food and set up shop there, but does that make any sense to you?
In the morning, the sun rises over the tree line, and by noon it’s directly overhead. At night, as the popular song has it, “I hate to see that evening sun go down.” Then why do so many people think that the earth moves instead of the sun? Could this be a grand conspiracy coincident with the rise of that Italian renegade Galileo, four centuries ago? Go out and look at the sunset! As they say, seeing is believing.
Proper grammar, the correct way of speaking, the expository essay model -- how rigid and prescriptive! There are as many ways to talk as there are people on this good, green earth, and language is a living organism. Or like jazz, an endless symphony of improvisation. No speech is wrong, just different, and anyone who says otherwise is just showing an ugly bias that supports white hegemony.
“History is bunk,” declared the famous industrialist and great American Henry Ford. All those names and dates -- why learn any of that when not even the so-called experts can agree on exactly what happened? Besides, most of those historical figures are dead by now, so what’s the point? From now on, all history departments must issue disclaimers, and anything presented as a narrative will be taught in the creative writing program.
Speaking of which, creative writing itself has long been controlled by a bunch of poets and fiction writers who determine who wins what in the world of letters. But who really knows whether the latest Nobel Prize winner is any better than, say, that last Tom Clancy novel you read? It all boils down to a matter of taste, doesn’t it?
Or what about that "Shakespeare"? Was he/she/it really any better than the Farrelly brothers? Let’s all take a vote on this, okay?
David Galef is a professor of English and administrator of the M.F.A. program in creative writing at the University of Mississippi. His latest book is the short story collection Laugh Track (2002).
I had told him about it, but it wasn’t until I’d been called for an interview that my non-academic boyfriend started to get nervous. I drove myself home from the airport and left messages on his answering machine that night, the next day and the day after that. When he called me three days later, it sounded as if he was calling from miles away. By the time I had put the phone down, he was on his way over to pick up the few things he’d left at my apartment. After I cried, I lay in bed that night, hands and feet unfeeling, staring at the ceiling. I guess I’d known that interviewing out-of-state would put pressure on us; what I didn’t know was that it would immediately end the relationship. Six months of dating was just not enough time to build a relationship that we could both hold on to. I didn’t land a full-time position until 18 months later. In that time, I refused to date anyone.
I simply could not put another kind, interesting, funny man through this horrible process. In the end I landed in the Midwest, with only my dog for company. Although I immediately made friends on campus and off, I found it difficult to consider dating. First, I was not in a tenure-track position. In my mind’s eye, this meant the same process as before: three years on contract with this university, then moving on. Why bother starting up something that might end up in heartbreak? Yet close girlfriends here and in my original home state urged me to “get in the game” again -- if only to keep from hiding out. I finally did allow myself a few experiences.
I’ve been on a coffee date with an adjunct in my department. Although we are both in the humanities, our similarities end there. A six-year age difference made me feel ancient. And his constant reference to an ex-girlfriend who wasn’t really an ex- made me wary. Uninterested, I didn’t follow up on his phone calls, but instead e-mailed short notes that bordered on the professional. He has since drifted back into his muddled long-distance relationship -- although I hear that he recently asked our department secretary about other single women at the university.
Urged by my local lady friends, I went on a movie and dinner date with a man who drives trucks for the garbage company. Nervous, I dressed up too much and felt out of place in the movie theater in hose, a dark skirt and sweater. We chatted about nothing special that night -- a nice thing for a woman who’d been out of circulation for some time, but I could not find much to hold on to. He talked about the Navy and his route; I talked about classes and my family. After long pauses and awkward moments, I had that dreaded moment about halfway through the evening when I wished I’d been at home watching television with my dog. This man’s deep interest in marriage and my transient status didn’t help. By the end of the night, I stepped from his Pontiac feeling a bit sad. On the phone the next day, I got honest and told him that I didn’t think we had enough in common. When pressed, I said that I’d also feel guilty keeping him from his quest for a wife. Later he told friends in common that he agreed it was the best thing to do; he didn’t see that much in me. I smiled and nodded my head. He was absolutely right.
Academics frequently think they’re “all that” as my students like to say. And that sense of entitlement gets us into all sorts of trouble. Many of us, including me, are self-centered. That makes a true peer relationship difficult. If a professor also needs ego-feeding, there will be trouble in their partnership outside the office.
"It’s as if he wanted me to applaud for him every night when he came home," confessed my colleague’s ex-wife. "Believe me, I was impressed by his dissertation, his presentations, his research, his papers -- even his thoughts -- but at some point I had to ask myself, ‘What happened to me?’” She is now dating a corporate executive in the area. "It’s just so much easier," she told me over a latté, “I finally feel like I count for something.” Others I’ve interviewed have confessed that professors have a way of making them feel like “mere mortals” rather than peers. And many of these non-academics have more than one college degree, a vast life experience, and vivacious personalities. Although not shrinking violets, they simply could not make a place with a professional who either were tremendously accomplished -- or had an inflated view of his or her worth.
It seems as if relationships between academics and corporate-types have some hurdles to overcome -- yet a number of my faculty-buddies swear by them. “When I finish my job, I want to leave work at work,” says one business instructor I know. When he was married to another instructor, they talked incessantly about their jobs. A year after their relationship crashed, he confessed that he was only interested in dating “non-academics.” He felt relieved that he could start building a life outside of academia. “Don’t get me wrong,” he told me, “I love my job. I just want to stop thinking about it at some point.” He is currently dating a woman who owns a small business.
An accomplished Ph.D. in English rhetoric married his longtime girlfriend, who used to wait tables. “She’s real-life educated,” he told me. Her life experience and intellectual curiosity count for a lot. When he comes home to chat about Deleuze and Spinoza, she holds her own -- and quotes the Dalai Lama, which enriches the conversation. My professor friend has a standing commitment to dedicate Sunday to their relationship (and to her two children from a previous marriage) -- and he keeps late-night grading to a minimum. Although they technically have a “trailing non-academic spouse” type marriage, it feels like a peer relationship to both.
A woman friend of mine who teaches humanities at a community college believes that her non-teaching husband brings something unique to their relationship. Because he is in administration in an academic setting, he understands the general issues. He’s also mastered the art of knowing -- truly knowing -- his wife. When she straggles in from a long, frustrating department meeting with a heavy bag of papers, he often says, "You look stressed. Is there anything I can do?" On other occasions, he trots off to the kitchen to make dinner for them both without comment. Some days, when she gets home sooner than he does, she sets in on the household chores, knowing that he will be tired when he gets home. According to her, they have a match made in heaven.
Another advantage is that non-academics have more regular hours -- which may encourage an academic to adopt a more normal working schedule. Many of my friends, tenured and adjunct, have confessed that knowing their significant other is going to be home in three hours forces them to manage their time more wisely. And a non-academic love often encourages academics to make friends outside of the ivory tower -- which can be a nice balance to a bookish, research-dominated life.
For some, however, this match has problems. A tenure-track professor I met told me she hated dating outside of academia -- if only because she did not feel valued. “I dated a municipal court judge who pitied me the whole time. Even though I was presenting at conferences, lecturing, and publishing, he simply couldn’t understand how someone would work for so little money.” Fighting a feeling of “less-than,” she finally stopped dating him. She simply got tired of defending her career.
“He thinks that when I’m presenting at a conference, I’m vacationing,” a colleague confided. Her husband, a contractor, resented her university-funded travel; this difference of opinion brought much tension to the relationship. She also told me that he does not understand her at-home work. “Oh, I forgot. You’re not working today,” is his comment, accompanied by requests to pick up his dry cleaning and do the grocery shopping. The time between semesters becomes a battle as he pressures her to make repairs on their classic Victorian house while she is desperately trying to read new textbooks and rework syllabi, course outlines, and assignments -- all while writing to publish. Unless they have owned their own small business, non-academics may not understand the idea of “working” while at home. And the resulting tension can be devastating to a relationship. This is not the only place where academics and their non-academic spouses do not agree. Making money (or not) and how one defines “success” are big concerns.
A liberal arts professor I know dated a man who worked as a marketing manager with a large, successful printing company in the area. When she complained about having papers to grade, he simply answered, “Why don’t you get a job where you don’t have to do all that scut work?” As she sat there, stunned, a handful of student work in her lap, he continued, “Hell, you’d make more money in advertising or something like that anyway.” Not only did she feel unsupported, but she also sensed that he did not understand that she did not teach for money -- or because she had no other skills. When interviewed, she told me that she chose this field because she wanted to live the values she’d been “spouting for a decade.” After studying Buddhism and considering “right livelihood,” she decided she wanted to work at something that contributed to (rather than broke down) society. And a sense of being able to give back (rather than take) helped her through some non-tenure-track years. For successful non-academics, status may be measured by a bank account -- which frustrates academics. The couple’s value system is simply mismatched -- and it is only with the greatest effort that the difference may be bridged.
But opinion about academic and non-academic spouses seems to be split squarely down the middle. I have colleagues past and current who swear by their academic loves. A strong bond often develops among professors -- to some it makes sense to seek a partner who suffers and celebrates the same issues. For most it is not just the idea of “summers off,” but a deeper match when it comes to the rhythm of the academic lifestyle. The demands of the job, combined with research and papers, can be daunting. And having a significant other who really understands can help pave the way to a couple’s success. Academic partners also seem more focused on career -- and often have similar interests when it comes to politics and social lives.
“My first husband never wanted to go out to the theater or to the symphony. And I suppose it could be coincidence, but my second husband [an academic] not only loves those things, but also encourages me to see independent films, visit the local art museum and go to poetry readings.” My friend, a foreign-language instructor, is grateful for a companion on these visits. And although a non-academic spouse could have these interests, it is sometimes more likely that an academic spouse will have them. Academics are big readers, too. Those who read books, papers and publications in their own field often also read for enjoyment -- or simply to broaden their horizons. Not only can this be a source of inspiration and conversation, but it also indicates an interest in things outside of one’s experience.
Understanding and helping manage the pressures of academia becomes easier when you’re already “in the soup” with a love partner. A history professor I know confessed that even though his wife’s Ph.D. was in another area, she was the perfect partner when it came to timing, workload and hours. “She is able to read my needs just by looking at my face and the stack of papers on my desk,” he told me. “It’s such a relief not to have to explain over and over again why I have to take three hours after dinner to draft an outline for a chapter of my dissertation. She’s already been there.” The academic spouse not only understands at a deeper level, but can provide support in a way that non-academics can’t. Two humanities professors I know are co-authoring a paper; they are husband and wife. One confided that the ability to combine their brainpower in this way makes their relationship “that much more complete.”
Although reading one another’s paper or dissertation does not seem like a common event (or even an expected one), the support is there. One poet I know often runs his work past his wife before he talks to his editor; although her specialty is social work, she often catches small inconsistencies -- and, even better, she really understands his body of work and how it reflects the man. Having a spouse or loved one at a conference or workshop not only can be a bonding experience, but can also lead to discussions that may result in a much-needed lesson for class, or a paper to be presented at a later conference. With academic couples, the sounding board is already there -- and, as a friend of mine likes to say, “up to speed.” In some cases, a comparable level of education can provide a foundation for a successful relationship. Yet there may be tensions. The ABD may feel that a Ph.D.-toting spouse is a constant reminder of what they have yet to accomplish. And finding jobs that allow a couple to stay together is a near-impossible task.
A new colleague took a position with our university four weeks before the semester started. His wife, on contract to teach at a campus 2,000 miles away, is now desperately trying to land a position in the same area. My colleague told me that they had been apart for three months -- with another seven to go -- if they’re lucky. Or it may be another academic year before they’ll be able to live together again. “We call every night -- but it’s not the same,” he said. “I love her.” But his voice is wistful and he seems confused. I sense that he feels isolated. Although he has cultivated some acquaintances in his new town, he does not feel as though his experience is complete without his life partner. Single women academics often don’t feel comfortable socializing with a man who is dedicated to a “ghost-wife,” and he often feels like a third wheel at parties where academic couples meet. The long-distance academic marriage is often an awkward union at best. At its worst, the situation can kill the marriage.
One instructor friend who specializes in distance learning says that personality, priorities, values and ability to communicate are the deal-breakers -- not what one does for a living. I think that she is right. Hasty judgments about who makes the best husband or wife can’t be made. Just as there are some absolute clods in academia, there are some wonderfully accomplished, smart and interesting people working for government or private industry. With friends in and outside of academia, I feel as though I am taking advantage of all that the world has to offer. Cutting one group out seems overly focused and elitist. And in our nation, which seems to value entrepreneurialism and individualism at all costs, narrowing the field of human contact seems unwise to me.
Shari Wilson, who writes Nomad Scholar under a pseudonym, explores life off the tenure track.
Inspired less by Philip Seymour Hoffman’s impressive turn in Capote than by the condition of my checking account, I have been considering the idea of turning out a true-crime book -- a lurid potboiler, but one fortified with a little cultural history, albeit of an eccentric kind. The idea has been stewing for a couple of months now. My working title is BTK and the Beatnik. Here’s the pitch.
In late August, The New York Times ran a short profile of Colin Wilson, the last surviving member of the Angry Young Men -- a literary group that emerged in Britain during the mid-1950s at just about the time Allen Ginsberg and Jack Kerouac were interrupting the presumed consensus of Eisenhower-era America. Kingsley Amis, Philip Larkin, and John Osborne counted as the Angries’ most prominent novelist, poet, and playwright, respectively. And Colin Wilson -- whose quasi-existentialist credo The Outsider appeared in May 1956, when he was 24 -- was taken up by the press as the Angry thinker.
"The Outsider," wrote Wilson, "is a man who cannot live in the comfortable, insulated world of the bourgeoisie, accepting what he sees and touches as reality. He sees too deep and too much, and what he sees is essentially chaos." He developed this theme through a series of commentaries on literary and philosophical texts, all handled with a certain vigorous confidence that impressed the reviewers very much.
As, indeed, did Wilson’s personal story, which was a publicist’s dream come true. Born to a working-class family, he had quit school at 16, taken odd jobs while reading his way through modern European literature, and camped out in a public park to save on rent as he wrote and studied at the library of the British Museum. Wilson was not shy about proclaiming his own genius. For several months, the media lent him a bullhorn to do it. He posed for photographs with his sleeping bag, and otherwise complied with his fans' desire that he be something like a cross between Albert Camus and James Dean.
The backlash was not long in coming. It started with his second book, Religion and the Rebel, which got savage notices when it appeared in 1957, despite being virtually indistinguishable from The Outsider in subject and method. “We are tired of the boy Colin,” as one literary journalist is supposed to have said at the time.
Roundly abused, though no whit abashed, he kept on writing, and has published dozens of novels and works of nonfiction over the intervening decades. The Outsider has never gone out of print in English; it developed a solid following in Arabic translation, and appeared a few years ago in Chinese.
The piece in The Times came in the wake of his recent autobiography, Dreaming to Some Purpose -- a book revealing that Wilson is still very confident of his own place as perhaps the greatest writer of our time. This is a minority opinion. The reviews his memoir got in the British press were savage. So far it has not received much attention in the United States. Wilson has a cult following here, and the few scholarly monographs on his work tend to have that cult-following feel.
Perhaps the most forceful claim for his importance was made by Joyce Carol Oates, who provided an introduction to the American edition of his science-fiction novel The Philosopher’s Stone (1969). Oates hailed Wilson for "consciously attempting to imagine a new image for man ... freed of ambiguity, irony, and the self-conscious narrowness of the imagination we have inherited from 19th century Romanticism."
Her praise seems to me a bit overstated. But I have a certain fondness for that novel, having discovered it during Christmas break while in high school. It set me off on a fascination with Wilson's work that seems, with hindsight, perfectly understandable. Adolescence is a good time to read The Outsider. For that matter, Wilson himself was barely out of it when he wrote the book. Although now a septuagenarian, the author still displays the keen egomania of someone a quarter his age.
Just as The Times was running its profile of the last Angry Young Man, the sentencing hearing for Dennis Rader, the confessed BTK killer, was underway in Kansas. News accounts mentioned, usually in passing, his claims that the string of sadistic murders he committed over the years were the result of something he called "Factor X." He did not elaborate on the nature of Factor X, though reporters did often note that the killer saw himself as demonically possessed. (He also referred to having been dropped on his head as a child, which may have been one of Rader’s cold-blooded little jokes.)
But in a television interview, Rader indicated that Factor X, while mysterious, was also something in his control. "I used it," he said.
A jolting remark -- at least to anyone familiar with Colin Wilson's work. Over the years, Wilson has developed a whole battery of concepts (or at least of neologisms) to spell out his hunch that the Outsider has access to levels of consciousness not available to more conformist souls. Something he dubbed "Faculty X" has long been central to Wilson’s improvised psychological theories, as well as to his fiction. (The Philosopher’s Stone, which Oates liked so much, is all about Faculty X.)
As Wilson describes it, Faculty X is the opposite of the normal, dulled state of consciousness. It is our potential to grasp, with an incandescent brilliance and intensity of focus, the actuality of the world, including the reality of other times and places. "Our preconceptions, our fixed ideas about ourselves," as Wilson puts it, "means that we remain unaware of this power." We trudge along, not engaging the full power of our mental abilities.
Most of us have had similar insights, often while recovering from a bad cold. But the contrast between mental states hit Wilson like a bolt of lightning. In his recent memoir, he writes, "The basic aim of human evolution, I decided, is to achieve Faculty X."
A few artists are able to summon Faculty X at will. But so, in rather less creative form, can psychopathic killers. For that is the stranger side of Colin Wilson’s work -- the part overlooked by The Times, for example, which repeated the standard Wilsonian claim that he is a philosopher of optimism.
Cheerful as that may sound, a very large part of his work over the years has consisted of books about serial murderers. They, too, are Outsiders -- in revolt against "a decadent, frivolous society" that gives them no outlet for the development of Faculty X. Such an individual "feels no compunction in committing acts of violence," as Wilson explains, "because he feels only contempt for society and its values."
These quotations are from his book Order of Assassins: The Psychology of Murder (1972), but they might have been drawn from any of dozens of other titles. Beginning with the Jack the Ripper-esque character Austin Nunne in his first novel, Ritual in the Dark (1960), Wilson has populated his fiction with an array of what can only be called existentialist serial killers.
In these novels, the narrator is usually an alienated intellectual who sounds ... well, quite a bit like Colin Wilson does in his nonfiction books. The narrator will spend a few hundred pages tracking down a panty-fetishist sex killer, or something of the kind -- often developing a strong sense of identification with, or at least respect for, the murderer. There may be a moment when he recoils from the senselessness of the crimes. But there is never (oddly enough) any really credible expression of sympathy for the victims.
The tendency to see the artist and the criminal as figures dwelling outside the norms of society is an old one, of course; and we are now about 180 years downstream from Thomas De Quincey’s essay "On Murder Considered as One of the Fine Arts." But there is something particularly cold and morbid about Wilson's treatment of the theme. It calls to mind the comment film critic Anthony Lane made about Quentin Tarantino: "He knows everything about violence and nothing about suffering." It comes as little surprise to learn that a girlfriend from Wilson's bohemian days recoiled from one of his earliest efforts at fiction: “She later admitted,” he writes, “that it made her wonder if I was on the verge of becoming a homicidal maniac.”
So did BTK have Wilson’s philosophical ruminations in mind when he was “using” Factor X to commit a string of sadistic murders? Did he see himself as an Outsider – a tormented genius, expressing his contempt for (in Wilson’s phrase) “the comfortable, insulated world” of modernity?
Circumstantial evidence indicates that it is a lead worth pursuing. We know that Rader studied criminal justice administration at Wichita State University, receiving his B.A. in 1979. Wilson’s brand of pop-philosophizing on murder as a form of revolt (a manifestation of “man’s striving to become a god”) is certainly the kind of thing an adventurous professor might have used to stimulate class discussion.
And it would be extremely surprising to find that Rader never read Wilson’s work. Given the relatively small market for books devoted entirely to existential musings, Wilson has produced an incredible volume of true-crime writing over the years – beginning with his Encyclopedia of Murder (1961) and continuing through dozens of compilations of serial-killer lore, many available in the United States as rather trashy paperbacks.
The earliest messages Rader sent to police in the mid-1970s reveal disappointment at not getting sufficient press coverage. He even coined the nickname BTK to speed things along. Clearly this was someone with a degree of status anxiety about his role in serial-killing history. One imagines him turning the pages of Wilson’s pulp trilogy Written in Blood (1989) or the two volumes of The Killers Among Us (1995) – all published by Bantam in the U.S. – with some disappointment at not having made the finals.
Well, it’s not too late. We know from his memoirs that Colin Wilson has engaged in extensive correspondence with serial-killing Outsiders who have ended up behind bars. It seems like a matter of time before he turns out a book on BTK.
Unless, of course, I beat him to it. The key is to overcome the gray fog of everyday, dull consciousness by activating my dormant reserves of Faculty X. Fortunately it has never been necessary for me to kill anyone to manage this. Two large cups of French Roast will usually do the trick.
A while ago at a conference I read a paper to a dog. The subject of the paper was Clarice Lispector's great story, "The Crime of the Mathematics Professor," which is about dogs; the professor abandons his own, and subsequently buries another. So it was fitting that a dog would be in the audience. There were seven humans, including the dog's owner, who was blind.
I don't know if the dog -- a beautiful, black, curly shepherd mix named Mark -- enjoyed the paper. Sprawled at his master's feet, he seemed to sleep the entire time. Yet he duly opened his eyes when I finished reading and the audience began clapping. I hoped he would give forth with a couple of big barks. Maybe it was his first conference and he was unsure if a "woof" would be appropriate.
Dogs, after all, aren't normally seen in conference rooms. They aren't expected to join professional organizations; they aren't invited to give papers. Just so, they aren't welcome in the other major academic space: the classroom. A dog might be the object of knowledge (it turns out that the representation of animals is something of a hot topic at present) but it does not constitute a subject. Seeing-eye dogs aside, no dog in the United States to my knowledge either regularly attends or teaches a class.
Too bad. Years ago a dog wandered into my second-floor classroom late one morning, occasioning great delight among the students. "I gave him an A last semester," I remarked. Perhaps at these words delight lessened a bit. There are limits to the possible relations between ourselves and even the most familiar of animals, which reminds me that one of my fellow panelists was also named "Mark" and seemed uneasy to be in such nominative proximity to the dog.
Reading a paper to Mark (both of them) prompted me to try to recall other conference appearances of an animal in my experience. I could remember none. I do recall now the first time I saw a woman nursing a baby. (Evidently a proud assertion of womanly power; nowadays, it's more common to see the baby but not the nursing.) But of course babies are humans. Even though it could be argued that neither belongs in a conference room, each fails to belong in quite different ways.
Back to the classroom: once a student told me that she had seen a classmate bring a snake to class. I never actually saw the snake. (Nor the student, for that matter.) So it has gone with nonhuman beings. With the exception of the one dog, whose name I never learned, my classroom for better or worse has been exclusively human. It seems idle to mention a few furtive cockroaches who scurried across the floor, a lizard or two immobile along various walls, or the odd disruptive bee who flew in through an open window.
No animals. What exactly could be at stake in this fact, other than yet another lamentable example of what animal rights people would term "speciesism"? Principally, two things. Each one illuminates the nature of the principal public spaces in which our professional lives are played out. Both abide according to a profound analogy to theatrical space. Nothing provokes this realization like the example of a dog in a classroom.
First, what is kept out of the classroom is just as decisive to what transpires there as what is admitted in. Indeed, the classroom can be conceived of as a site of constant struggle to get rid of things that don't belong -- cell phones, students missing from the roll, and so on. At the pedagogical, if not phenomenological, limit, animals certainly don't belong! And yet, when a dog strays in, what we discover is that it does belong, or rather, can be made to belong, affording amusement and even instruction to all, at least for a time.
I take my own instruction from a chapter, "The World on Stage," in Bert O. States's wonderful little book on theater, Great Reckonings in Little Rooms. The dog on the stage (which was the original title of this chapter) is "a nearly perfect symptom of the cutting edge of theater, the bite that it takes into actuality in order to sustain itself in the dynamic order of its own ever-dying signs and images."
But of course, unlike the stage, the classroom is, alas, not nearly so dynamic. Its space is far more conservative. Too much "actuality," in the person of a dog, only defeats its purpose, which is ultimately intellectual rather than aesthetic.
Conference space is more theatrical. While the burden of the classroom is to keep out the amplitude or variety of conferences, conference sessions themselves are pleased to admit more of actuality -- people arriving late and leaving early, babies, and even dogs. Indeed, it seems very right for conference programs to include poetry readings, theatrical performances, or special lectures by stars right alongside the usual sessions. Back on campus, readings or lectures are more aligned with the one or two plays offered by the drama department each semester.
There is a second thing about the spectacle of the dog in the classroom that follows from the first: the inescapable, mysterious presence of what has been excluded from the classroom. Each semester we do battle anew not only with cell phones or individuals not on the roll but a whole host of other things, ranging from students in back who won't stop talking to construction noise right outside the window.
Classes in which everybody belongs on the roll and always raises her hand before speaking are a great blessing. Yet day by day or week by week these classes can also be a great bore. Citing Artaud, States writes of "a theater that brings us into phenomenal contact with what exists, or rather what it is possible to do, theatrically, with what exists." Pedagogically, by contrast, our classes do too little; there are days when everybody seems to suffer from what might be called phenomenological poverty. The rattling sound of a lawnmower outside seems momentarily welcome. A buzzing insect inside can seem numinous.
But the wagging tail of a dog? Conceived of by States as "at the lowest echelon of living things that come on stage tethered to the real world," why not a dog? What the stage does with this dog is immediately transform it into a "sign" (albeit a special, fascinating one, given the tether). What the classroom does with a dog, on the other hand, is to seek to banish it.
This is to be lamented. To lose "the real world" is too great a loss, which is why it is such a recurrent moment in education to bemoan the loss, however conceived. A dog constitutes one especially provocative example. I would not have the classroom become a kennel. But I would have the classroom be more like a stage, where dogs don't appear always under the sole sign of "disruption."
Our classrooms should be more like stages, because they already are theatrical in nature. They don't all aspire to be "cutting edge" (any more than all stage productions do). Yet signs of the correspondence between classroom and stage are everywhere, ranging from the "wardrobe" of students and teachers to the material presence of "props" and the role-governed nature of "dialogue."
Does the best teaching embrace this correspondence? Probably -- in all sorts of ways. I fret about my own teaching when I fear my own inner classroom has become too narrow, bent on excluding everything rather than doing something with anything, even students who leave "to go to the bathroom."
There are days when I wish a dog would come prancing or slobbering in. (I never imagine him growling; disruption is one thing, while danger is quite another.) There are times when the day's syllabus-authorized discourse needs to become more wayward. Especially when we have all been variously engaged in good behavior, what we can always use -- to continue with a distinction States makes -- is the shock of what the stage animal always gives: behavior only.
Perhaps back at the conference this is why Mark (the dog) moved me so. Not only was he incapable of obligatory "appreciation" of my paper. (It's almost impossible for a human being to merely "behave" at a conference or in a classroom.) He suggested possibilities I had never considered, such as reading the paper to an entire audience of dogs, or, perhaps better, listening to a paper on the same story read by a dog.
Silly? Of course. Yet such scenes might be hilarious in a play. The reason is not fundamentally different from why the sudden appearance of a dog in the classroom initially elicits laughter. A classroom is not the same thing as a conference room, but the difference dissolves when each is reborn, vividly, as a stage. In order to hasten this rebirth, I would have our classrooms go to the dogs.
Terry Caesar's last column was in praise of librarians.
The story goes that parents get their due once their children have children. The grown children find out about all the hard work and sacrifice it takes to be a parent, and then finally appreciate what their own parents went through.
Well, there’s promising news for teachers, too, from their educated “offspring” who go on to become teachers themselves. I got a big dose of this deferred payback recently when I became an assistant professor at a private urban university.
The transition from Ph.D. student to university professor was abrupt for me because I continued in my profession as a journalist during graduate school rather than working as a teaching assistant. My large state university had no instruction in teaching, so I figured competence in the classroom just sort of came naturally to those of us who had studied and thought deeply within the discipline. I had assumptions about teaching based on my own experiences on the receiving end, which I realize now is kind of like judging what kind of writer you might be by the books you’ve read. I had the vague idea that I’d pass on my own enlightenment as a graduate student to a fairly receptive audience. I’ll pause here to give the experienced educators who are reading this a chance to stop laughing.
After just eight weeks with a full course load, it’s an understatement to say my thoughts about teaching have become more, well, focused. I find myself harking back to my own experience on the receiving end again, but this time as an undergraduate like the students I now teach.
I think what has taken me back are the blank stares, heads on desks, and absentees in my classroom. As I struggle with teaching in ways I wasn’t expecting, I guess I’m a bit defensive and feeling sorry for myself. I sometimes think I don’t deserve what I’m getting, just as my undergraduate professors didn’t deserve what I gave.
But it’s probably a good thing I’m thinking about my own bad behavior as an 18-to-20-year-old. The optimist battling these pessimistic feelings believes such memories might be a first step toward focusing on the students in this process, instead of myself. Like a transgressor at an AA meeting, I want to stand up and cleanse my soul, hopefully to get rid of the guilt I feel when I think of what I did to others, because now it’s happening to me. “Okay, okay, I get it,” I want to say to my professors of old. “I was a twit.”
I usually come to this confessional frame of mind at the end of each day as I trudge to my car toting my 60-pound bag of books, folders, oversize place cards (put on desks to learn students’ names as quickly as possible; novice teaching tip No. 8,709), DVDs, videos, laptop, cables, grant forms, research proposals, insulated coffee cup, and stacks of papers to grade. During this evening ritual, I conjure the image of myself sitting at a large, uncluttered writing desk, after plenty of rest and with unbounded time, to pick up a sharpened quill, dip it in ink, and pen a formal apology to those educators who had me in their classes at what was then Northeast Louisiana University in Monroe, La., from September 1971 to May 1975.
The letter reads something like this:
Dear Dr. Carroll et al.:
You probably don’t remember me, and it may really be too late at this point, but I wanted to write to tell you how sorry I am about the way I conducted myself in your classroom (and, in general, during my early college years, but I’ll limit this to academics).
If you’re still teaching, my hat goes off to you! If not, I hope you have been able to look back with satisfaction on your teaching career, despite my presence.
I know I’m not one of those students you might think about, or even still talk about, when searching your memory for a rewarding experience to help acknowledge all the hours and effort you put into your lectures and presentations. You probably gladly forgot my name the instant the semester was over.
I wanted to let you know, though, that it sometimes takes a long time for a student to appreciate what lessons he or she learns (as we know these days, we all have different learning styles!). As an educator myself now, I’m certainly learning mine.
While you might not recall my presence specifically, you do know me. I’m one of those who sat in the back of the room, avoided your eyes because I hadn’t read the material, and said little. If you made an effort to call on me, I deferred with a mumbled, “I don’t know.”
If you were the professor for the history class I had after physical education, I was the one who often fell asleep with my head on my desk, once even drooling on my notebook.
If you were my speech teacher, I was the one who came unprepared for my presentation and rambled beyond my 10-minute time limit by 15 minutes, never getting to the point.
If you were my English teacher, I never cracked the spine on Beowulf and I complained about my grade despite missing class regularly.
If you were my journalism adviser, I avoided doing my work for the news service during my designated hours, despite getting paid.
If you were my zoology professor, I got an A on that test because you happened to give the same one you had given two years before, and a friend of mine had a copy.
If it makes you feel any better, I’ve now had the experience of looking out on a sea of blank faces and wondering if I am the only one in the room who has read the material.
It might help to know that half my students in one course skipped class the meeting following the midterm, and half of another class acted like insolent 12-year-olds when they got Bs and A minuses on their tests.
So far, in my first semester, I’ve had four grandmothers die, six hospitalizations, countless numbers of colds and flu (flu season must have started in September this year), two cases of mono, three cases of sick friends who couldn’t get themselves to a doctor, and one honest “I overslept” for a 12:45 p.m. class.
Perhaps you can take comfort in the numerous times in my classes when students’ heads have dropped and gone back up, dropped and gone back up, driven seemingly by the same laws of thermodynamics as those bobbing glass birds found in novelty shops. Maybe the shattering of my naïve illusions about imparting my higher degree-conferred wisdom in a way that would captivate youthful minds will make you gloat. I wouldn’t blame you.
It’s more likely, if I know you, that you’ll sympathize, though, and do what I’ve done – realize it’s probably not about my knowledge, or their lack of sleep or interest on any particular day. It’s at least partly about setting my own expectations, aspirations, and frustrations aside and trying to notice when they do things right, or perhaps more importantly, when they do the right thing. And, it’s about asking their opinion.
There is perhaps nothing so humbling as standing in front of a crowded room of 18-to-20-year-olds and asking them to tell you, anonymously and with forethought, what’s wrong with you. That’s exactly what colleges and universities do each semester with teacher evaluations, and the outcomes count for a lot.
I decided to take a colleague’s advice and try to get feedback by using my own private, mid-semester survey, and, to my surprise, my students offered constructive suggestions and even sympathy for the difficulty of making journalism history interesting. They gave me attaboys for effort, even puzzling over why a certain lesson didn’t work despite my obvious enthusiasm for the topic.
One of my students came to see me, in part, to buck me up about a class. She blamed lack of participation on uncaring classmates, her classmates, whom she suggested didn’t care much about a general education class they were required to take. She told me it was obvious I was trying hard and she did her best to make me feel better. Maybe I’m naïve, but she seemed sincere, and no grade was hanging in the balance. I later found out one of her own absences from class was due to the fact that her mother was dying of cancer. Her actions stand in stark contrast to mine as a post-teen.
I took heart in my student’s rationalization about the class, but somehow, as the semester has progressed and my comfort level and interactions with students have increased, the once deadly atmosphere has livened up, and students are participating more. There are fewer naps and downcast eyes. Could it be the students weren’t the problem after all? It’s a colossal understatement to say that’s a possibility.
So, I guess I have more to atone for than I thought – my past sins as a student and my current ones as a teacher facing challenges in the classroom from students who are much like my younger self, and just as likely (hopefully) to actually be affected by the way I treat them as I was by how you treated me – even if it took decades to realize it.
I hope you’ll accept my apology for my behavior. I’d also like to thank you for yours. Your job is harder, and more fulfilling, than I ever imagined.
--Danna L. Walker
Danna L. Walker is an assistant professor of communications at American University.