Arts

Brandeis Will Keep Its Art

University settles lawsuit and state inquiry by pledging to enhance a noted collection, not to sell it.

I'm Not Really a Professor; I Just Play One on TV

When I was just a few years out of graduate school I wrote a “treatment” for a television series to be called “The Young Professors.” The show tracked the adventures of three new assistant professors as they negotiated the ins and outs of life at Soybean State College, a medium-sized, teaching-intensive public institution somewhere in the Midwest.

This was in the mid-1970s, when series about young doctors and lawyers were big. So, knowing that nothing succeeds in commercial TV like a knock-off, and hoping for a source of income other than teaching summer school, I took a crack at putting college on the small screen. While I billed the show as “The Halls of Ivy” meets “The Mod Squad,” my project had no legs. My plots were not exactly ripped from the headlines, and although any resemblance between the characters that I depicted and actual persons living or dead was purely coincidental, or so I claimed, those characters weren’t going to keep viewers from changing the channel. A small flurry of interest from a local public broadcaster led nowhere, and now the yellowing, typed pages of “The Young Professors” sit in a folder with the rest of my juvenilia.

My efforts to “write what you know” notwithstanding, the classroom remains an occasional backdrop for television, and while the successful shows from the medium’s golden age, like “Our Miss Brooks,” “Mr. Peepers,” and “Room 222,” portray a strikingly unrealistic version of the high school experience, occasionally we’re treated to a media distortion of college life as well.

To be fair, we may pretend that the media tells it like it is, but we know very well that even reality television is far from real. Cop shows are notorious for misrepresenting life on the streets, and lawyer series fail to capture the highly nuanced world of torts and contracts. Life in the E.R. is not all high concept relieved by short commercial breaks. And Ralph Kramden was no ordinary bus driver. But college professors don't even get the kind of attention we lavish on Cosmo Kramer or Archie Bunker. TV hasn't brought us the Dead Deconstructionists' Society, or anything that looks at college from a faculty perspective. Woody Allen occasionally reduces Brandeis to a cultural stereotype in his movies, and there was a popular TV series about the life of a pre-med at the fictional University of New York, but don’t hold your breath for a dramatization of Harvard MBAs-in-training, not to mention the Sarah Lawrence experience.

When it does notice higher education, television can treat it as comic. For example, there’s Ross, the museum paleontologist who once taught an evening class on “Friends,” and who does wind up on the NYU faculty. But more often than not, TV prefers a lurid look at campus life. Hudson University, a recurring venue on “Law and Order” and its spin-offs, is more a source of L&O’s killers and corpses than it is a provider of expert witnesses to testify in court.

The hijinks at Hudson are a far cry from swallowing goldfish or stuffing frat boys into a Volkswagen. At Hudson’s labs, researchers are killed by animal rights activists when they’re not too busy subjecting students to trials of dangerous experimental drugs without informed consent. Hudson undergrads regularly lose roommates to murder, and grad students occasionally kill their advisors, or are killed by them, often after having sex without informed consent.

In fact, many of L&O’s higher ed plots revolve around “Sex and the City University.” In one episode a grisly murder leads the police to a Hudson anthropologist who’s desperately trying to hide from his wife his unhealthy appetite for young boys. In another, the president of Hudson is bludgeoned to death by a serial murderer from Australia masquerading as an English don. Instead of acknowledging her misdeeds and copping to “man 2,” the fake Englit specialist must face nonrenewal of her contract and the disappointment of her lesbian lover, who also happens to be her dean.

In contrast to the gritty reality of the L&O classroom cum crime scene, the 1950s half-hour sitcom “The Halls of Ivy” presents a bucolic Ivy College that is nothing like Hudson University. “The Halls of Ivy” starred Ronald Colman and his wife Benita Hume as William Todhunter Hall, the genial, urbane president of Ivy College, in the town of Ivy, somewhere in the Midwest, and his wife Victoria Cromwell Hall, the college president’s equally genial and urbane wife. After a successful radio run from 1949 to 1952, “The Halls of Ivy,” following the earlier lead of “Our Miss Brooks,” migrated to television in 1954.

“The Halls of Ivy” had great promise and strong backing (it was one of the most expensive TV series of its day). But while Eve Arden’s portrayal of Constance Brooks, everyone’s favorite high school English teacher, captivated viewers for four years, America wasn’t ready for a show about college foibles, and “The Halls of Ivy” lasted only one season. Although the show was created by one of the lead writers for the populist radio series “Fibber McGee and Molly,” its plots were too highbrow for the television audience. Indeed, many of the 38 episodes that CBS aired (seasons were longer then) could have been ripped from the headlines, provided they were the headlines of Inside Higher Ed. One TV history says the show flopped because it was too literate and lacked action.

On the other hand, the fact that "The Halls of Ivy" drew a national audience at all was itself a cultural phenomenon. America was undergoing one of its most intellectually deadly moments at the time, with universities the target of rabid red-baiters. On top of that, “The Halls of Ivy” dealt with issues that were surely sensitive in 1954 and are still pressing and controversial in the academy today: racism (in one episode, a Chinese student is ostracized by classmates and runs away); athletics vs. education (in another, the cross-country star quits the team because track is taking too much time away from his studies; in a third, a top pre-med student wants to give up his dreams of the O.R. to become a professional boxer). The show even dealt indirectly with gender stereotyping: while Vicky often plays the role of ditzy sidekick to her husband’s competent-administrator pose, a minute later she’ll put on her "Murphy Brown" persona and trade literate barbs with Toddy like any self-assured and hip chronicler of human foibles.

To be fair, whole episodes of “The Halls of Ivy” were devoted to issues over which the American public both then and now might be expected to yawn: the eroding faculty/student ratio; professors who aren’t publishing; a candidate for a named chair who might be a fraud; a department in danger of being closed because of low enrollments. There was even a half hour devoted not to drug trafficking, a problem that’s endemic at L&O’s Hudson University, but to traffic congestion on Ivy’s no-longer-sleepy campus, and Dr. Hall’s attempts to dodge a ticket he got from the college police. I doubt that even the readers of Inside Higher Ed would have the patience to sit through those segments today, compelling as they seemed to the show’s producers at the time.

But “The Halls of Ivy” did deal unashamedly with the facts of college life. While Ivy’s president is in no danger of being bludgeoned to death, in the pilot episode of “The Halls of Ivy,” President Hall nervously waits while the board of Ivy College votes on renewing his contract. Hall does get the nod, but a short three episodes later we see that the president is still insecure: Toddy and Vicky hurriedly throw together a meal without letting on that their guest -- the fussbudget chairman of the governing board -- has come to dinner one night too early. In other episodes, Hall sweet-talks an eccentric donor who demands that the college display one of her sculptures in exchange for a new gymnasium, and he must find a way to avert a mandatory faculty retirement that will be disastrous for the college. In two episodes Hall deals with the problems of what we now call returning or nontraditional students but what, in the blunt language of the ’50s, were simply old folks going back to school. Later in the season he intervenes to quell rumors that the new Latin professor is a sexual predator. And in another segment, Hall deftly confronts the problem of an honor student who is about to be expelled because she never finished high school. Toward the end of the show’s run, Hall puts on kid gloves to handle a gangster who has come to campus to find out why his nephew was kicked off the football team.

It wasn’t the limited-interest plots or the controversial issues that kept the “Halls of Ivy” audience coming back week after week. It was the finely tuned scripts, the ensemble acting, and the show’s theme song, a Whiffenpoof-style chorale that managed to enjoy some commercial success independent of the series. But while Dr. Hall had no trouble convincing the board to renew his contract, it soon became clear that Ronald Colman had done far, far better things than portray a college president, and the audience eventually dwindled to the point where it was too small to warrant CBS picking up the show’s option for a second season. As the redeemed Latin prof might have put it, “De gustibus non disputandum est.”

I was 10 when “The Halls of Ivy” aired, and though I knew nothing of academia I looked forward to the show each week. It still seems to me that academic life should generate at least as much public interest as infomercials for food processors or exercise equipment, but perhaps it was best that “The Halls of Ivy” bowed out gracefully. And while it was nice for me to dream that “The Young Professors” would one day generate as many spin-offs as “Law and Order,” which seems to be on one cable channel or another every hour of the day, I know that it’s best that my show never got off the ground. Even public-access cable channels aren’t ready for a series focusing on tenure, struggles over who gets the nice office, or endless committee meetings. “The Young Professors” would have been, in effect, a show about nothing, and as such it was no doubt far ahead of its time.

While I don’t expect television to portray college faculty with docudrama accuracy, I still believe there’s a role for higher education on TV beyond “Law and Order,” the “College Bowl”-style quiz, the “Book World” interview, or the CNN talking head. For some reason, the movies are more likely to get it right, with films that show professors as just like everybody else, only more so, like "The Blue Angel," "Good Will Hunting" and "A Beautiful Mind." Most academics are not pretentious boors or stuffed shirts like the one in “Annie Hall” who expounds vacuously on Marshall McLuhan’s theories while standing in a movie line, prompting an exasperated Woody Allen to bring out the real Marshall McLuhan to chide him. McLuhan was also an academic, by the way, though clearly not much of an actor. And few of my colleagues have the get-up-and-go of Indiana Jones, a movie professor who can turn into a villain-bashing superhero just by taking off his glasses and putting on a pith helmet.

Instead of reducing us all to cultural stereotypes, it would be nice to see shows in which professors contribute to the solution of crimes rather than to their cause, have sex lives that are dramatically compelling without being criminally dysfunctional, or participate in witty sitcoms like “The Halls of Ivy,” or what I hoped “The Young Professors” might become. While there’s no “Law and Order” network, at least not yet, there are entire networks devoted to animal antics, do-it-yourself projects, and city council meetings. Surely a show about professors could find a niche on one of the 500 cable channels, even while being both too literate and lacking action.

Dennis Baron

Dennis Baron is a professor of English and linguistics at the University of Illinois at Urbana-Champaign.

The Kircher Code

The table sits at the front of the bookshop, near the door. That way it will get maximum exposure as people come and go. "If you enjoyed The Da Vinci Code," the sign over it says, "you might also like..." The store is part of a national chain, meaning there are hundreds of these tables around the country. Thousands, even.

And yet the display, however eye-catching, is by no means a triumph of mass-marketing genius. The bookseller is denying itself a chance to appeal to an enormous pool of consumer dollars. I'm referring to all the people who haven’t read Dan Brown’s globe-bestriding best-seller -- and have no intention of seeing the new movie -- yet are already sick to death of the whole phenomenon.

"If you never want to hear about The Da Vinci Code again," the sign could say, "you might like...."

The book’s historical thesis (if that is the word for it) has become the cultural equivalent of e-mail spam. You just can’t keep it out. The premise sounds more preposterous than thrilling: Leonardo da Vinci was the head of a secret society (with connections to the Knights Templar) that guarded the hidden knowledge that Mary Magdalene fled Jerusalem, carrying Jesus’s child, and settled in France....

All of this is packaged as a contribution to the revival of feminine spirituality. Which is, in itself, enough to make the jaw drop, at least for anyone with a clue about the actual roots of this little bit of esoteric hokum.

Fantasies about the divine bloodlines of certain aristocratic families are a staple of the extreme right wing in Europe. (The adherents usually also possess "secret knowledge" about Jewish bankers.) And anyone contending that the Knights Templar were a major factor behind the scenes of world history will turn out to be a simpleton, a lunatic, or some blend of the two -- unless, of course, it’s Umberto Eco goofing on the whole thing, as he did in Foucault’s Pendulum.

It's not that Dan Brown is writing crypto-fascist novels. He just has really bad taste in crackpot theories. (Unlike Eco, who has good taste in crackpot theories.)

And Leonardo doesn’t need the publicity -- whereas my man Athanasius Kircher, the brilliant and altogether improbable Jesuit polymath, does.

Everybody has heard of the Italian painter and inventor. As universal geniuses go, he is definitely on the A list. Yet we Kircher enthusiasts feel duty-bound to point out that Leonardo started a lot more projects than he ever finished -- and that some of his bright ideas wouldn’t have worked.

Sure, Leonardo studied birds in order to design a flying machine. But if you built it and jumped off the side of a mountain, they’d be scraping you off the bottom of the valley. Of course very few people could have painted "Mona Lisa." But hell, anybody can come up with a device permitting you to plunge to your death while waving your arms.

Why should he get all the press, while Athanasius Kircher remains in relative obscurity? Kircher has just as much claim to the title of universal genius. Born in Germany in 1602, he was the son of a gentleman-scholar with an impressive library (most of it destroyed during the Thirty Years’ War). By the time Kircher entered the Jesuit order at the age of 16, he had already become as broadly informed as someone twice his age.

He joined the faculty of the Collegio Romano in 1634; his title was Professor of Mathematics. But by no means is that a good indicator of his range of scholarly accomplishments. He studied everything. Thanks to his access to the network of Jesuit scholars, Kircher kept in touch with the latest discoveries taking place in the most far-flung parts of the world. And a constant stream of learned visitors to Rome came to see his museum at the Collegio Romano, where Kircher exhibited curious items such as fossils and stuffed wildlife alongside his own inventions.

Leonardo kept most of his more interesting thoughts hidden in notebooks. By contrast, Kircher was all about voluminous publication. His work appeared in dozens of lavishly illustrated folios, the publication of which was often funded by wealthy and powerful figures. The word "generalist" is much too feeble for someone like Kircher. He prepared dictionaries, studied the effects of earthquakes, theorized about musical acoustics, and engineered various robot-like devices that startled tourists with their lifelike motions.

He was also enthusiastic about the microscope. In a book published in 1646, Kircher mentioned having discovered “wonders....in the verminous blood of those sick with fever, and numberless other facts not known or understood by a single physician.” He speculated that very small animals “with a vast number and variety of motions, colors, and almost invisible parts” might float up from “the putrid vapors” emitted by sick people or corpses.

There has long been a scholarly debate over whether or not Kircher deserves recognition as the inventor of the germ theory of disease. True, he seems not to have had a very clear notion of what was involved in experimentation (then a new idea). And he tossed off his idea about the very tiny animals almost in passing, rather than developing it in a rigorous manner. But then again, Kircher was a busy guy. He managed to stay on the good side of three popes, while some of his colleagues in the sciences had trouble keeping the good will of even one.

Among Kircher’s passions was the study of ancient Egypt. As a young man, he read an account of the hieroglyphics that presented the idea that they were decorative inscriptions -- the equivalent of stone wallpaper, perhaps. (After all, they looked like tiny pictures.) This struck him as unlikely. Kircher suspected the hieroglyphics were actually a language of some kind, and he set himself the task of figuring out how to read it.

And he made great progress in this project – albeit in the wrong direction. He decided that the symbols were somehow related to the writing system of the Chinese, which he did know how to read, more or less. (Drawing on correspondence from his missionary colleagues abroad, Kircher prepared the first book on Chinese vocabulary published in Europe.)

Only in the 19th century was Jean-François Champollion able to solve the mystery, thanks to the discovery of the Rosetta Stone. But the French scholar gave the old Jesuit his due for his pioneering (if misguided) work. In presenting his speculations, Kircher had also provided reliable transcriptions of the hieroglyphic texts. They were valuable even if his guesses about their meaning were off.

Always at the back of Kircher’s mind, I suspect, was the story from Genesis about the Tower of Babel. (It was the subject of one of his books.) As a good Jesuit, he was doubtless confident of belonging to the one true faith -- but at the same time, he noticed parallels between the Bible and religious stories from around the world. There were various trinities of deities, for example. As a gifted philologist, he noticed the similarities among different languages.

So it stood to reason that the seeming multiplicity of cultures was actually rather superficial. At most, it reflected the confusion of tongues following God’s expressed displeasure about that big architectural project. Deep down, even the pagan and barbarous peoples of the world had some rough approximation of the true faith.

That sounds ecumenical and cosmopolitan enough. It was also something like a blueprint for conquest: Missionaries would presumably use this basic similarity as a way to "correct" the beliefs of those they were proselytizing.

But I suspect there is another level of meaning to his musings. Kircher’s research pointed to the fundamental unity of the world. The various scholarly disciplines were, in effect, so many fragments of the Tower of Babel. He was trying to piece them together. (A risky venture, given the precedent.)

He was not content merely to speculate. Kircher tried to make a practical application of his theories by creating a "universal polygraphy" -- that is, a system of writing that would permit communication across linguistic barriers. It wasn’t an artificial language like Esperanto, exactly, but rather something like very low-tech translation software. It would allow you to break a sentence in one language down into units, which were to be represented by symbols. Then someone who knew a different language could decode the message.

Both parties needed access to the key -- basically, a set of tables giving the meaning of Kircher’s "polygraphic" symbols. And the technique would place a premium on simple, clear expression. In any case, it would certainly make international communication faster and easier.

Unless (that is) the key were kept secret. Here, Kircher seems to have had a brilliant afterthought. The same tool allowing for speedy, transparent exchange could (with some minor adjustments) also be used to conceal the meaning of a message from prying eyes. He took this insight one step further -- working out a technique for embedding a secret message in what might otherwise look like a banal letter. Only the recipient -- provided he knew how to crack the code -- would be able to extract its hidden meaning.
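
The scheme is simple enough to sketch in code. Purely as a toy illustration -- the tiny two-language table, the symbol labels, and the function names below are my inventions, not anything drawn from Kircher's actual tables -- the shared-key idea might look like this in Python:

    # A toy model of a Kircher-style "universal polygraphy": both parties
    # hold the same key, a table mapping shared symbols to words in each
    # party's own language. (All entries here are invented for illustration.)
    KEY = {
        "I.1": {"english": "peace", "latin": "pax"},
        "I.2": {"english": "be with", "latin": "sit cum"},
        "I.3": {"english": "you", "latin": "te"},
    }

    def encode(words, language):
        # Invert the table for the writer's language: word -> shared symbol.
        lookup = {entry[language]: symbol for symbol, entry in KEY.items()}
        return [lookup[word] for word in words]

    def decode(symbols, language):
        # The reader looks each symbol up in the column for his own language.
        return [KEY[symbol][language] for symbol in symbols]

    message = encode(["peace", "be with", "you"], "english")
    print(message)                   # ['I.1', 'I.2', 'I.3']
    print(decode(message, "latin"))  # ['pax', 'sit cum', 'te']

Publish the key and the table is an aid to translation; withhold it and the very same table is a cipher, since the symbols mean nothing to anyone without it.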

Even before his death in 1680, there were those who mocked Athanasius Kircher for his vanity, for his gullibility (he practiced alchemy), and for the tendency of his books to wander around their subjects in a rather garrulous and self-indulgent manner. Nor did the passing of time and fashion treat him well. By the 18th century, scholars knew that the path to exact knowledge involved specialization. The wild and woolly encyclopedism of Athanasius Kircher was definitely a thing of the past.

Some of the disdain may have been envy. Kircher was the embodiment of untamed curiosity, and it is pretty obvious that he was having a very good time. Even granting detractors all their points, it is hard not to be somewhat in awe of the man. Someone who could invent microbiology, multiculturalism, and encryption technology (and in the 17th century no less) at least deserves to be on a T-shirt.

But no! All anybody wants to talk about is da Vinci. (Or rather, a bogus story about him that is the hermeneutic equivalent of putting "The Last Supper" on black velvet.)

Well, if you can’t beat 'em.... Maybe it's time for a trashy historical thriller that will give Kircher his due. So here goes:

After reading this column, Tom Hanks rushes off to the Vatican archives and finds proof that Kircher used his "universal polygraphy" to embed secret messages in the artwork for his gorgeously illustrated books.

But that’s not all. By cracking the code, he finds a cure for avian flu. Kircher had recognized this as a long-term menace, based on a comment by a Jesuit missionary. (We learn all this in flashbacks. I see Philip Seymour Hoffman as Athanasius Kircher.)

Well, it's a start, anyway. And fair warning to Dan Brown. Help yourself to this plot and I will see you in court. It might be a terrible idea, but clearly that hasn't stopped you before.

Scott McLemee

Beach Blanket Bingo

Entertainment is in the eye of the beholder. Consider the case of what are usually called “beach novels” -- bulky sagas of lust, money, and adventure, page-turning epics of escapism that are (it’s said) addictive. I’ve never been able to work up the appetite to read one, even while bored on vacation in a seafront town. Clive James characterized the dialogue of one such novelist as resembling “an argument between two not-very-bright drunks.”

Which might be fun to witness in real life, actually, depending on the subject of the dispute. But reading the transcript seems like an invitation to a bad headache.

Diversion doesn’t have to be mind-numbing, let alone painful. With the end of the semester at hand, then, a few recommendations of recent books and DVDs that are smarter than your average bar fight -- and more entertaining.

The two dozen or so contributors to When I Was a Loser: True Stories of (Barely) Surviving High School managed to wear the entire range of unfortunate hair styles available throughout the 1970s and ‘80s. This collection -- edited by John McNally, who spent last semester as a visiting writer at Columbia College Chicago -- is one of the less solemn works of “creative nonfiction” (as the term of art now has it) currently available. Published by the Free Press, it is available in both paperback and e-book formats.

Most of the mortified authors are novelists and poets, ranging in age from their early 30s through their late 40s. It’s not that their memoirs are devoted to mullets or feathering, as such. But the stories they have to tell are all about the pressure to fit in, to be cool -- failure to do so bringing various penalties, as you may recall. There, on the cusp of adulthood, one has the first opportunity to create a new self. And hair is where it tends to happen first. Sex, religion, and first-job experiences also have their place.

With the benefit of hindsight, of course, the whole effort can seem embarrassing. The essays in When I Was a Loser are all about the different grades of self-consciousness and awkwardness. A few are lushly overwritten (adolescence is a purple thing) and one or two seem more than a little fictionalized. But most have the feel of authentically remembered humiliation, now rendered bearable by time and the cultivation of talent.

Several of the authors are well known, including Dean Bakopoulos, whose novel Please Don't Come Back from the Moon was named by The New York Times as one of the notable books of 2005, and the prominent literary blogger Maud Newton. In the spirit of full disclosure, it bears mentioning that Maud is a friend, and her essay "Confessions of a Cradle Robber" (revealing the dark shame of having once been a fourteen-year-old girl with a boyfriend who was twelve) was the first thing I read. My other favorite piece here was "How to Kill the Boy that Nobody Likes" by Will Clarke, a novelist who recalls being the most despised kid in junior high -- one nicknamed "The Will-tard" for his admittedly peculiar comportment. Clarke's rise to the status and celebrity of Student Council treasurer is a tribute to the power of a very silly 1970s paperback about the secret techniques of subliminal advertising. The author's name didn't ring a bell when I picked the book up, but it certainly will in the future.

Adolescence isn’t just for teenagers any more. "Twitch City," an absurdist sitcom that premiered on Canadian television in 1998, offers one of the funniest portraits around of someone determined to avoid the demands of adult life. It ran through 13 episodes before the show ended in 2000. The recent DVD release doesn’t provide many features. Still, it’s good to have the whole series available to those of us who weren’t part of its original cult following.

Its central character, Curtis (played by Don McKellar), is a man in his 20s who spends nearly every waking hour watching television. Among his few distractions from distraction is the effort to sublet more and more of his grungy apartment to anyone who can help him make the rent. His girlfriend Hope (played by the luminous Molly Parker) works at a variety of low-paying jobs. She can never quite figure out why she’s attracted to someone not just utterly lacking in ambition but unwilling even to leave the couch.

Part of the pleasure of "Twitch City" comes from seeing just how many stories can be generated around such a constrained, even claustrophobic premise. It is minimalist without being repetitive, and plausible, somehow, in spite of being preposterous.

When a chain of odd circumstances makes Curtis a media celebrity, he is visited by a woman (Jennifer Jason Leigh) claiming to be a graduate student in semiotics. She interviews him about his habits and outlook, and he delivers an analysis of the aesthetics of “Gilligan’s Island” that is a real tour de force -- a great moment of meta-TV. "Twitch City" is set in a neighborhood of Toronto, which occasionally made me wonder what Marshall McLuhan (who taught at U of T) would have made of it.

Another product of Canada worth a look is "Slings and Arrows," an ensemble comedy/drama that just finished its third and final season on the Sundance Channel. The first two (each consisting of six one-hour episodes) are now available on DVD.

Set at a repertory theater best known for its Shakespeare productions, "Slings and Arrows" is in some ways a show about trying to keep viable routines from turning into a rut of mediocrity. The theater’s regular audience is aging. It buys its season tickets out of force of habit, mostly. But box office sales aren’t what they could be, and it’s hard to find corporate sponsors who won’t try to meddle with how the place is run. And in any case, the troupe’s creative spark has diminished over time.

Revitalization isn’t impossible, but it takes some doing. Each season tracks the production of a different Shakespeare play (Hamlet, Macbeth, and King Lear) with a keen eye and ear for the way the artistic director and the actors work out the staging. At the same time, plenty of drama and farce takes place behind the scenes.

People who have worked in theater tell me that the situations and backstage dynamics in "Slings and Arrows” are absolutely typical of professional productions. As much as I enjoyed the first season, it was hard to believe that the second would be anything beyond a repetition -- reducing success to a formula. But those misgivings were completely off track. The third season carried things to a natural close.

Nowadays there are sessions at the Modern Language Association meeting devoted to the great German literary theorist Walter Benjamin, whose selected writings have appeared in English in four hefty volumes from Harvard University Press. But if the man himself showed up and wandered the corridors, I doubt he would survive the usual quick and dismissive nametag-check. After all, he wrote mostly for magazines and newspapers. He’d be wearing the wrong kind of nametag to be worth anybody’s time.

Whether or not Howard Hampton is actually the reincarnation of Walter Benjamin, they have the same extraterritorial position vis-a-vis academic criticism. (Hampton writes for The Village Voice, Film Comment, and The Boston Globe, among other endnote-free zones.) And now they share the same publisher, with the recent appearance of Born in Flames: Termite Dreams, Dialectical Fairy Tales, and Pop Apocalypses (Harvard University Press).

Drawn from 15 years’ worth of running commentary on film, music (mostly rock), and books, Hampton’s selected essays transcend “mere reviewing” (as it’s called) to become examples of a fully engaged critical intelligence responding to the mass-media surround. Some of the best pieces are compact but sweeping analyses of changes in sensibility, amounting to miniature works of cultural history.

One example is “Reification Blues: The Persistence of the Seventies,” which listens to how the pop soundtrack of that decade left its mark on later music despite (or maybe because of) artists’ best efforts to forget it. Another case is “Whatever You Desire: Movieland and Pornotopia” -- an analysis of how mainstream Hollywood and pornography have shaped one another over the years, whether through mimicry or rejection of one another’s examples.

The curse of a lot of pop-culture commentary is its tendency to move too quickly toward big sociocultural statements -- ignoring questions of form and texture, instead using the film, album, etc., as pretext for generalized pontifications. That’s not a problem with Born in Flames. It’s a book that helps you pay attention, even to the nuances of Elvis’s performance in "Viva Las Vegas." Perhaps especially to the nuances of Elvis’s performance in "Viva Las Vegas"....

"It's an alternate universe governed by sheer whim," writes Hampton about the King's cinematic oeuvre, "untouched by any sense of the outside world." Sounds like the perfect vacation spot.

Scott McLemee

C.L.R. James Meets Tony Soprano

Half a century before "The Sopranos" hit its stride, the Caribbean historian and theorist C.L.R. James recorded some penetrating thoughts on the gangster -- or, more precisely, the gangster film -- as symbol and proxy for the deepest tensions in American society. His insights are worth revisiting now, while saying farewell to one of the richest works of popular culture ever created.

First, a little context. In 1938, shortly before James arrived in the United States, he had published The Black Jacobins, still one of the great accounts of the Haitian slave revolt. He would later write Beyond a Boundary (1963), a sensitive cultural and social history of cricket – an appreciation of it as both a sport and a value system. But in 1950, when he produced a long manuscript titled “Notes on American Civilization,” James was an illegal alien from Trinidad. I have in hand documents from his interrogation by FBI agents in the late 1940s, during which he was questioned in detail about his left-wing political ideas and associations. (He had been an associate of Leon Trotsky and a leader in his international movement for many years.)

In personal manner, James was, like W.E.B. DuBois, one of the eminent black Victorians -- a gentleman and a scholar, but also someone listening to what his friend Ralph Ellison called “the lower frequencies” of American life. The document James wrote in 1950 was a rough draft for a book he never finished. Four years after his death, it was published as American Civilization (Blackwell, 1993). A sui generis work of cultural and political analysis, it is the product of years of immersion in American literature and history, as well as James’s ambivalent first-hand observation of the society around him. His studies were interrupted in 1953 when he was expelled by the government. James was later readmitted during the late 1960s and taught for many years at what is now the University of the District of Columbia.

American Civilization's discussion of gangster films is part of James's larger argument about media and the arts. James focuses on the role they play in a mass society that promises democracy and equality while systematically frustrating those who take those promises too seriously. Traveling in the American South in 1939 on his way back from a meeting with Trotsky in Mexico, James had made the mistake of sitting in the wrong part of the bus. Fortunately an African-American rider explained the rules to him before things got out of hand. But that experience -- and others like it, no doubt -- left him with a keen sense of the country's contradictions.

While James's analysis of American society is deeply shaped by readings of Hegel and Marx, it also owes a great deal to Frederick Jackson Turner’s theory of “the closing of the frontier.” The world onscreen, as James interpreted it, gave the moviegoer an alternative to the everyday experience of a life “ordered and restricted at every turn, where there is no certainty of employment, far less of being able to rise by energy and ability by going West as in the old days.”

Such frustrations intensified after 1929, according to James’s analysis. The first era of gangster films coincided with the beginning of the Great Depression. “The gangster did not fall from the sky,” wrote James. “He is the persistent symbol of the national past which now has no meaning – the past in which energy, determination, bravery were sure to get a man somewhere in the line of opportunity. Now the man on the assembly line, the farmer, know that they are there for life; and the gangster who displays all the old heroic qualities, in the only way he can display them, is the derisive symbol of the contrast between ideals and reality.”

The language and the assumptions here are obviously quite male-centered. But other passages in James’s work make clear that he understood the frustrations to cross gender lines -- especially given the increasing role of women in mass society as workers, consumers, and audience members.

“In such a society,” writes James, “the individual demands an aesthetic compensation in the contemplation of free individuals who go out into the world and settle their problems by free activity and individualistic methods. In these perpetual isolated wars free individuals are pitted against free individuals, live grandly and boldly. What they want, they go for. Gangsters get what they want, trying it for a while, then are killed.”

The narratives onscreen are a compromise between frustrated desire and social control. “In the end ‘crime does not pay,’” continues James, “but for an hour and a half highly skilled actors and a huge organization of production and distribution have given to many millions a sense of active living....”

Being a good Victorian at heart, James might have preferred that the audience seek “aesthetic compensation” in the more orderly pleasures of cricket, instead. But as a historian and a revolutionary, he accepted what he found. In offering “the freedom from restraint to allow pent-up feelings free play,” gangster movies “have released the bitterness, hate, fear, and sadism which simmer just below the surface.” His theoretical framework for this analysis was strictly classical, by the way. James was deeply influenced by Aristotle’s idea that tragedy allowed an audience to “purge” itself of violent emotions. One day, he thought, they would emerge in a new form -- a wave of upheavals that would shake the country to its foundations.

In six seasons over 10 years, “The Sopranos” has confirmed again and again C.L.R. James’s point that the gangster is an archetypal figure of American society. But the creators have gone far beyond his early insights. I say that with all due respect to James’s memory – and with the firm certainty that he would have been a devoted fan and capable interpreter.

For James, analyzing gangster films in 1950, there is an intimate connection between the individual viewer and the figure on the screen. At the same time, there is a vast distance between them. Movies offered the audience something it could not find outside the theater. The gangster is individualism personified. He has escaped all the rules and roles of normal life. His very existence – doomed as it is – embodies a triumph of personal will over social obligation.

By contrast, when we first meet Tony Soprano, a boss in the New Jersey mob, he is in some ways all too well integrated into the world around him. So much so, in fact, that juggling the demands of the different roles he plays is giving him panic attacks. In addition to being pater of his own brood, residing in a suburban McMansion, he is the dutiful (if put-upon) son in a dysfunctional and sociopathic family.

And then there are the pressures that attend being the competent manager of a successful business with diversified holdings. Even the form taken by his psychic misery seems perfectly ordinary: anxiety and depression, the tag-team heart-breakers of everyday neurosis.

James treats the cinematic gangsters of yesteryear as radical individualists – their crimes, however violent, being a kind of Romantic refusal of social authority. But the extraordinary power of “The Sopranos” has often come from its portrayal of an almost seamless continuum between normality and monstrosity. Perhaps the most emblematic moment in this regard came in the episode entitled “College,” early in the show’s first year. We watch Tony, the proud and loving father, take his firstborn, Meadow, off to spend a day at the campus of one of her prospective colleges. Along the way, he notices a mobster who had informed to the government and gone into the witness protection program. Tony tracks the man down and strangles him to death.

At the college he sees an inscription from Hawthorne that reads, “No man ... can wear one face to himself and another to the multitude, without finally getting bewildered as to which one may be true." Earlier, we have seen Tony answer Meadow’s question about whether he is a member of the Mafia by admitting that, well, he does make a little money from illegal gambling, but no, he isn't a gangster. So the quotation from Hawthorne points to one source of Tony’s constant anxiety. But it also underscores part of the audience’s experience – an ambivalence that only grows more intense as “The Sopranos” unfolds.

For we are no more clear than Tony is which of his faces is “true.” To put it another way, all of them are. He really is a loving father and a good breadwinner (and no worse a husband, for all the compulsive philandering, than many) as well as a violent sociopath. The different sides of his life, while seemingly distinct, keep bleeding into one another.

Analyzing the gangster as American archetype in 1950, C.L.R. James found a figure whose rise and fall onscreen provided the audience with catharsis. With “The Sopranos,” we’ve seen a far more complex pattern of development than anything found in Little Caesar or High Sierra (among other films James had in mind).

With the finale, there will doubtless be a reminder – as in the days of the Hays Code – that “crime does not pay.” But an ironized reminder. After all, we’ve seen that it can pay pretty well. (As Balzac put it, “Behind every great fortune, a great crime.”) Closure won’t mean catharsis. Whatever happens to Tony or his family, the audience will be left with his ambivalence and anxiety, which, over time, we have come to make our own.

Scott McLemee


Good Grief

A few weeks ago, a new edition of the selected works of Edmund Wilson appeared. Another monumental book this season is David Michaelis’s Schulz and Peanuts: A Biography (HarperCollins). The critic and the cartoonist never crossed paths, so far as anyone knows. But there is some overlap between these publications, it seems to me. The biography of Charles M. Schulz, who died in 2000, calls to mind Wilson’s The Wound and the Bow, a collection of essays published in 1941 and reprinted in the second of the two Library of America volumes.

The connection is indirect but insistent. In the essay that lent The Wound and the Bow its title, Wilson revisits one of the lesser-known plays by Sophocles -- a telling of the story of Philoctetes, who also appears in the Iliad. Philoctetes is a skilled and powerful archer, but he is also a man in exile through no fault of his own. A snakebite has left him with a wound that not only festers but reeks. Unable to bear the stench or his groans, the Greeks abandon him on a desert island. And there he stays until Odysseus is forced to bring him back into service as the only man able to bend the bow of Heracles.

Wilson (who had started using psychoanalysis as a means of interpreting literary works well before this was required by law) saw in the figure of Philoctetes something like an allegorical emblem for the artist’s inner life. Neurosis is the agonizing wound that leaves the sufferer isolated and bitter, while genius is the ability to bend the bow, to do what others cannot. Creativity and psychic pain, “like strength and mutilation,” as Wilson put it, “may be inextricably bound up together."

Not such a novel idea, after all this time. And one prone to abuse -- reducing artistic creativity to symptomatology. (Or, worse, elevating symptomatology into art: a phenomenon some of us first encounter while dating.)

In Wilson’s hands, though, it was a way through the labyrinth of a writer’s work, of finding hidden passages within it. The two longest studies in The Wound and the Bow were interpretations of Charles Dickens and Rudyard Kipling: two authors whose critical reputations had been nearly done in by their commercial success. Wilson’s criticism, while biographical in method, did not take the debunking route. If he documented the wound, he also showed the strength with which each figure could draw the bow.

Now, I’m not really sure that the archer serves all that well as a model of the artist. (The myths of Daedalus or Orpheus work better, for a variety of reasons, and cover much of the same analogical ground.) On the other hand, Philoctetes did tend to complain a lot -- as did Charles Schulz, it seems. The cartoonist emerges from his biographer’s pages as a man of numerous griefs and grievances. His life was shaped by an upbringing that was economically secure but emotionally complex. His childhood was spent among relatives who expressed affection through joking insults (to give things the most positive construction possible).

Michaelis, who has also written about the life of the painter N.C. Wyeth, offers numerous well-framed appreciations of Schulz’s artistry. The book is Wilsonian, in that sense. But any revaluation of “Peanuts” as cultural artifact is bound to be less a topic for conversation than the unveiling of details about his melancholia and his resentments.

An episode of the documentary series "American Masters" on PBS airing later this month will be tied to the book, which should reach stores any day now. Soon it will be common knowledge that everyone who met the cartoonist’s first wife had a pretty good idea where Lucy originated. Numerous “Peanuts” strips are embedded throughout the book -- each of them echoing events or situations in Schulz’s life or some aspect of his personality and relationships. (Members of his family are complaining about the biography, a development to be expected.)

The cartoons themselves -- however telling as illustrations of things the biographer has discovered about Schulz -- are rich works in their own right. They fall somewhere between art and literature; but those categories really don't matter very much, because they create their own little world. The biography derives its meaning from the cartoons and not vice versa.

So in an effort to restore some balance, I’d like to recommend some supplementary reading about “Peanuts” -- an essay that says very little about Schulz himself. It focuses instead on what he created. How an artist becomes capable of bending the bow is difficult to understand. Biography is one approach, but it does not exhaust the topic. (In a way it only begins to pose the riddle.)

The piece in question is “The World of Charlie Brown” by Umberto Eco. It appeared in his collection Apocalittici e integrati, a volume that became rather notorious when it first appeared in 1964. Parts of the collection were translated, along with some later pieces, as Apocalypse Postponed (Indiana University Press, 1994).

Like other essays in the book, the analysis of “Peanuts” is part of Eco’s challenge to familiar arguments about “mass culture,” whether framed in Marxist or conservative terms. Either way, the theorists who wrote about the topic tended to be denunciatory. Eco, who was 32 when Apocalittici e integrati appeared, had published a couple of monographs on medieval intellectual history and was also working on semiotics and the philosophy of language. Aside from teaching, he paid the bills by working for a television network and a trade publisher. All the quasi-sociological hand-wringing about the media struck Eco as rather obtuse, and he did not hesitate to say so.

From the vantage point of someone who had written about the aesthetic theory of Thomas Aquinas, it was not self-evident that “mass culture” was the fresh horror that worried his contemporaries. He saw it beginning with the cathedrals -- or at least no later than the printing press. The fact that Eco wrote about Superman and television worried some of the reviewers.

One of them complained that treating “Plato and Elvis Presley” as both “equally worthy of consideration” was bound to have grave consequences: “In a few years the majority of Italian intellectuals will be producing films, songs, and comic strips....while in the university chairs, young dons will be analyzing the phenomena of mass culture.” It would be the closing of the Italian mind, I guess.

“The World of Charlie Brown” is evidence that Eco meant to do more than stir up argument. It originally appeared as the preface to the first collection of Schulz’s strips to appear in Italy. It is the work of a critic determined to win “Peanuts” a hearing as a serious work of art.

Eco seems unable to resist a certain amount of elitist chain-yanking. He says that the translators lavished on their work “the meticulous passion that Max Brod devoted to the manuscripts of Kafka...and Father Van Breda to the shorthand notes of Edmund Husserl.” The round-headed Charlie Brown embodies “a moment of the Universal Consciousness,” he writes, “the suburban Philoctetes of the paperbacks.” (I confess that I did not remember that part of the essay until rereading it just now.)

But the tongue soon comes out of his cheek. Eco reveals himself as a devoted student of the history of the American comic strip. He triangulates “Peanuts” with respect to Jules Feiffer’s satirical cartoons and “the lyric vein of Krazy Kat” -- comparisons that are so brilliantly apt that they immediately seem obvious, which they aren’t.

And Eco warns the Italian reader that appreciating the strip involves learning Schulz’s rhythm of theme and variation. “You must thoroughly understand the characters and the situations,” he writes, “for the grace, tenderness, and laughter are born only from the infinitely shifting repetition of the patterns....”

At this point, it is tempting to quote at length from Eco’s quick analysis of the essence of Schulz's characters. Each one embodies or resists some part of the human condition -- even, and perhaps especially, Snoopy.

In the world of “Peanuts,” writes Eco, “we find everything: Freud, mass-cult, digest culture, frustrated struggle for success, craving for affection, loneliness, passive acquiescence, and neurotic protest. But all these elements do not blossom directly, as we know them, from the mouths of a group of children: they are conceived and spoken after passing through the filter of innocence.” The strip is “a little human comedy for the innocent reader and for the sophisticated.” A child can enjoy them, and so can the reader who is tempted to draw analogies to Samuel Beckett.

The sophisticated part of Eco’s sensibility can recognize in Schulz’s art a depth that is full of shadows: “These children affect us because in a certain sense they are monsters: they are the monstrous infantile reductions of all the neuroses of a modern citizen of industrial civilization.” But the depths aren’t an abyss. The little monsters, while sometimes cruel, never become unspeakable. They “are capable suddenly of an innocence and a sincerity which calls everything into question....”

Charles Schulz was a neurotic, no doubt; but most neurotics aren’t Charles Schulz. He was something else. And it may be that we need an Italian semiotician to remind us just what: "If poetry means the capacity of carrying tenderness, pity, [and] wickedness to moments of extreme transparence, as if things passed through a light and there were no telling any more what substance they are made of,” as Eco wrote, “then Schulz is a poet.”

Scott McLemee


Talkin' 'bout Their Generation

Everyone knows that rock and roll is all about kicking out the jams: ditching uptight squares, taking long rides in the dark of night, and being a street fightin' man -- or woman. As The Who put it, it's about hoping to die before you get old.

But what does rock mean to a new generation of uptight (if updated and wired) squares, afraid of the open road, who have little fight in them? What does rock mean for a generation that has never been allowed to be young -- let alone hope to die before they get old?

For my students, the answer is simple. Rock and roll is about family happiness.

I discovered this disturbing undercurrent of rock-as-the-soundtrack-of-familial-bliss when I began teaching a college writing class this semester. The undergraduates' first assignment was to assess the personal meaning of any song of any genre. I was willing to wade bravely through the melancholy emo, the raging gangsta rap, the whiny indie rock, or even contemporary pop schlock in order to achieve my real agenda: a glimpse into the soul of my students, the inner world of their desire locked in their shiny iPods.

What I read in those papers was as unsettling and unfamiliar as the day Elvis shook it on the Ed Sullivan show -- but hardly as exciting. For my students, rock and roll is not the aural fuel of rebellion but the soundtrack of familial love and safety. The essays were not about chillin' with the crew but about hangin' with mom and dad; and they were not about cruising into the mystery of the night, but about heading off to Cape Cod in the minivan. Rock is no longer about alienation but connection; not about escape but home; not about rebellion but reconciliation. Even bands like Led Zeppelin and The Stones emerged from my students' papers in an un-purple haze of family nostalgia.

Turns out that for my elite students -- en route to becoming sharp suits and clever corporate cogs -- rock and roll is simply one more element in the finishing process of becoming just like the folks. Roll over Bob Dylan and tell Norman Rockwell the news. Jack Black's character in School of Rock had to teach his anxious and repressed grade schoolers what he knew viscerally: that the purpose of rock and roll is "Sticking it to The Man." Given that most of my students want to become "the man" (in whatever gender the icon of power might come in today), it's certainly not about sticking it to 'em.

Truth be told, many of these essays pulled at my fatherly heartstrings, but I am mostly disturbed by them. I am haunted by the fact that perhaps their parents are so scarred by their own years of boomer alienation that they now feel compelled to crush any sense of rebellion with the weight of a generation's love, coddling friendship, and smothering safety. I could be wrong, but it seems that there ought to be at least an edge of disdain for the SUV-driving, suburban-dwelling, vanilla affluence of their parents; instead, students remain hopelessly connected to them, not just by their ubiquitous cell phones but also by their parents' record collections.

The collateral damage here has little to do with contemporary debates about politics in the classroom and everything to do with students' ability to live life freely and creatively. There are glimmers of hope, but they're only glimmers. One particularly sharp student trailed me back to the office after an intense discussion about the "authentic" in Bob Dylan’s work. "Why," he asked longingly, "don’t we learn more about this in college?" Honoring the sincerity of his quest, I resisted the retort, "Because you're supposed to be talking about this with your buddies in the dorm."

Ah, you say, but this is the hip hop generation, so why should I worry about rock and roll? Despite explicitly opening the assignment up to any genre, few of my students chose to write about rap, which I found astonishing. Their commitment to most hip hop (except for the lonely black student from Detroit) was very thin and interwoven with ambivalence. Rap simply seems to be what's out there. They know the genre's prime has passed, that the heart has been taken out of it by the record industry.

At the same time, white indie rock has been devoid of soul and blues influences -- drained of the alchemical lifeblood created in the synthesis of white and black musical traditions. Indie is left with a whiny, trebly, irresolute sound that seems to fit the dull green glow of a computer screen in darkened suburban bedrooms. Music today is just another part of the price of America's re-segregation.

My own kids' strange connection to Dylan and the Clash at the tender ages of 7 and 10 suggest that I may be well on my way toward being part of the problem. Am I screwing them up by not adequately screwing them up, softly indoctrinating them into the glory days of rock and roll over family brunch on Sunday? Will they learn about the backbeat of power and rebellion at the displays of the Rock and Roll Hall of Fame instead of the more illicit places they ought to be receiving such education?

Of course, the most famous momma's boy of them all was the king of rock 'n' roll himself, Elvis Presley, and in that fact there is hope for the youth of America. But that was before cool had become one of the official anchors of consumer capitalism, before the commercialization of dissent had extended into every crevice of American culture. If the reason “Why Johnny Can't Dissent,” as Tom Frank put it, is the commodification of resistance, maybe it's also why Johnny doesn't know his rock from his rebellion.

Author/s: 
Jefferson Cowie
Author's email: 
info@insidehighered.com

Jefferson Cowie is an associate professor at Cornell University.

'The Great Debaters': A Challenge to Higher Education

“The banquet of my Wiley years was the tutelage of Tolson.”
-- James Farmer, Lay Bare the Heart

Over the holidays, many may have gone to the theater to see The Great Debaters, the major motion picture from Denzel Washington and Oprah Winfrey. The film tells the extraordinary tale of the 1935 Wiley College debate team, its legendary coach Melvin B. Tolson and his most famous student, Dr. James L. Farmer Jr. One of the “Big Four” leaders of the Civil Rights Movement, Farmer put his debate training to use as the architect of the movement’s strategy of non-violent protest and direct action.

Most of the attention lavished on the movie has focused on how it helps audiences reflect on the ways in which racism permeates society. But the film also creates an opportunity for -- and poses a challenge to -- colleges and universities to provide all students with the fundamental academic experience that is debate. At a time when higher education is simultaneously financially constrained and seemingly awash in projects to create centers of excellence (teaching, civic engagement, service learning, and deliberative democracy), The Great Debaters reminds us that academic debate is a proven investment in the core values of our institutional missions.

Washington, who both directs and stars in the film, has taken the lead by donating $1 million to reestablish the Wiley College team, which lapsed after Tolson’s departure from the school. Washington’s generosity is a testament to his belief in the power and virtue of a debate education and a wake-up call to institutions of higher education to make academic debate a part of any serious strategic plan.

We all value the skills of argument and critical thinking; intercollegiate debate teaches these -- and much more. Indeed, there is no better vehicle for stimulating undergraduate research, fostering tolerance and open-mindedness, instigating engagement with the issues of the day, promoting understanding of global connections and inculcating the method of interdisciplinarity. Debate constitutes a series of connected academic experiences, teaching students to pose serious questions and to seek out answers. Participation in debate, at any level, is life-altering and has real consequences for students and their institutions alike. The skills, knowledge and habits of mind nurtured through academic debate are on display every day in virtually every profession, not the least of which is higher education.

A few years ago, John E. Sexton, president of New York University, said that his four years in high school debate “were the educational foundation of everything I did.” He continued: “I'm saying the finest education I got from any of the institutions I attended, the foundation of my mind that I got during those four years of competitive policy debate; that is, 90 percent of the intellectual capacity that I operate with today -- Fordham [University] for college, Fordham for the Ph.D., Harvard for law school -- all of that is the other 10 percent.” But debate skills are not reserved only for exceptional students like Farmer and Sexton. All students should have the benefits of a debate education.

Because audiences around the globe will see The Great Debaters, higher education has a rare opportunity to promote this fundamental activity and garner support for it. How can we in higher education see this film, understand its message, and not return to our campuses to make those opportunities available to students? College administrators should be rushing to build strong debate programs at institutions where none presently exist. Meanwhile, universities that already have such programs should exercise a leadership role by committing to reinforce and showcase them.

Compared to intercollegiate athletics and other costly endeavors, debate is, dollar for dollar, an efficient use of institutional resources. It requires no multimillion dollar complexes, playing fields, stadiums or expensive equipment. All that is necessary are classrooms, coaches, office supplies and support for travel and research. Debate is an inexpensive, educational and effective way to both promote schools and enhance the quality of the academic experience.

The movement to rediscover debate has already begun. Urban debate leagues at the middle and high school level are flourishing under the leadership of the National Association for Urban Debate Leagues, and The Great Debaters will undoubtedly cause demand for debate to surge in the coming years. However, these leagues cannot shoulder the burden of a nationwide debate renaissance alone. They need colleges and universities to take a leadership role. Specifically, higher education must do three things.

First, we need to create viable opportunities for high school graduates who seek to continue their debate education after high school. Creating new programs and reinforcing existing programs is essential.

Second, and equally important, we must recruit, train and produce a new generation of professional debate educators. There are many middle and high schools around the country eager to offer debate opportunities to students, but they are unable to find qualified teachers with debate experience because the demand for quality coaches far outstrips the supply. To meet this shortfall, our institutions must generate capacity by fielding debate programs that give students opportunities to learn the coaching craft through rich individual learning experiences. In addition, thoughtful consideration should be given to the ways in which such a commitment spurs curricular innovation at both the undergraduate and graduate level as well as educational partnerships of local, regional and state constituencies. Finally, the creation of new opportunities to join the debate teaching fraternity must move in lockstep with efforts to retain, reward, and renew our best debate teachers.

Third, as the nation’s longstanding incubators of free expression, innovative thinking, democratic deliberation and social change, colleges and universities must do more to promote the role of debate as a necessary component of a well-functioning society. Strong debate programs are essential because they showcase best practices. Debate programs are and should be key players in efforts to foster civic engagement and democratic responsibility.

The Great Debaters reminds us that the values of debate are the values of the academy itself. Even critics will admit that debate’s shortcomings stem as much as anything from insufficient institutional commitment to a debate education. To be true to our core values, we need to promote the activities that create better students and better citizens. Debate does this. An America where academic debate is a prominent fixture on every campus would be a better America. Every college and university has many James Farmers strolling its hallways and quadrangles, but we must lay the foundation for their achievement. There will be no better opportunity to bring this to fruition than the one that now lies before us. The time for debate is now.

Author/s: 
Timothy M. O'Donnell
Author's email: 
newsroom@insidehighered.com

Timothy M. O’Donnell is chair of the National Debate Tournament Committee and director of debate at the University of Mary Washington, in Fredericksburg, Va.

'Telling' Hard Truths About War

With American society divided over the war, returning veterans tend to be viewed more as issues than as individuals. Recent news media coverage has focused on stories about soldiers suffering from post-traumatic stress disorder who have become violent criminals and on the trials of wounded vets who receive substandard medical treatment. Unquestionably, these are important issues. However, with the Iraq War entering its sixth year and the White House indicating that troop levels will remain at 130,000 for an indeterminate time, facilitating the return of the “average Joe” soldier is an increasingly pressing concern that remains largely ignored.

Since the adoption of the GI Bill during the Second World War, colleges and universities, like the one where I teach, have served as primary gateways through which many vets have found a path back into civilian life. Yet campuses today tend to have visible and vocal anti-war segments among their faculty and students. Ironically, in the post-Vietnam era the GI Bill, a tool designed to facilitate reintegration, places student-vets in environments that many find unwelcoming at best, exclusionary at worst.

As a pacifist, I want to see an end to the Iraq War, the sooner the better. As a citizen, I feel guilty that this desire is my sole contribution. As a result, I don’t know how to engage, how to approach the increasing number of returning vets I encounter in my day-to-day life, inside the classroom and out. When a friend, the University of Oregon administrator Jonathan Wei, told me about an innovative play being performed by student-veterans there, I was immediately intrigued.

Eugene, often referred to as the “Berkeley of Oregon,” has been described as “famously anti-war.” Bumper stickers denouncing the war are ubiquitous, and the words “the War” are commonly graffitied onto stop signs. For Oregon student-vets, feelings of estrangement and isolation were common. Many described to Wei feeling “invisible” and being “unable to connect with friends” upon their return from service. One, a Korean-American woman who had been deployed to Guantanamo, summed up her experience this way: “I just had to keep to myself, keep my head down, go to class, come home. Honest to god, it was like me having to pretend I wasn’t Asian.”

To confront the disconnection that so many felt, first the UO student-vets organized, forming the Veterans and Family Student Association (VFSA). Next, they created a play.

The project began after Wei, in his capacity as coordinator of nontraditional student programs, staged several panel discussions with the 20-odd members of the newly formed veteran students’ group during the 2006-7 academic year. From the meetings, Wei began to see the limits inherent in approaching veterans as a demographic or political issue. He encouraged the association to help the university’s Women's Center (another group he worked with) stage a production of Eve Ensler’s “Vagina Monologues.” Inspired by the experience, Wei and the veterans’ group began work on their own original play.

From the start, “Telling” was about the communal process of creation as much as it was about the eventual product itself. Wei and Max Rayneard, a South African Fulbright scholar and Ph.D. candidate in comparative literature, interviewed 21 VFSA members during the summer and fall of 2007. Wei and Rayneard used the transcripts to write the text. John Schmor, the head of Oregon’s department of theatre arts and a self-described “Prius-driving Obama sticker guy,” signed on to direct as soon as Wei approached him with the idea. For most of the student-vets, “Telling” would be their first time on stage. To prepare them, Schmor offered a performance class during the fall semester geared especially toward the production.

Two hundred eighty-five people crammed into the Veterans Memorial Building auditorium near the campus on February 7 for opening night of the three-show run. This was 45 more than expected; another 40 had to be turned away. Three hundred attended the second performance, 245 the third. Among them were current military personnel and veterans, young men with close-cropped hair and “old timers,” grizzled and graying; UO students, some even younger; a smattering of university faculty; university, city and state officials, including representatives from U.S. Sen. Ron Wyden’s and Rep. Peter DeFazio’s offices; and townsfolk of all stripes.

“It was a mix of people like I’ve never seen at a production in Eugene,” said Schmor, who has been involved with theater in the city since his time as a graduate student here, in 1988.

I attended the first show. The play’s success stemmed from the connection it created between performer and audience. We, in the audience, sat close to the bare stage and close to one another. “Telling” mixed monologues and blocked scenes that described enlistment and boot camp, deployment, and the return to civilian life, giving the student-vets a voice they would not otherwise have. The multiracial cast of 11 included three women and eight men. Nine were former soldiers, sailors, airmen and marines, one a recruit, and one a military wife. They played themselves as well as the recruiters, drill sergeants and fellow soldiers who populated their various experiences in the armed forces.

Watching the student-vets act out their experiences allowed me to reconsider my oftentimes sensational and conflicted impressions of the military. For me, the performance transformed the people on stage from “veterans” to individuals with goals and dreams not so unlike those of nearly every student I teach. They were boyfriends and girlfriends, brothers and sisters, history majors and student-teachers, wanna-be musicians and Peace Corps aspirants. Yet they were also young men and women who have had extraordinary experiences in the name of service, young people whose stories we, as a community, need to hear, no matter how difficult it is to do so.

Watching, I felt energized, edified and also entertained, as the performers were really funny. I, along with those around me, frequently burst into explosive laughter. There were also many audible sighs. When it was done, we all gave the vets a standing ovation.

I wasn’t alone in being moved. Activists from the peace vigil that has held weekly protests outside Eugene’s Federal Building since the Iraq War began attended opening night. Exiting the auditorium, one enthusiastically said to another, “By the end I really came to love them.”

For the VFSA, a significant goal, beyond initiating community dialogue, was outreach -- to make other vets aware of the organization. On that score, the play was also successful, as membership in the organization continues to grow at an unusually high rate, from the original 20 to over 75 since the play’s February run. (The university estimates there to be about 400 veterans on campus, though the actual number is impossible to verify, as only those on the GI Bill have to identify themselves.)

Another goal of the veteran students’ group was to strive for greater exposure, and beyond Eugene, “Telling” has generated immediate buzz. Since the play’s opening, several colleges and universities around the state have contacted the association to have “Telling” performed on their campuses, as have a handful of veterans’ organizations. This has resulted in a scheduled five-venue tour for this coming summer. Likewise, at the Student Veterans of America Conference, hosted by Vetjobs and the Illinois Department of Veterans Affairs in Chicago, the VFSA was adopted as a national model for organizations across the country looking into the issue of veterans on campuses.

Wei, who now lives in Austin, Texas, has begun working to make the project formula portable so that student-veteran groups nationwide can adapt “Telling” to their own memberships and communities. The process of interviews/script/performance requires specific local application to be most beneficial. While Eugene’s version might resonate with Austin audiences, for instance, it will only truly do the work of reconnecting vets and communities if a University of Texas version is produced, with UT student-vets speaking to individuals from their own community.

Given the UO success, this is an outcome worth aspiring to.

Author/s: 
David Wright
Author's email: 
newsroom@insidehighered.com

David Wright, author of Fire on the Beach: Recovering the Lost Story of Richard Etheridge and the Pea Island Lifesavers (Scribner 2001), teaches at the University of Illinois at Urbana-Champaign.

Towards Helhaven

"WALL-E," the latest animated production from Pixar Studios, is a heartwarming children’s film about ecological disaster. Its title character is a sturdy little trash-compacting robot whose name is the abbreviation for Waste Allocation Load-Lifter, Earth-class. He has been programmed to clear the vast junkpile left behind by mankind, which has long since absconded to live on a space station. His only companion -- at least as the film begins -- is a cockroach. Through plot developments it would spoil things to describe, WALL-E is transported to the human colony in deep space. In eight hundred years, it seems, our civilization will be a fusion of Wal-Mart, Club Med, and the World Wide Web.

Lots of kids will get their first taste of social satire from this film -- and chances are, they are going to enjoy it. Yet there is more to what Pixar has done than that. Some of the images are breathtaking. It turns out that robots have their romantic side, or at least WALL-E does; and the sight of him rescuing mementos from the wreckage (fragments shored up amidst human ruin) is perhaps more touching than the love story that later emerges.

I had heard almost nothing about the film before attending, so was not at all prepared for a strange surprise: It kept reminding me of Kenneth Burke’s writings about a grim future world he called Helhaven.

Burke, who died 15 years ago at the age of 96, was a poet, novelist, and critic who belonged to a cohort of modernist writers that included Hart Crane, Djuna Barnes, and William Carlos Williams. His name is not exactly a household word. It does not seem very likely that anyone at Pixar was counting on someone in the audience thinking, “Hey, this is a little bit like the essays that Kenneth Burke published in a couple of literary magazines in the early 1970s.” And I sure don’t mean to start an intellectual-property lawsuit here. The margin of overlap between Pixar and KB (as admirers tend to call him) is not a matter of direct influence. Rather, it’s a matter of each drawing out the most worrying implications of the way we live now.

Burke’s fiction and poetry tend to be overlooked by chroniclers of American literary history. But his experimental novel Towards a Better Life has exercised a strong influence on other writers -- especially Ralph Ellison, whose Invisible Man was deeply shaped by it. He also had a knack for being in interesting places at the right time. For example, he discovered and made the first English translation of Thomas Mann’s Death in Venice; and in the course of his day job as editor for The Dial, Burke helped prepare for its initial American publication a poem called “The Waste Land,” by one T.S. Eliot.

By the early 1930s, his occasional writings on aesthetic questions began to give shape to an increasingly systematic effort to analyze the full range of what Burke called “symbolic action,” a term that subsumed the entire range of human culture. His books were all over the disciplinary map -- part philosophy, part sociology, dashes of anthropology, plus elements from literature in various languages thrown in for good measure -- all tied together through his own idiosyncratic idioms.

Alas, given the vagaries of translation, Burke seems to have gone largely unnoticed by his theoretical peers in Europe; but it is fair to say that Burke’s method of “dramatism” is a kind of rough-hewn Yankee structuralism. His later speculations on “logology” have certain semi-Lacanian implications, even though KB was unaware of the French psychoanalyst’s work until very late in the game.

Along the way, Burke seems to have pioneered something that has only been given a name in more recent decades: the field of ecocriticism. In a book from 1937 called Attitudes Toward History, he noted that, among the recently emerging fields of study, “there is one little fellow called Ecology, and in time we shall pay him more attention.”

Burke often used the first-person plural -- so it is easy to read this as saying he meant to get back to the subject eventually. But his wording also implied that everyone would need to do so, sooner or later. Ecology teaches us “that the total economy of the planet cannot be guided by an efficient rationale of exploitation alone,” wrote Burke more than 70 years ago, “but that the exploiting part must eventually suffer if it too greatly disturbs the balance of the whole.”

In the early 1970s, Burke returned to this theme in a couple of texts that now seem more prophetic than ever. The Helhaven writings first appeared in The Sewanee Review and The Michigan Quarterly Review, and have been reprinted in the posthumous collection On Human Nature: A Gathering While Everything Flows, 1967-1984, published five years ago by the University of California Press.

The Helhaven writings -- a blend of science fiction and critical theory, with some of KB’s own poetry mixed in -- fall outside the familiar categories for labeling either creative or scholarly prose. In them, Burke imagined a future in which everyone who could escape from Earth did, relocating to a new, paradise-like home on the lunar surface he called Helhaven. The name was a pun combining “haven,” “heaven,” and “hell.”

The immediate context for Burke’s vision bears remembering: The Apollo missions were in progress, the first Earth Day was celebrated in 1970, and the release of the Pentagon Papers was making “technocratic rationality” sound like an oxymoron. And comments in the Helhaven writings make it clear all of these circumstances were on the author’s mind.

But just as important, it seems, was Burke’s realization that American life had completely trumped his previous effort to satirize it. At the very start of the Great Depression, Burke published a Jonathan Swift-like essay in The New Republic calling for his fellow citizens to destroy more of their natural resources. This was, he wrote, the key to prosperity. The old Protestant ethic of self-control and delayed gratification was a brake on the economy. “For though there is a limit to what a man can use,” he wrote, “there is no limit to what he can waste. The amount of production possible to a properly wasteful society is thus seen to be enormous.”

And if garbage was good, war was better. “If people simply won’t throw things out fast enough to create new needs in keeping with the increased output under improved methods of manufacture,” suggested Burke, “we can always have recourse to the still more thoroughgoing wastage of war. An intelligently managed war can leave whole nations to be rebuilt, thus providing peak productivity for millions of the surviving population.”

Not everyone understood that Burke’s tongue was in cheek. A newspaper columnist expressed outrage, and the letters of indignation came pouring in. Burke’s editor at The New Republic told him that this invariably happened with satire. Some readers always took it seriously and got mad.

Four decades later, though, Burke saw an even greater problem. The joking recommendation he had made in the 1930s to stimulate the economy via waste was, by the 1970s, a policy known as “planned obsolescence.” The idea of war as an economic stimulus package evidently has its enthusiasts, too.

Furthermore, Burke now thought that the wasteful imperative was subsumed under what he called hypertechnologism -- the tendency for technology to develop its own momentum, and to reshape the world on its own terms. We had created machines to control and transform nature. But now they were controlling and transforming us. Our desires and attitudes tended to be the products of the latest innovations, rather than vice versa. (And to think that Burke died well before the rise of today’s market in consumer electronics.)

This wasn’t just a function of the economic system. It seemed to be part of the unfolding of our destiny as human beings. Borrowing a term from Aristotle, Burke referred to it as a manifestation of entelechy -- the tendency of a potential to realize itself. “Once human genius got implemented, or channelized, in terms of technological proliferation,” wrote Burke in 1974, “how [could we] turn back? Spontaneously what men hope for is more. And what realistic politician could ever hope to win on a platform that promised less?”

We were in “a self-perpetuating cycle,” he mused, “quite beyond our ability to adopt any major reforms in our ways of doing things.” Besides, failure to trust in progress is un-American. And so Burke tried to carry his speculations to their most extreme conclusion.

Suppose a beautiful lake were being turned into a chemical waste dump. Why try to figure out how to fix it? “That would be to turn back,” wrote Burke, “and we must fare ever forward. Hence with your eyes fixed on the beacon of the future, rather ask yourselves how, if you but polluted the lake ten times as much, you might convert it into some new source of energy ... a new fuel.”

By further extrapolation, Burke proposed letting the whole planet turn into a vast toxic cesspool as we built a new home -- a “gigantic womb-like Culture-Bubble, as it were” -- on the moon. The beautiful landscapes of Old Earth could be simulated on gigantic screens. Presumably there would be artificial gravity. Everything natural could be simulated by purely technological means.

We would have to take occasional trips back to be replenished by “the placenta of the Mother Earth,” our source for raw materials. Or rather, polluted materials. (Scientists on Helhaven would need to figure out how to purify them for human use.) Burke imagines a chapel on the lunar surface with telescopes pointed towards the Earth, with a passage from the Summa Theologica of Thomas Aquinas inscribed on the wall: “And the blessed in heaven shall look upon the sufferings of the damned, that they may love their blessedness more.”

The Helhaven writings seem darker -- and, well, battier -- than "WALL-E." Burke’s late work can get awfully wild, woolly, and self-referential; and these texts are a case in point. His imaginative streak is constantly disrupted by his theoretical glossolalia. He can barely sketch an image before his critical intelligence interrupts to begin picking it apart. The Helhaven texts, as such, can only appeal to someone already preoccupied with Burke's whole body of thought. You won't ever find in them the charm of watching a little robot struggle with a ping-pong paddle-ball.

But the similarities between KB’s perspective and that of the Pixar film are more striking than the differences. Both are warnings -- in each case, with a clear implication that the warning may have come much too late. For the point of such visions is not to picture how things might turn out. The planet-wide trash dump is not part of the future. Nor is the culture-bubble to be found in outer space. They are closer to us than that.

“Think of the many places in our country where the local drinking water is on the swill side, distastefully chlorinated, with traces of various contaminants,” he wrote almost four decades ago. “If, instead of putting up with that, you invest in bottled springwater, to that extent and by the same token you are already infused with the spirit of Helhaven. Even now, the kingdom of Helhaven is within you.”

Aquafina or Deer Park, anyone?

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
