Arts

'Telling' Hard Truths About War

With American society divided over the war, returning veterans tend to be viewed more as issues than as individuals. Recent news media coverage has focused on stories about soldiers suffering from post-traumatic stress disorder who have become violent criminals and on the trials of wounded vets who receive substandard medical treatment. Unquestionably, these are important issues. However, with the Iraq War entering its sixth year and the White House indicating that troop levels will remain at 130,000 for an indeterminate time, facilitating the return of the “average Joe” soldier is an increasingly pressing issue that remains largely ignored.

Since the adoption of the GI Bill during the Second World War, colleges and universities, like the one where I teach, have served as primary gateways through which many vets have found a path back into civilian life. Yet campuses today tend to have visible and vocal anti-war segments among their faculty and students. Ironically, in the post-Vietnam era the GI Bill, a tool designed to facilitate reintegration, places student-vets in environments that many find unwelcoming at best, exclusionary at worst.

As a pacifist, I want to see an end to the Iraq War, the sooner the better. As a citizen, I feel guilty that this desire is my sole contribution. As a result, I don’t know how to engage, how to approach the increasing number of returning vets I encounter in my day-to-day life, inside the classroom and out. When a friend, the University of Oregon administrator Jonathan Wei, told me about an innovative play being performed by student-veterans there, I was immediately intrigued.

Eugene, often referred to as the “Berkeley of Oregon,” has been described as “famously anti-war.” Bumper stickers denouncing the war are ubiquitous, and the words “the War” are commonly graffitied onto stop signs. For Oregon student-vets, feelings of estrangement and isolation were common. Many described to Wei feeling “invisible” and being “unable to connect with friends” upon their return from service. One, a Korean-American woman who had been deployed to Guantanamo, summed up her experience this way: “I just had to keep to myself, keep my head down, go to class, come home. Honest to god, it was like me having to pretend I wasn’t Asian.”

To confront the disconnection that so many felt, the UO student-vets first organized, forming the Veterans and Family Student Association (VFSA). Next, they created a play.

The project began after Wei, in his capacity as coordinator of nontraditional student programs, staged several panel discussions with the 20-odd members of the newly formed veteran students’ group during the 2006-7 academic year. From the meetings, Wei began to see the limits inherent in approaching veterans as a demographic or political issue. He encouraged the association to help the university’s Women's Center (another group he worked with) stage a production of Eve Ensler’s “Vagina Monologues.” Inspired by the experience, Wei and the veterans’ group began work on their own original play.

From the start, “Telling” was about the communal process of creation as much as it was about the eventual product itself. Wei and Max Rayneard, a South African Fulbright scholar and Ph.D. candidate in comparative literature, interviewed 21 VFSA members during the summer and fall of 2007. Wei and Rayneard used the transcripts to write the text. John Schmor, the head of Oregon’s department of theatre arts and a self-described “Prius-driving Obama sticker guy,” signed on to direct as soon as Wei approached him with the idea. For most of the student-vets, “Telling” would be their first time on stage. To prepare them, Schmor offered a performance class during the fall semester geared especially toward the production.

Two hundred eighty-five people crammed into the Veterans Memorial Building auditorium near the campus on February 7 for opening night of the three-show run. This was 45 more than expected; another 40 had to be turned away. Three hundred attended the second performance, 245 the third. Among them were current military personnel and veterans, young men with close-cropped hair and “old timers,” grizzled and graying; UO students, some even younger; a smattering of university faculty; university, city and state officials, including representatives from U.S. Sen. Ron Wyden’s and Rep. Peter DeFazio’s offices; and townsfolk of all stripes.

“It was a mix of people like I’ve never seen at a production in Eugene,” said Schmor, who has been involved with theater in the city since his time as a graduate student here, in 1988.

I attended the first show. The play’s success stemmed from the connection it created between performer and audience. We, in the audience, sat close to the bare stage and close to one another. “Telling” mixed monologues and blocked scenes that described enlistment and boot camp, deployment, and return to civilian life, giving the student-vets a voice they would not otherwise have. The multi-racial cast of 11 included three women and eight men. Nine were former soldiers, sailors, airmen and marines, one a recruit, and one a military wife. They played themselves as well as the recruiters, drill sergeants and fellow soldiers that characterized their various experiences in the armed forces.

Watching the student-vets act out their experiences allowed me to reconsider my oftentimes sensational and conflicted impressions of the military. For me, the performance transformed the people on stage from “veterans” to individuals with goals and dreams not so unlike those of nearly every student I teach. They were boyfriends and girlfriends, brothers and sisters, history majors and student-teachers, wanna-be musicians and Peace Corps aspirants. Yet they were also young men and women who have had extraordinary experiences in the name of service, young people whose stories we, as a community, need to hear, no matter how difficult it is to do so.

Watching, I felt energized, edified and also entertained, as the performers were really funny. I, along with those around me, frequently burst into explosive laughter. There were also many audible sighs. When it was done, we all gave the vets a standing ovation.

I wasn’t alone in being moved. Activists from the peace vigil that has held weekly protests outside Eugene’s Federal Building since the Iraq War began attended opening night. Exiting the auditorium, one enthusiastically said to another, “By the end I really came to love them.”

For the VFSA, a significant goal, beyond initiating community dialogue, was outreach -- to make other vets aware of the organization. On that score, the play was also successful, as membership in the organization continues to grow at an unusually high rate, from the original 20 to over 75 since the play’s February run. (The university estimates there to be about 400 veterans on campus, though the actual number is impossible to verify, as only those on the GI Bill have to identify themselves.)

Another goal of the veteran students’ group was to strive for greater exposure, and beyond Eugene, “Telling” has generated immediate buzz. Since the play’s opening, several colleges and universities around the state have contacted the association to have “Telling” performed on their campuses, as have a handful of veterans’ organizations. This has resulted in a scheduled five-venue tour for this coming summer. Likewise, at the Student Veterans of America Conference, hosted by Vetjobs and the Illinois Department of Veterans Affairs in Chicago, the VFSA was adopted as a national model for organizations across the country looking into the issue of veterans on campuses.

Wei, who now lives in Austin, Texas, has begun working to make the project formula portable so that student-veteran groups nationwide can adapt “Telling” to their own memberships and communities. The process of interviews/script/performance requires specific local application to be most beneficial. While Eugene’s version might resonate with Austin audiences, for instance, it will only truly do the work of reconnecting vets and communities if a University of Texas version is produced, with UT student-vets speaking to individuals from their own community.

Given the UO success, this is an outcome worth aspiring to.

By David Wright

David Wright, author of Fire on the Beach: Recovering the Lost Story of Richard Etheridge and the Pea Island Lifesavers (Scribner 2001), teaches at the University of Illinois at Urbana-Champaign.

Towards Helhaven

"WALL-E," the latest animated production from Pixar Studios, is a heartwarming children’s film about ecological disaster. Its title character is a sturdy little trash-compacting robot whose name is the abbreviation for Waste Allocation Load-Lifter, Earth-class. He has been programmed to clear the vast junkpile left behind by mankind, which has long since absconded to live on a space station. His only companion -- at least as the film begins -- is a cockroach. Through plot developments it would spoil things to describe, WALL-E is transported to the human colony in deep space. In eight hundred years, it seems, our civilization will be a fusion of Wal-Mart, Club Med, and the World Wide Web.

Lots of kids will get their first taste of social satire from this film -- and chances are, they are going to enjoy it. Yet there is more to what Pixar has done than that. Some of the images are breathtaking. It turns out that robots have their romantic side, or at least WALL-E does; and the sight of him rescuing mementos from the wreckage (fragments shored up amidst human ruin) is perhaps more touching than the love story that later emerges.

I had heard almost nothing about the film before attending, so was not at all prepared for a strange surprise: It kept reminding me of Kenneth Burke’s writings about a grim future world he called Helhaven.

Burke, who died 15 years ago at the age of 96, was a poet, novelist, and critic who belonged to a cohort of modernist writers that included Hart Crane, Djuna Barnes, and William Carlos Williams. His name is not exactly a household word. It does not seem very likely that anyone at Pixar was counting on someone in the audience thinking, “Hey, this is a little bit like the essays that Kenneth Burke published in a couple of literary magazines in the early 1970s.” And I sure don’t mean to start an intellectual-property lawsuit here. The margin of overlap between Pixar and KB (as admirers tend to call him) is not a matter of direct influence. Rather, it’s a matter of each drawing out the most worrying implications of the way we live now.

Burke’s fiction and poetry tend to be overlooked by chroniclers of American literary history. But his experimental novel Towards a Better Life has exercised a strong influence on other writers -- especially Ralph Ellison, whose Invisible Man was deeply shaped by it. He also had a knack for being in interesting places at the right time. For example, he discovered and made the first English translation of Thomas Mann’s Death in Venice; and in the course of his day job as editor for The Dial, Burke helped prepare for its initial American publication a poem called “The Waste Land,” by one T.S. Eliot.

By the early 1930s, his occasional writings on aesthetic questions began to give shape to an increasingly systematic effort to analyze the full range of what Burke called “symbolic action,” a term that subsumed the entire range of human culture. His books were all over the disciplinary map -- part philosophy, part sociology, dashes of anthropology, plus elements from literature in various languages thrown in for good measure -- all tied together through his own idiosyncratic idioms.

Alas, given the vagaries of translation, Burke seems to have gone largely unnoticed by his theoretical peers in Europe; but it is fair to say that Burke’s method of “dramatism” is a kind of rough-hewn Yankee structuralism. His later speculations on “logology” have certain semi-Lacanian implications, even though KB was unaware of the French psychoanalyst’s work until very late in the game.

Along the way, Burke seems to have pioneered something that has only been given a name in more recent decades: the field of ecocriticism. In a book from 1937 called Attitudes Toward History, he noted that, among the recently emerging fields of study, “there is one little fellow called Ecology, and in time we shall pay him more attention.”

Burke often used the first-person plural -- so it is easy to read this as saying he meant to get back to the subject eventually. But his wording also implied that everyone would need to do so, sooner or later. Ecology teaches us “that the total economy of the planet cannot be guided by an efficient rationale of exploitation alone,” wrote Burke more than 70 years ago, “but that the exploiting part must eventually suffer if it too greatly disturbs the balance of the whole.”

In the early 1970s, Burke returned to this theme in a couple of texts that now seem more prophetic than ever. The Helhaven writings first appeared in The Sewanee Review and The Michigan Quarterly Review, and have been reprinted in the posthumous collection On Human Nature: A Gathering While Everything Flows, 1967-1984, published five years ago by the University of California Press.

The Helhaven writings -- a blend of science fiction and critical theory, with some of KB’s own poetry mixed in -- fall outside the familiar categories for labeling either creative or scholarly prose. In them, Burke imagined a future in which everyone who could escape from Earth did, relocating to a new, paradise-like home on the lunar surface he called Helhaven. The name was a pun combining “haven,” “heaven,” and “hell.”

The immediate context for Burke’s vision bears remembering: The Apollo missions were in progress, the first Earth Day was celebrated in 1970, and the release of the Pentagon Papers was making “technocratic rationality” sound like an oxymoron. And comments in the Helhaven writings make it clear all of these circumstances were on the author’s mind.

But just as important, it seems, was Burke’s realization that American life had completely trumped his previous effort to satirize it. At the very start of the Great Depression, Burke published a Jonathan Swift-like essay in The New Republic calling for his fellow citizens to destroy more of their natural resources. This was, he wrote, the key to prosperity. The old Protestant ethic of self-control and delayed gratification was a brake on the economy. “For though there is a limit to what a man can use,” he wrote, “there is no limit to what he can waste. The amount of production possible to a properly wasteful society is thus seen to be enormous.”

And if garbage was good, war was better. “If people simply won’t throw things out fast enough to create new needs in keeping with the increased output under improved methods of manufacture,” suggested Burke, “we can always have recourse to the still more thoroughgoing wastage of war. An intelligently managed war can leave whole nations to be rebuilt, thus providing peak productivity for millions of the surviving population.”

Not everyone understood that Burke’s tongue was in cheek. A newspaper columnist expressed outrage, and the letters of indignation came pouring in. Burke’s editor at The New Republic told him that this invariably happened with satire. Some readers always took it seriously and got mad.

Four decades later, though, Burke saw an even greater problem. The joking recommendation he made in the 1930s to stimulate the economy via waste was, by the 1970s, a policy known as “planned obsolescence.” The idea of war as an economic stimulus package evidently has its enthusiasts, too.

Furthermore, Burke now thought that the wasteful imperative was subsumed under what he called hypertechnologism -- the tendency for technology to develop its own momentum, and to reshape the world on its own terms. We had created machines to control and transform nature. But now they were controlling and transforming us. Our desires and attitudes tended to be the products of the latest innovations, rather than vice versa. (And to think that Burke died well before the rise of today’s market in consumer electronics.)

This wasn’t just a function of the economic system. It seemed to be part of the unfolding of our destiny as human beings. Borrowing a term from Aristotle, Burke referred to it as a manifestation of entelechy -- the tendency of a potential to realize itself. “Once human genius got implemented, or channelized, in terms of technological proliferation,” wrote Burke in 1974, “how [could we] turn back? Spontaneously what men hope for is more. And what realistic politician could ever hope to win on a platform that promised less?”

We were in “a self-perpetuating cycle,” he mused, “quite beyond our ability to adopt any major reforms in our ways of doing things.” Besides, failure to trust in progress is un-American. And so Burke tried to carry his speculations to their most extreme conclusion.

Suppose a beautiful lake were being turned into a chemical waste dump. Why try to figure out how to fix it? “That would be to turn back,” wrote Burke, “and we must fare ever forward. Hence with your eyes fixed on the beacon of the future, rather ask yourselves how, if you but polluted the lake ten times as much, you might convert it into some new source of energy ... a new fuel.”

By further extrapolation, Burke proposed letting the whole planet turn into a vast toxic cesspool as we built a new home -- a “gigantic womb-like Culture-Bubble, as it were” -- on the moon. The beautiful landscapes of Old Earth could be simulated on gigantic screens. Presumably there would be artificial gravity. Everything natural could be simulated by purely technological means.

We would have to take occasional trips back to be replenished by “the placenta of the Mother Earth,” our source for raw materials. Or rather, polluted materials. (Scientists on Helhaven would need to figure out how to purify them for human use.) Burke imagines a chapel on the lunar surface with telescopes pointed towards the Earth, with a passage from the Summa Theologica of Thomas Aquinas inscribed on the wall: “And the blessed in heaven shall look upon the sufferings of the damned, that they may love their blessedness more.”

The Helhaven writings seem darker -- and, well, battier -- than "WALL-E." Burke’s late work can get awfully wild, woolly, and self-referential; and these texts are a case in point. His imaginative streak is constantly disrupted by his theoretical glossolalia. He can barely sketch an image before his critical intelligence interrupts to begin picking it apart. The Helhaven texts, as such, can only appeal to someone already preoccupied with Burke's whole body of thought. You won't ever find in them the charm of watching a little robot struggle with a ping-pong paddle-ball.

But the similarities between KB’s perspective and that of the Pixar film are more striking than the differences. Both are warnings -- in each case, with a clear implication that the warning may have come much too late. For the point of such visions is not to picture how things might turn out. The planet-wide trash dump is not part of the future. Nor is the culture-bubble to be found in outer space. They are closer to us than that.

“Think of the many places in our country where the local drinking water is on the swill side, distastefully chlorinated, with traces of various contaminants,” he wrote almost four decades ago. “If, instead of putting up with that, you invest in bottled springwater, to that extent and by the same token you are already infused with the spirit of Helhaven. Even now, the kingdom of Helhaven is within you.”

Aquafina or Deer Park, anyone?

By Scott McLemee

The Moviegoer

Whatever happened to cinephilia? Does it still exist? I mean, in particular, the devotion of otherwise bookish souls to the screen. (The big screen, that is, not the kind you are looking at now.) Do they still go to movies the way they once did? With anything like the passion, that is – the connoisseurship, the sheer appetite for seeing and comparing and discussing films?

I don’t think so. At least very few people that I know do. And certainly not in the way documented in Reborn (Farrar, Straus and Giroux), the recently published edition of Susan Sontag’s journals, which includes a few pages from a notebook listing the dozens of films the author attended over just three weeks in early 1961. An editorial comment provides more detail about Sontag’s record of her moviegoing that year: “On no occasion is there a break of more than four days between films seen; most often, SS notes having seen at least one, and not infrequently two or three per day.”

This was not just one writer’s personal quirk. It was clearly a generational phenomenon. In a memoir of his days as a student of philosophy at the Sorbonne in the late fifties and early sixties, the French political theorist Regis Debray describes how he and his friends would go from seminars to the cinema as often as their stipends allowed.

“We could afford to enjoy it several times a week,” he writes. “And that is not counting those crisis days when our satisfied and yet insatiable desire made us spend whole afternoons in its darkness. No sooner had we come out, scarcely had we left its embrace, our eyes still half-blind, than we would sit round a café table going over every detail.... Determinedly we discussed the montage, tracking shots, lighting, rhythms. There were directors, unknown to the wider public, whose names I have now forgotten, who let slip these passwords to the in-group of film enthusiasts. Are they still remembered, these names we went such distances to see? .... It may well be the case that our best and most sincere moments were those spent in front of the screen.”

Debray wrote this account of cinemania in the late spring of 1967, while imprisoned in Bolivia following his capture by the military. He had gone there on a mission to see Che Guevara. An actor bearing a striking resemblance to the young Debray appears in the second part of Steven Soderbergh’s Che, now in theaters.

That passage from his Prison Writings (published by Random House in the early 1970s and long out of print; some university press might want to look into this) came to mind on a recent weekday afternoon.

After a marathon course of reading for several days, I was sick of print, let alone of writing, and had snuck off to see Soderbergh’s film while it was still in the theater, on the assumption that it would lose something on the video screen. There was mild guilt: a feeling that, after all, I really ought to be doing some work. Debray ended up feeling a bit of guilt as well. Between trips to the cinema and arguing over concepts in Louis Althusser’s classroom, he found himself craving a more immediate sense of life – which was, in part, how he ended up in the jungles of Bolivia, and then in its prisons.

Be that as it may, there was something appealing about this recollection of his younger self, which he composed at the ripe old age of 26. The same spirit comes through in the early pages of Richard Brody's Everything is Cinema: The Working Life of Jean-Luc Godard (Metropolitan Books), now a finalist for one of the National Book Critics Circle awards. Brody evokes the world of cinema clubs in Paris that Godard fell into after dropping out of school – from which there emerged a clique of Left Bank intellectuals (including Francois Truffaut, Claude Chabrol, and Eric Rohmer) who first wrote critical essays on film for small magazines and then began directing their own.

They got their education by way of mania – which was communicable: Debray and Sontag were examples of writers who caught it from the New Wave directors. Another would be the novelist, poet, and linguist Pier Paolo Pasolini, who also started making films in the early sixties.

It’s not clear who the contemporary equivalents would be. In the mid-1990s you heard a lot about how Quentin Tarantino had worked in a video store and immersed himself in the history of film in much the same way that the French directors had. But the resemblance is limited at best. Godard engaged in a sustained (if oblique) dialogue with literature and philosophy in his films -- while Tarantino seems to have acquired a formidable command of cinematic technique without ever having anything resembling a thought in his head. Apart, of course, from “violence is cool,” which doesn’t really count.

These stray musings come via my own reading and limited experience. They are impressions, nothing more – and I put them down in full awareness that others may know better. My own sense of cinephilia's decline may reflect the fact that all of the movie theaters in my neighborhood (there used to be six within about a 15-minute walk) have gone out of business over the past ten years.

But over the same period cable television, Netflix, and the Internet have made it easier to see films than ever before. It is not that hard to get access to even fairly obscure work now. Coming across descriptions of Godard’s pre-Breathless short films, I found that they were readily available via YouTube. And while Godard ended up committing a good deal of petty crime to fund those early exercises, few aspiring directors would need to do so now: the tools for moviemaking are readily available.

So have I just gotten trapped (imprisoned, like Debray in Bolivia) by secondhand nostalgia? It wouldn't be the first time. Is cinephilia actually alive and well? Is there an underground renaissance, an alternative scene of digital cine clubs that I’m just not hearing about? Are you framing shots to take your mind off grad school or the job market? It would be good to think so -- to imagine a new New Wave, about to break.

By Scott McLemee

Pete Seeger, America's Teacher

The January concert at the Lincoln Memorial celebrating the inauguration of President Barack Obama offered many stirring moments, but perhaps its highlight was Pete Seeger leading a chorus of hundreds of thousands of people singing "This Land Is Your Land." This is where Americans expect to see Pete Seeger, raising his voice for change, even when it’s cold outside.

Seeger has been singing folk music for change for more than 70 years now, sometimes in the middle of storms, sometimes causing them. Defiantly leftist, pacifist, and for a decade or so, Communist, Seeger has embraced almost every major reformist cause of the 20th century. He’s sung and spoken out for organized labor, against McCarthyism, in support of Civil Rights, against the Vietnam War, and -- from the deck of the sloop Clearwater, a “ship of song” which he helped to build -- his voice put early wind into the sails of the environmental movement.

Now 89, Seeger has witnessed his own transformation into an icon. President Clinton bestowed the National Medal of the Arts on him in 1994, and The Library of Congress named him a “Living Legend” in 2000. In 2007 PBS released "Pete Seeger: The Power of Song" as part of its American Masters documentary series. Seeger’s 90th birthday concert in New York on May 3rd will be a star-studded affair featuring luminaries from Ani DiFranco to Bruce Springsteen.

One of the words you hear applied most often to Pete Seeger these days is “genius.” A 2005 album even proclaims him a “Genius of Folk.” But instead we should think of Pete Seeger as America’s teacher, an “inconvenient artist” (as Clinton put it) who taught the conflicts before other people got to them.

As the energetic teacher to a nation, Seeger has let his songs do the instruction. He’s the author of evergreens like “If I Had a Hammer” and “Turn, Turn, Turn,” and as a member of the Weavers he helped bring folk music into the commercial mainstream before he and the rest of the group were blacklisted during the Red-baiting fifties. But Pete Seeger is perhaps best known as a walking, talking American songbook whose encyclopedic contents are accompanied by his banjo or guitar—and by the voices of his audience. Springsteen, who in 2006 made a memorably vital album of songs called "The Seeger Sessions," says that, “Pete’s library is so vast that the entire history of the country is there.”

Like Benjamin Franklin before him, Pete Seeger has long sought to lead an exemplary life. Woody Guthrie, whose friendship with the teenaged Seeger in the late 1930s became one of the formative events in the young man’s artistic development, was amazed that Seeger didn’t drink, smoke, or chase women. Some of Seeger’s asceticism may have come from the New England culture he was brought up in, but it’s also clear that Seeger consciously chose to live a certain way, and that his principled choices inform his entire adulthood.

And so when the time came for Seeger to face the Communist-hunting House Un-American Activities Committee (HUAC) in 1955, he refused to discuss his politics or his associations. He didn’t plead the Fifth, though. Instead, he took the First. “Using the Fifth Amendment,” Seeger explained, “is in effect saying, ‘you have no right to ask me this question’; but using the First Amendment means in effect, ‘you have no right to ask any American such questions.' ” This courageous gesture resulted in a conviction for Contempt of Congress that kept Seeger suspended, a hair from jail, for nearly seven years before it was tossed out on a technicality in 1962.

***

Pete Seeger has been a teacher to three generations of my family. I'm in the middle one of the three, and my memorable Pete Seeger moments include the 1980 reunion of the Weavers at Carnegie Hall, a final appearance by the group shortly before one of its mainstays, Lee Hays, died. (That reunion is the subject of the excellent 1982 documentary "Wasn’t That a Time.")

The Weavers, a quartet that featured the harmonies of Seeger, Hays, Ronnie Gilbert and Fred Hellerman, achieved unprecedented commercial success for a folk music group during the early fifties, selling millions of records. Seeger reacted ambivalently to his sudden immersion in the mainstream, sometimes wearing red socks with his tuxedos.

The anti-Communist blacklist cut down the Weavers in the middle of their hit parade. Banned first from television and then from theaters and clubs, the group disbanded in 1952. They defied the blacklist to reunite in 1955, but Seeger left the group in 1957 to pay more attention to his solo career. The 1980 reunion was the first appearance together of the original Weavers lineup since the early 1960s. I’ve attended some joyful concerts in my life, but I’ve never seen an outpouring of love between audience and performers like that one.

These days I like getting my daughter, KC, into the same room with Pete Seeger whenever possible. My theory is that hanging around with incorruptible people is a character builder. KC’s first Pete Seeger concert was a 2007 benefit. Pete walked on stage that night after being introduced, and hundreds of people popped up to give him a standing ovation before he sang a note. I've been talking to KC (who was then 8) about that ovation in the months since, about how the audience was saying "Thank you for living your life the way that you have, and for making the choices that you did." I’ve suggested to her that getting an ovation like that is better than being rich, since you can't buy it. What better reward is there for a teacher?

Pete Seeger’s voice isn’t what it used to be, but he still does a few songs, leads some singalongs, tells a few stories, visits with the folks. That night he played some songs that KC knows, including "This Land Is Your Land," which she sang along with delightedly. Someone requested "Old Dan Tucker," and he said, "You've been listening to Bruce Springsteen!" before he played it with a handful of extra verses that nobody but he and a few music historians know. He also led a singalong of "Somewhere Over the Rainbow," ending by insisting that the audience add two words to the last line even though he said that Yip Harburg, the song’s author, would have objected: "If birds fly over the rainbow, then why oh why can't you and I?" The change made my heart swell.

I've also been talking to KC about Pete Seeger's different causes. The soundtrack to our discussions—which have been mostly in the car, where she has less to distract her—has been his songs. She frequently requests Pete Seeger music now, especially the Weavers and the older stuff. She likes some of his antiwar music too, especially the controversial classic “Waist Deep in the Big Muddy.”

“Big Muddy” tells a story about a platoon during World War II whose obstinate captain ignores advice and leads them to the brink of disaster. The song’s transparently scabrous commentary on President Lyndon Johnson’s conduct of the Vietnam War led CBS to censor the song when Seeger first taped it as part of "The Smothers Brothers Comedy Hour" in 1967. But the resulting protests — including Seeger’s own warning that “the public should know that their airwaves are censored for ideas as well as sex” — led the network to back down and invite Seeger to sing “Big Muddy” on the show a second time. This time it was broadcast. The song sadly retains its topical resonance. Seeger rerecorded it last year for his most recent album, At 89, to protest the war in Iraq.

Union activism is the subject of most of KC’s favorite Pete Seeger songs. KC knows about unions in a general sense from hearing her mother, a union lawyer, talk about her work, but she's getting a full vocabulary now, since I'll hit the pause button to explain what a scab is, or how picket lines got their name.

The result is that my daughter has become the oddest of birds: a nine-year-old Old Time Leftist. She sings "Which Side Are You On?" and "Solidarity Forever" as though she were marching herself. She loves "Talkin' Union" ("You can always tell a stool by the yaller streak runnin' down his back") and "Union Maid" ("Oh, you can't scare me, I'm stickin' to the union!"). I feel a strange mixture of pride and amusement when I hear KC singing those songs. Her labor choruses have made my mother very happy, for she got her politics from Pete Seeger. She grew up in a household where no one talked about such things, and when she started at Brooklyn College in the late forties, she attended Pete Seeger concerts on campus, where she learned from him. I grew up on Pete Seeger, Woody Guthrie, and Weavers records, and my extended family—from grandparents to grandchildren—will be attending Seeger’s coming birthday concert.

***

But there’s an issue that Pete Seeger missed: the women’s liberation movement of the 1960s and afterwards. One of the great songs Seeger popularized, for example, “Little Boxes,” indicts conformity, but only for men. “The boys go into business, and marry and raise a family,” goes the song. These men play golf and “drink their martini dry,” but the women in their lives are nowhere heard from. At a time when Betty Friedan was leading a charge against a different kind of barricade, Pete Seeger continued to sing about a default person who was always male, attended by an invisible female helpmeet. Some of those lyrics make me wince today — and when my daughter is around, they also make me reach for the pause button to explain.

Pete Seeger had a monumentally atypical career, but the way that he pursued it was typical of the men of his time. When he wanted to escape the restrictions of the blacklist by singing his way around the world, his wife Toshi dutifully pulled up stakes with their children, and accompanied him on a one-man peace, love, and understanding tour that encompassed over 30 countries. When Seeger wanted to bypass the television networks’ blacklist of him, he devised a PBS program called "Pete Seeger’s Rainbow Quest," which featured Pete and the guest of the week sitting around a kitchen table informally playing and talking about music. The show ran for 39 episodes in 1967. (Many of them are available on DVD, and are well worth checking out.) Toshi Seeger produced "Pete Seeger’s Rainbow Quest," but she’s listed in the credits as “Chief Cook and Bottle Washer.”

In the sunset of his epic life, Pete Seeger now proclaims the ways that his wife made that life possible. He touts Toshi’s contributions to their work, and repents the burden that he laid on her, a burden that she lovingly bore. Pete Seeger’s commitment to Toshi Seeger’s work underscores in a different way the credo by which he has lived his life: what he calls “participation.” Thus does Pete Seeger nourish his unshakable commitment to the communities around him.

Like the best teachers, he has always understood the value of learning — for himself as well as his students.

By Leonard Cassuto

Leonard Cassuto is a professor of English at Fordham University. His book Hard-Boiled Sentimentality: The Secret History of American Crime Stories, has been nominated for a 2009 Edgar Award by the Mystery Writers of America.

Analyze This

On the evening of June 10, 2007, several million people watching "The Sopranos" experienced a moment of simultaneous bewilderment. During the final scene of the final episode of its final season (a montage in which the tension built up steadily from cut to cut) the screen went blank -- and the soundtrack, consisting of the Journey power ballad "Don't Stop Believin'," had gone dead. The impending narrative climax never arrived. But neither was this an anticlimax, exactly; it did not seem to be related at all to the events taking place onscreen. Many viewers probably assumed it was a technical glitch.

Once the credits began rolling, any anger at the service provider was usually redirected to the program’s creators. The willing suspension of disbelief had been not so much broken as violated. The blank screen could be (and was) interpreted variously: as an indication that Tony Soprano was blown away by an assassin, perhaps, or as a gesture of hostility by David Chase (towards the audience, or HBO, or even the notion of closure itself).

But analysis was no payoff. The end remained frustrating. "The Sopranos" offered its viewers an aporia they couldn’t refuse.

As I write this column (scheduled to appear two years to the day after that final episode aired) the bibliography of academic commentary on "The Sopranos" runs to more than half a dozen volumes. That's not counting all the stray conference papers and scattered volumes with chapters on it, let alone the knickknack books offering Tony Soprano's management secrets.

Life being as short as it is, I have not kept up with the literature, but did recently pause in the middle of watching the third season to read the latest book-length commentary, The Sopranos by Dana Polan, a professor of cinema studies at New York University, published in March by Duke University Press.

His departmental affiliation notwithstanding, Polan’s analysis challenges the idea that "The Sopranos" was much more akin to film than to television programming. This is certainly one of the more familiar tropes in critical discussion of "The Sopranos," whether around the water cooler or in more formal settings. An associated line of thought identifies it with a tradition of “quality TV” -- as when a critic in The New York Times suggested that the series “is strangely like 'Brideshead Revisited,' 'The Singing Detective,' and 'I, Claudius.' ”

(The fact that Tony Soprano’s mother is named Livia certainly did seem like a nod to the latter show’s monstrous matriarch. At least one classicist has argued that the real-life Livia Drusilla of the first century was the victim of an unscrupulous smear campaign. I mention this for the convenience of anyone who wants to attempt a revisionist interpretation of Livia Soprano’s role. Good luck with that.)

Rather than going along with the familiar judgment that "The Sopranos" stood above and apart from the usual run of mass-cultural fare, Polan reads it as continuous with both the traditions of genre television and the hierarchy-scrambling protocols of the postmodern condition.

The thugs in Tony Soprano’s crew are familiar, obsessed even, with the Godfather films, and cite them constantly – a bit of intertextuality that left the audience constantly scrambling to find and extrapolate on allusions within the unfolding story. But Polan maintains that the show was structured at least as much by parallels to the old-fashioned situation comedy. Or rather, to the especially ironic variation on sitcom themes found in one program in particular, "The Simpsons."

“In this revised form,” writes Polan, “the job front is a complicated site lorded over by capricious and all-powerful bosses; the sons are slackers who would prefer to get in trouble or watch television than succeed at school; the daughter is a liberal and intellectually ambitious child who is dismayed by her father’s déclassé way of life and political incorrectness but who deep down loves him and looks for moments of conciliation; the wife is a homemaker who often searches for something meaningful to her existence and frequently tries to bring cultural or moral enrichment into the home; the bar is a male sanctuary; and there is an overall tone of postmodern fascination with citation and a general sense of today’s life as lived out in an immersion in popular culture and with behaviors frequently modeled on that culture.”

Someone posting at the New York Times blog Paper Cuts a few months ago took the entirely predictable route of charging the book with “taking all the fun out of our favorite unstable texts” by smearing jargon on slices of the show.

But surely I cannot be the only reader who will respond with a kind of wistful nostalgia to Polan’s recurrent, urgent insistence that postmodern irony is the organizing principle of "The Sopranos."

The show “frustrates easy judgment,” he writes, “by incorporating a multiplicity of critical positions into the text so that it becomes unclear to what extent there is one overall moral or thematic attitude that governs the work.”

Man, that really takes me back. While "The Sopranos" itself premiered in 1999, this interpretation has something very 1989-ish about it.... The Berlin Wall was in ruins, and so were the metanarratives. Joe Isuzu was introducing a new generation to the liar's paradox. And it seemed like if you could just make your irony sufficiently ironic, brute contingency would never touch you. Those were "good" times.

Yet formally self-conscious and deliberately ambiguous though it tended to be, "The Sopranos" was by no means so completely decentered in its “overall moral or thematic attitude” as all that. On the contrary, it seems to me to have been very definitely grounded in what might be called (for want of any better phrase) a deeply pessimistic Freudian moral sensibility.

That label may sound almost oxymoronic to most people. We tend to think of Freud’s work as a negation of moralism: an attempt to liberate the individual from the excessive demands of the social order. But his view of the world was a far cry from that of the therapeutic culture that took shape in his wake. He was skeptical about how much insight most patients could ever achieve -- let alone the benefits following from the effort. The mass of humanity, Freud said, was “riffraff.” The best the analyst could hope for was to cure the client of enough “neurotic misery” to be able to deal with “ordinary human unhappiness.”

A regular consumer of new therapeutic commodities like Tony’s sister Janice Soprano may expect to get some profound and satisfying self-transformation for her money. But the original psychoanalytic perspective was far more dubious. Freud also had misgivings about how his work would be received in the United States. While approaching by ship in 1909 (this year marks the centennial of his lectures at Clark University), Freud took exception to Jung’s remark that they were bringing enlightenment to the New World. No, said Freud, their ship was delivering the plague.

Indeed, someone like Tony Soprano entering treatment would have been one of the old doctor’s worst nightmares about the fate of his work. The question of Dr. Melfi’s willingness to continue treating Tony (not simply the danger this presents to her, but the moral puzzle of what “improvement” would even mean in the case of a sociopath) runs throughout the series.

When Carmela Soprano decides to seek therapy, she is referred to an old immigrant analyst named Dr. Krakower who refuses to indulge her belief that Tony is fundamentally decent. This is, of course, something the viewers, too, have been encouraged to believe from time to time -- in spite of seeing it disproved in one brutal encounter after another.

“Many patients want to be excused for their current predicament,” says Dr. Krakower, “because of events that occurred in their childhood. That's what psychiatry has become in America. Visit any shopping mall or ethnic pride parade, and witness the results.” He then refuses to accept payment from Carmela, or to continue treatment, until she breaks with Tony: “I'm not charging you because I won't take blood money, and you can't, either. One thing you can never say is that you haven't been told.”

Dr. Krakower then disappears from the show. A present absence, so to speak. We, the viewers, have by that point had numerous reminders that we are deriving vicarious pleasure from seeing how Tony and his crew earn the blood money that Dr. Krakower won't touch. We have been given a very clear indication of the difference between complicity and some version of the Freudian moral stance.

The deep pessimism of that outlook comes through time and again as we see how powerful are the psychic undercurrents within the family. Far from it being “unclear to what extent there is one overall moral or thematic attitude that governs the work,” we are on a terrain of almost Victorian naturalism, in which rare moments of insight are no match for the blind play of urges that define each character.

Take, again, the example of New Age gangster moll Janice Soprano. In his book, Polan notes that she “keeps hooking up with the dysfunctional and violent heads of Mafia crews within Tony’s jurisdiction.” In spite of everything, she never learns from her mistakes.

Polan treats this as an example of “the amnesia plot” – a sly, pomo-ironic wink, perhaps, at all those times on "Gilligan’s Island" when somebody got hit on the head with a coconut.

But surely some other interpretation is possible. Outside the play of televisual signifiers, there are people who, in one crucial area or other of their lives, never learn a damned thing – or if they do, it still makes no difference, because they make the same mistakes each time a fresh opportunity presents itself. This is, perhaps, the essence of Freud’s distinction between neurotic misery and normal unhappiness.

Not that the old misogynist necessarily gives us the key to understanding Janice Soprano. But her behavior, cycling through its compulsions in spite of various therapists and gurus, is consistent with Freud’s grimmer estimates of human nature.

The virtual impossibility of changing one’s life (even when staying alive depends on it) was also the lesson of the gay mobster Vito Spatafore’s trajectory during the final season. Having fled both the closet and his murderously homophobic peers, Vito has every reason to settle down to an idyllic life in New Hampshire, where he has both a low-carbohydrate diet and a studly fireman boyfriend.

But Vito feels compelled to return to New Jersey and his old way of life, with predictable results. It all plays out like something inspired by Beyond the Pleasure Principle, in which Freud’s speculations on the repetition compulsion lead him to the concept of thanatos, the death drive.

When the screen went blank two years ago, it was, among other things, a disruption of our daydream-like engrossment in the world of the Sopranos. It was a sharp, even brutal reminder that the viewer had made an investment in Tony's life. The audience was left frustrated: we wanted him to either escape the consequences of his actions or get killed. Neither motive is exactly creditable, but daydreams often manifest truths we'd rather disavow.

Polan’s book is often insightful about the visual dimension of The Sopranos, if a bit reductive about treating its self-consciousness as generically postmodern. The program’s long shadow, he writes, “tells us something serious about the workings of popular culture in the media economies of today. Irony sells, and that matters.”

We all make different meanings out of the raw materials provided by any given cultural artifact – so in the spirit of hermeneutic laissez faire, I won’t quibble. But the realization that "irony sells" does not exhaust the show's meaning. It seems, rather, like something one of the brighter gangsters might say.

For this viewer's part, at least, the lesson of "The Sopranos" is rather different: Life is over soon enough, and it is not full of second chances – even though we tend to expect them. (We often prove really good at kidding ourselves about how many chances there are.) Be as ironic about life as you want; it doesn’t help. You end up just as dead.

By Scott McLemee

The Plug-In Syllabus

Whether or not it still makes sense to call PBS “educational television” (some of us are still bitter about “Yanni at the Acropolis”), there was once a time when didacticism was indeed its mandate. And for my part, it’s impossible not to think of KERA, the network’s affiliate in Dallas, as an alma mater of sorts.

KERA had the distinction of being the station that introduced American viewers to "Monty Python’s Flying Circus." In a more serious vein of absurdism (but still with some humor) it also broadcast “Waiting for Godot” in the late 1970s, which sent my teenage existentialism up to a new plateau. It also used to show, quite frequently in fact, "The Naked Civil Servant," which can’t have met with approval from the Southern Baptist Convention.

I remember watching a documentary about Salvador Dali at least two or three times – mind suitably boggled by its clips from Un Chien Andalou. And late at night on the weekends there was a program called "One Star Theater," a home for low-budget movies that were surrealist by default. In one of them, as I recall, the human survivors of some disaster were attacked by giant shrews, played by dogs dressed in shrew costumes. (Even calling them “costumes” may be an overstatement.)

Exposure to Samuel Beckett, art-appreciation documentaries, "Masterpiece Theatre," and grade Z film gave me the rudiments of an aesthetic education. And a good thing, too, because nobody in the local school system would have used the expression “aesthetic education,” or considered it worth offering.

But my TV curriculum was broader still. There were dueling series on economics hosted by Milton Friedman and John Kenneth Galbraith. I seem to recall a program where Henry Steele Commager talked about Alexis de Tocqueville’s Democracy in America at some length. And on each episode of a series called "Connections," the wry host, James Burke, covered the interaction of technology and culture by tracing improbable chains of cause and effect down the course of four or five centuries.

There was Dick Cavett’s program, which has migrated from network to network over the decades but called PBS home from 1977 to 1982. On it, Allen Ginsberg answered questions (sort of) and tried to sing (this was just bearable), and Truman Capote mumbled through the barbiturates. Susan Sontag stopped by to radiate the dark glamor that lets you get away with anything. Other guests talked about their films and books, and gave a glimpse of whatever serious adults in New York were serious about, back then. It was always a revelation.

Meanwhile there was "Firing Line," where William F. Buckley made conservative arguments against his guests without yelling at them. Evidently his approach was too subtle even by the standards of the day. One of my classmates was sure that Buckley must be a liberal because of the way he talked – the accent, the polysyllables, the sneer. (Not to mention the way his tongue darted out from time to time, like that of a Komodo dragon about to devour a goat.) I explained that Buckley was in fact an ardent supporter of not-yet-president Ronald Reagan. My friend decided that he would try "Firing Line" again.

Politics aside, the show was good for the vocabulary. I probably owed my National Merit Scholarship to William F. Buckley.

KERA's programming tended (apart from "One Star Theater") to be earnestly and even aggressively middlebrow in spirit. Just what happened to that spirit over the next few years is a puzzling question, and can't be divorced from the issue of what happened to the cultural apparatus at large.

Many of the changes were structural, which is another way of saying that they involved money. It was not just that funding for public broadcasting was always being trimmed and challenged. At the same time, new television channels were created by the scores and then the hundreds. Some of the fare that had distinguished educational broadcasting (pop history, talking-heads shows, book chat) was now found elsewhere, spread broadly throughout the cable universe. Which in some ways meant more thinly: the audience dispersed across the dial, the critical mass of nerd concentration harder to reach.

At the same time, PBS itself had -- in the interest of ratings and continuing support -- to take on more and more programming with no didactic intent at all: soporific smooth jazz, antiques shows, concerts in which Pavarotti and Sting joined forces, etc.

Now, to be honest, I was not paying attention while most of these changes were taking place. Educational broadcasting had done its work only too well. I spent the 1980s reading Husserl and whatnot. When the impulse to watch TV kicked in, it involved a craving for something to cool down the brain -- including late-night reruns of shows my teenage self would have scorned with all due high-mindedness.

One of my professors had commented in passing that “Love Boat” was an example of Bakhtinian carnival, albeit in debased form. This sounded plausible. By day, I thought about the epoché’s suspension of judgment regarding the ontological status of the objects within experience. At two in the morning, I suspended all judgment whatsoever and went slumming on the tube. It was a license to consume garbage.

Which all just goes to show that bildung can take some strange turns. But over the past decade or so, I started to think back on my debt to PBS in its starchier and more strenuously uplifting era -- and started to miss it.

A program like “Meeting of the Minds” -- in which Steve Allen sat at a table with actors dressed up like famous artists, scientists, and leaders throughout history -- is something you outgrow, sooner or later. But at the right age, it gives you something that enables you to outgrow it. I’m not persuaded that even the most rigorous semiotic approach to Aaron Spelling’s oeuvre will yield anything like that benefit.

But now the point is moot, right? Public broadcasting has been on blood-thinning drugs for a long time. Even the categories of highbrow, middlebrow, and lowbrow are quaint. Ten years ago John Seabrook coined the term “nobrow” to describe the prevalent cultural mode; it still seems applicable. And you can’t go back to the old didacticism because nobody knows how to pull it off anymore.

So I assumed, anyway, until coming across an interesting development at the website of my alma mater, KERA.

It offered a podcast covering an exhibit at the student gallery of the University of Texas at Arlington devoted to Fluxus – an international avant garde cultural movement from the early 1960s, inspired by Dadaism, but also looking ahead to conceptual and performance art. Another recent segment there (this one available on video) features an interview with the director Philip Haas about his work on a series of film installations at the Kimbell Art Museum in Fort Worth.

At first blush, this seems like a flashback to what was available on KERA 30 years ago – solid and informative, enjoyable in its way but also downright educational. At the same time, the fact that it is available on the internet gives it a much broader potential audience. The story on “Fluxus in Texas,” for example, has drawn comments from Paris and Brussels.

So what’s happening? How is it that old-school cultural earnestness has been revived in a new-media environment? We’ll look into that with next week’s column.

Author: Scott McLemee

Brandeis Wasn't Wrong

In 2001 I donated my collection of prints by sculptors to the Block Museum of Art at Northwestern, though some of the prints still adorn the walls of my house and won’t get to Evanston until after my death. You can assume -- and you would be right -- that a collector of such works has been a lifetime “consumer” and supporter of the arts.

And yet, I said to myself “good for them” when reports first surfaced last winter that Brandeis intended to sell its collection of modern art, so that the considerable (envisaged) proceeds could support functions closer to the central goals of the university.

Understand that my print collection went to Northwestern because I had been dean of arts and sciences there for thirteen years. Understand also that, regarding this issue, my experience as dean trumps my love of art, and that is why I disagree with the views expressed in numerous articles in The New York Times and in one this month in Inside Higher Ed called "Avoiding the Next Brandeis."

I see a significant role for art museums on higher education campuses. But, with quite special exceptions, I see very little pedagogic function in colleges and universities owning works of art, especially given the current cost and value of so many of them. I’d rather those museums were reclassified as galleries. To be sure, the provisions of deeds of gift must be scrupulously observed; but assuming that to be the case, let institutions sell their works of art if the funds thus gained will better serve their educational mission.

The premise here is that the roles of museums on campuses are not like those of museums downtown, since the former exist to serve the specific needs and interests of a campus’s students and faculty.

This month’s article in Inside Higher Ed quotes a task force formed by arts groups to figure out ways to avoid the next Brandeis as saying that campus museums should be regarded as “essential to the academic experience and to the entire educational enterprise.”

But why should they be so regarded when, by my admittedly not systematic observations, most of those museums do nothing or very little to deserve to be so regarded? As dean, I had to bludgeon the Block Gallery to present an exhibit of the work of Northwestern’s prize painters, William Conger, Ed Paschke and James Valerio. (This was before the Gallery was transformed into a Museum and long before its current director, David Robertson, came to Northwestern.) Art history departments are mostly held at arm's length by campus museums, which prize their (inappropriate) autonomy. Mostly, the museums don’t even know how to communicate with other than art faculty on campus.

It is excellent, therefore, that this cluster of issues is being looked at. In my view, however, the goals sought by the task force for campus art museums are not likely to be realized by means of works of art owned by museums, but rather by means of exhibits brought in and often locally curated for specific pedagogic purposes.

Members of the task force, make sure, therefore, that you are not just talking to yourselves. You are looking for ways to relate A to B; there must thus be strong representation from both poles. As announced, the organizations participating in the task force are mostly from Category A: the art museum community.

I strongly recommend that it also include not only representation from the art history and studio art departments, but also knowledgeable people who have thoughts about how to involve art museums in educating students who are not primarily concerned with the arts. Indeed, given the way in which so many campus museums lead existences so separate from their campus surroundings, it might even be necessary to initiate reflection about their possible wider functions. The task force might want to consider forming a committee consisting of a couple of department chairpersons, a couple of deans or associate deans, and perhaps some interested students, assigning them the task of reporting to the museum powers-that-be how those museums might serve a broad campus constituency.

Accordingly, if the just-formed task force keeps its eye on the ball (as I see it), that Brandeis bomb will have very positive, if unintended, consequences.

Author: Rudolph H. Weingartner

Rudolph H. Weingartner is former dean of arts and sciences at Northwestern University.

The Museum at the Heart of the Academy

When I first learned last winter that the Brandeis University trustees had voted to sell the collections of the Rose Art Museum and close the museum, my reactions were many: concern for Brandeis students who were losing an important learning tool; sadness that a great university was breaking trust with many benefactors; annoyance that the museum industry would be yet again living the trauma of defending our collections as other than semi-liquid assets.

To these were added a suspicion that somehow the Rose must have failed in its campus-wide engagement and in its outreach to key campus constituencies (including its trustees), or those very trustees would never have felt they could make that particular decision, no matter how great the budget gap facing them. Even as I recognized the right of any university to shutter any program no longer deemed sufficiently important, I shuddered at such a reactive decision.

Nowhere in my response did I consider the “good for them” proclamation made by Rudolph Weingartner in his Views column of October 23 for Inside Higher Ed, arguing against both the pedagogic value of owning works of art and the effectiveness of university museums generally. Most troublingly, responding to the view of a panel of experts that university museums should be regarded as “essential to the academic experience," Dean Weingartner observes, “by my admittedly not systematic observations, most of those museums do nothing or very little to deserve to be so regarded…. Mostly, the museums don’t even know how to communicate with other than art faculty on campus.”

Drawing such conclusions -- and taking a kind of pleasure in the demise of a fine museum -- on the basis of random evidence seems not to represent the rigors of academic policy making at its best.

More dangerously, this view fails to note either the sheer range and variety of campus museums in the United States or the extent to which many have worked mightily in recent decades to make themselves central to their parent institutions. Long gone are the days when most university museums could be seen as, at best, the laboratory addendum to a department of art or art history. Seeking not merely (although importantly) to shape future art historians and museum professionals, the nation’s best university museums have long been engaged in the practice of fostering critical thinking and visual literacy, the understanding of times and cultures dramatically distinct from our own, the awareness of a common humanity, and thus, ultimately, the shaping of good citizenship.

Here at Princeton University, we have long crossed boundaries to partner our museum with disciplines and departments from the humanities to creative writing to architecture to civil engineering. The Yale Center for British Art routinely connects with fields ranging from natural history to cultural studies; its exhibition this year on the impact of Darwin’s theory of evolution on subsequent creative practice was a model for cutting-edge investigation of value to us all.

The Wolfsonian Museum at Florida International University offered an almost shockingly timely exhibition looking at the art of propaganda during last year’s presidential campaign. The new wing that opened this year at the University of Michigan Museum of Art -- which I led until recently -- was designed to architecturally embody and make possible a commitment to deep campus-wide engagement, providing a second home for programming in performance, creative writing, film, and the humanistic disciplines generally.

And many universities increasingly use their art museums in medical curriculums, having discovered that sustained close looking makes their doctors-in-training better diagnosticians. From Dartmouth, to Emory, to Wisconsin, to UCLA, great university museums have shown themselves deeply capable of being essential to the lives of their universities, even as they also often function as enormously beneficial gateways to those universities for the general public.

The argument that academic museums can do these things is no mere abstraction. They are doing these things, and are increasingly recognized as playing an essential role in a time of bottom-line-driven programming at many of even our greatest civic museums. With less at stake in the battle for attendance, the university museum can often take on difficult projects whose popularity cannot be assured, advancing the cause of new knowledge presented in accessible ways that yet seek to avoid pandering or the much-dreaded “dumbing down.” Many of the first thematic exhibitions -- sometimes operating in the sphere of a social history of art, the so-called “new art history” -- took place in our university museums. Increasingly, and happily, the special role of the university museum is recognized by the media: Writing in The New York Times this year, the art critic Holland Cotter observed “The august public museum gave us fabulousness. The tucked away university gallery gave us life: organic, intimate and as fresh as news.”

And why do we university museums so annoyingly feel the need to collect artworks, creating the inevitable drain on resources caused by those pesky stewardship requirements? I offer in answer a fundamental article of faith, that even in the digital age, the sustained engagement with original works of art necessary for teaching, research, and layered learning would be difficult if not impossible if we ceased to be collecting institutions and instead taught only from objects temporarily made available for exhibition.

In the way that great texts live in our libraries, available for revisiting and sustained scholarly investigation, the works of art in our museums offer the possibility of deep critical engagement, close looking, and technical analysis -- made all the deeper when brought together as collections in which dialogues arise through the conversation of objects with each other and with their scholarly interlocutors. Surely a key role of the academy -- the advancement of new knowledge and the challenging of past knowledge -- is that fruit of curatorial, faculty, and student research made possible by the sustained presence of great works of art, whose survival for the future is also thus (and not incidentally) guaranteed.

Like libraries that often also find themselves embattled in times of budget cuts (since typically neither museums nor libraries directly generate tuition streams), great university art museums are a “public good,” offering value and possibility to the whole of our university communities as well as to users from outside the walls of the ivory tower. That all university museums do not achieve this centrality of purpose -- often, I suspect, for lack of adequate resourcing by their parent institutions in the perpetual fight against the perception that art represents a “luxury” in the logo- and data-centric university -- is to be regretted. Without question much work remains to be done to make our museums central to the academic experience.

But just as any academic department desires a certain autonomy to define its foci and particular strengths within the university curriculum, no academic museum should be “bludgeoned” into showing the work of particular artists or serving as the handmaiden of narrow administrative modishness. The academic model has never, thank heavens, been one of pure utility, even as we seek to be responsible, effective, and impactful.

For me, the lesson of the Brandeis debacle is the reminder that the fight for the central role of our museums is not won. Contrary to Dean Weingartner’s views, however, that fight has long and often successfully been underway.

Author: James Christen Steward

James Christen Steward is director of the Princeton University Art Museum.

George Clooney Meets Max Weber

Spoiler alert: Max Weber’s life is an open book, thanks in part to Joachim Radkau’s wonderful new 700-page biography, so nothing to spoil there. But this essay does reveal the ending of Jason Reitman’s new film.

Thoughtful, intellectual movies are produced each year in the United States and abroad -- open texts rich with meaning, understood by critics or not. Some writers and directors begin with a premise, others stumble into one, and still others capture the zeitgeist and hit a chord, even if we cannot articulate precisely what it is. For me, not much of a moviegoer and certainly not a film critic, Up in the Air, the highly acclaimed new movie directed by Jason Reitman (he also directed Juno) and written with Sheldon Turner, resonates powerfully with some challenging conversations I have had with students of late.

There are no ground-breaking paradigms about human nature introduced in Up in the Air, just as we’ve not seen many of those in academic circles in recent times. But by trying to keep us engaged, Reitman manages to come face to face with the very best of 19th and early 20th century philosophy and sociology. It was during this period that the great theorists of industrialization and technology emerged with force – Marx of course, then Max Weber, Ferdinand Tönnies, and Emile Durkheim among others – exploring the relationships among rationality, morality, community, and the acceleration of technological change in all aspects of life.

By the end of the 19th century, the horrors of progress began to take hold in the sociological imagination, a theme that persisted into the 20th century through Foucault and his contemporaries. There are the cheerleaders for sure: Marshall McLuhan – brilliant as he was – could see very little dark side to the explosion of electronic media, for instance. And it is difficult to locate the downsides of advances in medicine or public health technologies without undue stretching. But Reitman is some sort of original and popular voice, struggling anew with the complex interface between rapidly-evolving technology (communication, transportation, finance) and human relations. It’s not a bad addition to a syllabus.

Let's start with Weber, the wildly abbreviated version: With regard to technology, progress, and capitalism, Weber detected a linear trend toward increasing rationalization, systematization, and routinization. In all aspects of life -- from the nature of organizations to the structure of religions -- we seek efficiency and system, in order to gain profit, leisure time, and fulfillment. This drive toward increasing organization, in all its manifestations, is too powerful to fight, given its clear appeal and "scientific" grounding.

Yet, Weber notes, all of this seeming success ultimately makes us less human: With increasing rationalization, we lose our collective spirit. He said, famously, that "each man becomes a little cog in the machine and, aware of this, his one preoccupation is whether he can become a bigger cog," a set of insights that drove him to despair. There are, Weber argued, occasional charismatic leaders that shake up our tidy world of rational calculation. But charismatic movements and people succumb to the inevitability of rationalization, crushed by a culture driven to success, results, and materialism. With no way out, Weber posits, we shall find ourselves in an "iron cage" of rationality, and the human heart will be impossible to locate.

To the film: Ryan Bingham (Clooney) is a consultant who spends most of his life on planes and in airports, traveling the nation as a professional terminator. He is with a firm hired by companies to lay off workers face-to-face (so the employer doesn’t have to), hand them a packet with their severance details, and deliver banal bits of inspiration intended to contain any extreme emotional reaction on the part of the unlucky employee. It’s a perfect Weberian job: Bingham produces nothing truly meaningful, keeps the wheels of capitalism spinning, has no personal relations muddying up his work, and makes good money for the firm.

This all goes well for Bingham; he has no interest in settling down (at least at the start of the film), and being in the air keeps his adrenaline pumping. But his firm has even higher ambitions to rationalize its business model, and, with the help of a naïve 20-something M.B.A. type, it moves to a system where professionals like Bingham can fire people over videoconference, hence saving millions in travel costs. At the end of the film, due to some unhappy results, the new system is pulled back for more study, and Bingham and colleagues get back on the road to once again fire people in person, which has more heart than the videoconference firing.

A victory against the forces of rationalization? After all, when Bingham fires people in person, there is something of a human touch. But the film undercuts that thesis as well with another character, Alex Goran (played by Vera Farmiga), a professional woman who is also a road warrior. Goran is attractive and warm, but at base is even more mercenary than Bingham: She too lives in the air, has impressive frequent-flyer tallies, and is in all the premium classes one can aspire to (car rental, hotel, airline, and so forth).

Bingham is impressed, having finally met his female match (she quips: “I’m you with a vagina”), finds her in hotels across the country for sex appointments, falls in love with her, finds his heart, and is badly jilted in the end (Goran is married, although she had never revealed this to Bingham). And while he may be badly hurt, she is sincerely puzzled that he failed to understand their unspoken contract: Why, he was part of her rationalized system – husband and family in Chicago, fulfilling career, and Bingham for pleasure on the road.

One of the nice twists of the film is that the female character is a more highly evolved Weberian being than are the men: She has a seemingly happy life – she is content, not alienated or complaining – while Bingham struggles with the rationalization of love, the one aspect of human interaction he apparently thought could not succumb to a culture of calculation. He wasn’t paying for the sex after all; he actually liked her.

While Goran’s character -- a Weberian monster of sorts -- might worry us, she underscores a central problem with the rationalization thesis in an age of social networking, texting, and air travel. Weber and his followers did not foresee the humanization of technology that we see now, and I too have been slow to come to this. For years I taught my students about Weber’s iron cage; they understood it and they appreciated it. They understood how the ATM – for all its efficiencies – lessens human interaction (you’ll not meet anyone in a long bank line these days). They understood what is lost when poll results stand in for qualitative opinion expression, or how a phone call is essentially less human than a face-to-face interaction. The tension between progress and human connectedness – that it was a tradeoff, in fact – seemed to make good sense.

But I struggle to hold up my side of the argument these days. Students insist that their connectedness with friends and strangers, through communication technology, is real, fulfilling, fun, sincere, and intimate (e.g., “sexting”). Weber and I are dinosaurs who have no room in our theorizing for the highly social, extraordinarily human interaction that the Internet has enabled. Technology itself, the force we feared would crush the human spirit, turns out to enhance it.

Or so our students argue. We go round and round on this. And perhaps even those of us who have wrapped much of our intellectual existence around theorists like Weber will see the light, and treat those theories as important, but entirely historically-bound. Up in the Air passes no judgment on Goran’s lifestyle, and in fact, she may be the Übermensch. She controls her destiny and she directs the rationalization of her emotional life. While world-weary (a lot of airport bars, a lot of men), she has found her happiness, while Bingham remains a pathetic, troubled amateur.

Up in the Air encourages a revision of some Weberian views, but also takes on some of our mid-20th century sociological giants as well. Robert Merton, working in the tradition of Tönnies and Weber, argued that the dominant media of his day – radio – had produced what he called pseudo-Gemeinschaft or the "feigning of personal concern and intimacy with others in order to manipulate them the better," for profit, typically. Whether it’s selling war bonds (he wrote on Kate Smith’s campaign) or the perpetual fake-friendly "it’s a pleasure to serve you" we hear constantly, Merton was bothered by the niceties layered atop brute business motive. Is it their pleasure or not? Do they sincerely like to serve us, or do they get points for it on a performance review?

In Up in the Air, our protagonist – thanks to his frequent flying – gets the special "personal" treatment from airline professionals and others. He knows it’s fake, but it is still a pleasurable and valued aspect of daily life. When I raise the old Merton argument with my students these days, they are not bothered by it at all, and Reitman sees the niceties much the same way -- as the state of nature in contemporary capitalism, not a repulsive, slavish persona designed by corporate headquarters. When Bingham finally gets his reward for traveling an extraordinary number of miles on the airline – a personal meeting with the top pilot – he is at a loss for words, after imagining the moment a hundred times in his fantasies. Even when we’ve survived the countless niceties and earned the real human touch, it’s not that great after all, another puzzle for our backward hero.

It is far too generous to say that McLuhan was right, that technology has made us more human, brought us together in a global village of understanding, encouraged tolerance of difference, and connected us to our essential, spiritual, primitive and fuller selves. He slips and slides, preaches a bizarre narrative of human history, and ignores social structure and power dynamics as much as possible. But he did, and profoundly so, foresee something of the social networking of today -- how light might shine through what looks like a mechanical, calculating, and cold world of technological progress. Up in the Air sides with McLuhan and with my students: The film gives one answer to a depressed Weber, but my generation -- at least -- feels empty at the end, as we go back up in the air with Clooney.

Author: Susan Herbst

Susan Herbst is chief academic officer of the University System of Georgia and professor of public policy at Georgia Tech.

Andy Warhol, Then and Now

In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially the limits of the brain's plasticity. The ability to absorb new impressions is not limitless.

But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.

But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.

White stood on the fault line:

"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composer, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."

This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.

White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)

It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”

It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”

Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.

This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol just put it into a new context (the art gallery) where people would otherwise pretend it did not exist.

“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”

It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.

Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”

The article opens with an account of a recent showing of the 35-minute Warhol film “Blow Job” at another university. The titular action is all off-screen. Warhol's camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)

Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.

“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”

And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.

Author: Scott McLemee
