Arts

The Moviegoer

Whatever happened to cinephilia? Does it still exist? I mean, in particular, the devotion of otherwise bookish souls to the screen. (The big screen, that is, not the kind you are looking at now.) Do they still go to movies the way they once did? With anything like the passion, that is – the connoisseurship, the sheer appetite for seeing and comparing and discussing films?

I don’t think so. At least very few people that I know do. And certainly not in the way documented in Reborn (Farrar, Straus and Giroux), the recently published edition of Susan Sontag’s journals, which includes a few pages from a notebook listing the dozens of films the author attended over just three weeks in early 1961. An editorial comment provides more detail about Sontag’s record of her moviegoing that year: “On no occasion is there a break of more than four days between films seen; most often, SS notes having seen at least one, and not infrequently two or three per day.”

This was not just one writer’s personal quirk. It was clearly a generational phenomenon. In a memoir of his days as a student of philosophy at the Sorbonne in the late fifties and early sixties, the French political theorist Regis Debray describes how he and his friends would go from seminars to the cinema as often as their stipends allowed.

“We could afford to enjoy it several times a week,” he writes. “And that is not counting those crisis days when our satisfied and yet insatiable desire made us spend whole afternoons in its darkness. No sooner had we come out, scarcely had we left its embrace, our eyes still half-blind, than we would sit round a café table going over every detail.... Determinedly we discussed the montage, tracking shots, lighting, rhythms. There were directors, unknown to the wider public, whose names I have now forgotten, who let slip these passwords to the in-group of film enthusiasts. Are they still remembered, these names we went such distances to see? .... It may well be the case that our best and most sincere moments were those spent in front of the screen.”

Debray wrote this account of cinemania in the late spring of 1967, while imprisoned in Bolivia following his capture by the military. He had gone there on a mission to see Che Guevara. An actor bearing a striking resemblance to the young Debray appears in the second part of Steven Soderbergh’s Che, now in theaters.

That passage from his Prison Writings (published by Random House in the early 1970s and long out of print; some university press might want to look into this) came to mind on a recent weekday afternoon.

After a marathon course of reading for several days, I was sick of print, to say nothing of writing, and had snuck off to see Soderbergh’s film while it was still in the theater, on the assumption that it would lose something on the video screen. There was mild guilt: a feeling that, after all, I really ought to be doing some work. Debray ended up feeling a bit of guilt as well. Between trips to the cinema and arguing over concepts in Louis Althusser’s classroom, he found himself craving a more immediate sense of life – which was, in part, how he ended up in the jungles of Bolivia, and then in its prisons.

Be that as it may, there was something appealing about this recollection of his younger self, which he composed at the ripe old age of 26. The same spirit comes through in the early pages of Richard Brody's Everything Is Cinema: The Working Life of Jean-Luc Godard (Metropolitan Books), now a finalist for one of the National Book Critics Circle awards. Brody evokes the world of cinema clubs in Paris that Godard fell into after dropping out of school – from which there emerged a clique of Left Bank intellectuals (including Francois Truffaut, Claude Chabrol, and Eric Rohmer) who first wrote critical essays on film for small magazines and then began directing their own.

They got their education by way of mania – which was communicable: Debray and Sontag were examples of writers who caught it from the New Wave directors. Another would be the novelist, poet, and linguist Pier Paolo Pasolini, who also started making films in the early sixties.

It’s not clear who the contemporary equivalents would be. In the mid-1990s you heard a lot about how Quentin Tarantino had worked in a video store and immersed himself in the history of film in much the same way that the French directors had. But the resemblance is limited at best. Godard engaged in a sustained (if oblique) dialogue with literature and philosophy in his films -- while Tarantino seems to have acquired a formidable command of cinematic technique without ever having anything resembling a thought in his head. Apart, of course, from “violence is cool,” which doesn’t really count.

These stray musings come via my own reading and limited experience. They are impressions, nothing more – and I put them down in full awareness that others may know better. My own sense of cinephilia's decline may reflect the fact that all of the movie theaters in my neighborhood (there used to be six within about a 15-minute walk) have gone out of business over the past ten years.

But over the same period cable television, Netflix, and the Internet have made it easier to see films than ever before. It is not that hard to get access to even fairly obscure work now. Coming across descriptions of Godard’s pre-Breathless short films, I found that they were readily available via YouTube. And while Godard ended up committing a good deal of petty crime to fund those early exercises, few aspiring directors would need to do so now: the tools for moviemaking are readily available.

So have I just gotten trapped (imprisoned, like Debray in Bolivia) by secondhand nostalgia? It wouldn't be the first time. Is cinephilia actually alive and well? Is there an underground renaissance, an alternative scene of digital cine clubs that I’m just not hearing about? Are you framing shots to take your mind off grad school or the job market? It would be good to think so -- to imagine a new New Wave, about to break.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Pete Seeger, America's Teacher

The January concert at the Lincoln Memorial celebrating the inauguration of President Barack Obama offered many stirring moments, but perhaps its highlight was Pete Seeger leading a chorus of hundreds of thousands of people singing "This Land Is Your Land." This is where Americans expect to see Pete Seeger, raising his voice for change, even when it’s cold outside.

Seeger has been singing folk music for change for more than 70 years now, sometimes in the middle of storms, sometimes causing them. Defiantly leftist, pacifist, and for a decade or so, Communist, Seeger has embraced almost every major reformist cause of the 20th century. He’s sung and spoken out for organized labor, against McCarthyism, in support of Civil Rights, against the Vietnam War, and -- from the deck of the sloop Clearwater, a “ship of song” which he helped to build -- his voice put early wind into the sails of the environmental movement.

Now 89, Seeger has witnessed his own transformation into an icon. President Clinton bestowed the National Medal of the Arts on him in 1994, and The Library of Congress named him a “Living Legend” in 2000. In 2007 PBS released "Pete Seeger: The Power of Song" as part of its American Masters documentary series. Seeger’s 90th birthday concert in New York on May 3rd will be a star-studded affair featuring luminaries from Ani DiFranco to Bruce Springsteen.

One of the words you hear applied most often to Pete Seeger these days is “genius.” A 2005 album even proclaims him a “Genius of Folk.” But instead we should think of Pete Seeger as America’s teacher, an “inconvenient artist” (as Clinton put it) who taught the conflicts before other people got to them.

As the energetic teacher to a nation, Seeger has let his songs do the instruction. He’s the author of evergreens like “If I Had a Hammer” and “Turn, Turn, Turn,” and as a member of the Weavers he helped bring folk music into the commercial mainstream before he and the rest of the group were blacklisted during the Red-baiting fifties. But Pete Seeger is perhaps best known as a walking, talking American songbook whose encyclopedic contents are accompanied by his banjo or guitar—and by the voices of his audience. Springsteen, who in 2006 made a memorably vital album of songs called "The Seeger Sessions," says that, “Pete’s library is so vast that the entire history of the country is there.”

Like Benjamin Franklin before him, Pete Seeger has long sought to lead an exemplary life. Woody Guthrie, whose friendship with the teenaged Seeger in the late 1930s became one of the formative events in the young man’s artistic development, was amazed that Seeger didn’t drink, smoke, or chase women. Some of Seeger’s asceticism may have come from the New England culture he was brought up in, but it’s also clear that Seeger consciously chose to live a certain way, and that his principled choices inform his entire adulthood.

And so when the time came for Seeger to face the Communist-hunting House Un-American Activities Committee (HUAC) in 1955, he refused to discuss his politics or his associations. He didn’t plead the Fifth, though. Instead, he took the First. “Using the Fifth Amendment,” Seeger explained, “is in effect saying, ‘you have no right to ask me this question’; but using the First Amendment means in effect, ‘you have no right to ask any American such questions.' ” This courageous gesture resulted in a conviction for Contempt of Congress that kept Seeger suspended, a hair from jail, for nearly seven years before it was tossed out on a technicality in 1962.

***

Pete Seeger has been a teacher to three generations of my family. I'm in the middle one of the three, and my memorable Pete Seeger moments include the 1980 reunion of the Weavers at Carnegie Hall, a final appearance by the group shortly before one of its mainstays, Lee Hays, died. (That reunion is the subject of the excellent 1982 documentary "Wasn’t That a Time.")

The Weavers, a quartet that featured the harmonies of Seeger, Hays, Ronnie Gilbert and Fred Hellerman, achieved unprecedented commercial success for a folk music group during the early fifties, selling millions of records. Seeger reacted ambivalently to his sudden immersion in the mainstream, sometimes wearing red socks with his tuxedos.

The anti-Communist blacklist cut down the Weavers in the middle of their hit parade. Banned first from television and then from theaters and clubs, the group disbanded in 1952. They defied the blacklist to reunite in 1955, but Seeger left the group in 1957 to pay more attention to his solo career. The 1980 reunion was the first appearance together of the original Weavers lineup since the early 1960s. I’ve attended some joyful concerts in my life, but I’ve never seen an outpouring of love between audience and performers like that one.

These days I like getting my daughter, KC, into the same room with Pete Seeger whenever possible. My theory is that hanging around with incorruptible people is a character builder. KC’s first Pete Seeger concert was a 2007 benefit. Pete walked on stage that night after being introduced, and hundreds of people popped up to give him a standing ovation before he sang a note. I've been talking to KC (who was then 8) about that ovation in the months since, about how the audience was saying "Thank you for living your life the way that you have, and for making the choices that you did." I’ve suggested to her that getting an ovation like that is better than being rich, since you can't buy it. What better reward is there for a teacher?

Pete Seeger’s voice isn’t what it used to be, but he does a few songs, leads some singalongs, tells a few stories, and visits with the folks. That night he played some songs that KC knows, including "This Land Is Your Land," which she sang along with delightedly. Someone requested "Old Dan Tucker," and he said, "You've been listening to Bruce Springsteen!" before he played it with a handful of extra verses that nobody but he and a few music historians know. He also led a singalong of "Somewhere Over the Rainbow," ending by insisting that the audience add two words to the last line even though he said that Yip Harburg, the song’s author, would have objected: "If birds fly over the rainbow, then why oh why can't you and I?" The change made my heart swell.

I've also been talking to KC about Pete Seeger's different causes. The soundtrack to our discussions—which have been mostly in the car, where she has less to distract her—has been his songs. She frequently requests Pete Seeger music now, especially the Weavers and the older stuff. She likes some of his antiwar music too, especially the controversial classic “Waist Deep in the Big Muddy.”

“Big Muddy” tells a story about a platoon during World War II whose obstinate captain ignores advice and leads them to the brink of disaster. The song’s transparently scabrous commentary on President Lyndon Johnson’s conduct of the Vietnam War led CBS to censor the song when Seeger first taped it as part of "The Smothers Brothers Comedy Hour" in 1967. But the resulting protests — including Seeger’s own warning that “the public should know that their airwaves are censored for ideas as well as sex” — led the network to back down and invite Seeger to sing “Big Muddy” on the show a second time. This time it was broadcast. The song sadly retains its topical resonance. Seeger rerecorded it last year for his most recent album, At 89, to protest the war in Iraq.

Union activism is the subject of most of KC’s favorite Pete Seeger songs. KC knows about unions in a general sense from hearing her mother, a union lawyer, talk about her work, but she's getting a full vocabulary now, since I'll hit the pause button to explain what a scab is, or how picket lines got their name.

The result is that my daughter has become the oddest of birds: a nine-year-old Old Time Leftist. She sings "Which Side Are You On?" and "Solidarity Forever" as though she were marching herself. She loves "Talkin' Union" ("You can always tell a stool by the yaller streak runnin' down his back") and "Union Maid" ("Oh, you can't scare me, I'm stickin' to the union!"). I feel a strange mixture of pride and amusement when I hear KC singing those songs. Her labor choruses have made my mother very happy, for she got her politics from Pete Seeger. She grew up in a household where no one talked about such things, and when she started at Brooklyn College in the late forties, she attended Pete Seeger concerts on campus, where she learned from him. I grew up on Pete Seeger, Woody Guthrie, and Weavers records, and my extended family—from grandparents to grandchildren—will be attending Seeger’s coming birthday concert.

***

But there’s an issue that Pete Seeger missed: the women’s liberation movement of the 1960s and afterwards. One of the great songs Seeger popularized, for example, “Little Boxes,” indicts conformity, but only for men. “The boys go into business, and marry and raise a family,” goes the song. These men play golf and “drink their martini dry,” but the women in their lives are nowhere heard from. At a time when Betty Friedan was leading a charge against a different kind of barricade, Pete Seeger continued to sing about a default person who was always male, attended by an invisible female helpmeet. Some of those lyrics make me wince today — and when my daughter is around, they also make me reach for the pause button to explain.

Pete Seeger had a monumentally atypical career, but the way that he pursued it was typical of the men of his time. When he wanted to escape the restrictions of the blacklist by singing his way around the world, his wife Toshi dutifully pulled up stakes with their children, and accompanied him on a one-man peace, love, and understanding tour that encompassed over 30 countries. When Seeger wanted to bypass the television networks’ blacklist of him, he devised a television program called "Pete Seeger’s Rainbow Quest," which featured Pete and the guest of the week sitting around a kitchen table informally playing and talking about music. The show ran for 39 episodes in 1967. (Many of them are available on DVD, and are well worth checking out.) Toshi Seeger produced "Pete Seeger’s Rainbow Quest," but she’s listed in the credits as “Chief Cook and Bottle Washer.”

In the sunset of his epic life, Pete Seeger now proclaims the ways that his wife made that life possible. He touts Toshi’s contributions to their work, and repents the burden that he laid on her, a burden that she lovingly bore. Pete Seeger’s commitment to Toshi Seeger’s work underscores in a different way the credo by which he has lived his life: what he calls “participation.” Thus does Pete Seeger nourish his unshakable commitment to the communities around him.

Like the best teachers, he has always understood the value of learning — for himself as well as his students.

Author/s: 
Leonard Cassuto
Author's email: 
info@insidehighered.com

Leonard Cassuto is a professor of English at Fordham University. His book, Hard-Boiled Sentimentality: The Secret History of American Crime Stories, has been nominated for a 2009 Edgar Award by the Mystery Writers of America.

Analyze This

On the evening of June 10, 2007, several million people watching "The Sopranos" experienced a moment of simultaneous bewilderment. During the final scene of the final episode of its final season (a montage in which the tension built up steadily from cut to cut) the screen went blank -- and the soundtrack, consisting of the Journey power ballad "Don't Stop Believin'," went dead. The impending narrative climax never arrived. But neither was this an anticlimax, exactly; it did not seem to be related at all to the events taking place onscreen. Many viewers probably assumed it was a technical glitch.

Once the credits began rolling, any anger at the service provider was usually redirected to the program’s creators. The willing suspension of disbelief had been not so much broken as violated. The blank screen could be (and was) interpreted variously: as an indication that Tony Soprano was blown away by an assassin, perhaps, or as a gesture of hostility by David Chase (towards the audience, or HBO, or even the notion of closure itself).

But analysis was no payoff. The end remained frustrating. The Sopranos offered its viewers an aporia they couldn’t refuse.

As I write this column (scheduled to appear two years to the day after that final episode aired) the bibliography of academic commentary on "The Sopranos" runs to more than half a dozen volumes. That's not counting all the stray conference papers and scattered volumes with chapters on it, let alone the knickknack books offering Tony Soprano's management secrets.

Life being as short as it is, I have not kept up with the literature, but did recently pause in the middle of watching the third season to read the latest book-length commentary, The Sopranos by Dana Polan, a professor of cinema studies at New York University, published in March by Duke University Press.

His departmental affiliation notwithstanding, Polan’s analysis challenges the idea that "The Sopranos" was much more akin to film than to television programming. This is certainly one of the more familiar tropes in critical discussion of "The Sopranos," whether around the water cooler or in more formal settings. An associated line of thought identifies it with a tradition of “quality TV” -- as when a critic in The New York Times suggested that the series “is strangely like 'Brideshead Revisited,' 'The Singing Detective,' and 'I, Claudius.' ”

(The fact that Tony Soprano’s mother is named Livia certainly did seem like a nod to the latter show’s monstrous matriarch. At least one classicist has argued that the real-life Livia Drusilla of the first century was the victim of an unscrupulous smear campaign. I mention this for the convenience of anyone who wants to attempt a revisionist interpretation of Livia Soprano’s role. Good luck with that.)

Rather than going along with the familiar judgment that "The Sopranos" stood above and apart from the usual run of mass-cultural fare, Polan reads it as continuous with both the traditions of genre television and the hierarchy-scrambling protocols of the postmodern condition.

The thugs in Tony Soprano’s crew are familiar, obsessed even, with the Godfather films, and cite them constantly – a bit of intertextuality that kept the audience scrambling to find and extrapolate on allusions within the unfolding story. But Polan maintains that the show was structured at least as much by parallels to the old-fashioned situation comedy. Or rather, to the especially ironic variation on sitcom themes found in one program in particular, "The Simpsons."

“In this revised form,” writes Polan, “the job front is a complicated site lorded over by capricious and all-powerful bosses; the sons are slackers who would prefer to get in trouble or watch television than succeed at school; the daughter is a liberal and intellectually ambitious child who is dismayed by her father’s déclassé way of life and political incorrectness but who deep down loves him and looks for moments of conciliation; the wife is a homemaker who often searches for something meaningful to her existence and frequently tries to bring cultural or moral enrichment into the home; the bar is a male sanctuary; and there is an overall tone of postmodern fascination with citation and a general sense of today’s life as lived out in an immersion in popular culture and with behaviors frequently modeled on that culture.”

Someone posting at the New York Times blog Paper Cuts a few months ago took the entirely predictable route of charging the book with “taking all the fun out of our favorite unstable texts” by smearing jargon on slices of the show.

But surely I cannot be the only reader who will respond with a kind of wistful nostalgia to Polan’s recurrent, urgent insistence that postmodern irony is the organizing principle of "The Sopranos."

The show “frustrates easy judgment,” he writes, “by incorporating a multiplicity of critical positions into the text so that it becomes unclear to what extent there is one overall moral or thematic attitude that governs the work.”

Man, that really takes me back. While "The Sopranos" itself premiered in 1999, this interpretation has something very 1989-ish about it.... The Berlin Wall was in ruins, and so were the metanarratives. Joe Isuzu was introducing a new generation to the liar's paradox. And it seemed like if you could just make your irony sufficiently ironic, brute contingency would never touch you. Those were "good" times.

Yet formally self-conscious and deliberately ambiguous though it tended to be, "The Sopranos" was by no means so completely decentered in its “overall moral or thematic attitude” as all that. On the contrary, it seems to me to have been very definitely grounded in what might be called (for want of any better phrase) a deeply pessimistic Freudian moral sensibility.

That label may sound almost oxymoronic to most people. We tend to think of Freud’s work as a negation of moralism: an attempt to liberate the individual from the excessive demands of the social order. But his view of the world was a far cry from that of the therapeutic culture that took shape in his wake. He was skeptical about how much insight most patients could ever achieve -- let alone the benefits following from the effort. The mass of humanity, Freud said, was “riffraff.” The best the analyst could hope for was to cure the client of enough “neurotic misery” to be able to deal with “ordinary human unhappiness.”

A regular consumer of new therapeutic commodities like Tony’s sister Janice Soprano may expect to get some profound and satisfying self-transformation for her money. But the original psychoanalytic perspective was far more dubious. Freud also had misgivings about how his work would be received in the United States. While approaching America by ship in 1909 (this year marks the centennial of his lectures at Clark University), Freud took exception to Jung’s remark that they were bringing enlightenment to the New World. No, said Freud, their ship was delivering the plague.

Indeed, someone like Tony Soprano entering treatment would have been one of the old doctor’s worst nightmares about the fate of his work. The question of Dr. Melfi’s willingness to continue treating Tony (not simply the danger this presents to her, but the moral puzzle of what “improvement” would even mean in the case of a sociopath) runs throughout the series.

When Carmela Soprano decides to seek therapy, she is referred to an old immigrant analyst named Dr. Krakower who refuses to indulge her belief that Tony is fundamentally decent. This is, of course, something the viewers, too, have been encouraged to believe from time to time -- in spite of seeing it disproved in one brutal encounter after another.

“Many patients want to be excused for their current predicament,” says Dr. Krakower, “because of events that occurred in their childhood. That's what psychiatry has become in America. Visit any shopping mall or ethnic pride parade, and witness the results.” He then refuses to accept payment from Carmela, or to continue treatment, until she breaks with Tony: “I'm not charging you because I won't take blood money, and you can't, either. One thing you can never say is that you haven't been told.”

Dr. Krakower then disappears from the show. A present absence, so to speak. We, the viewers, have by that point had numerous reminders that we are deriving vicarious pleasure from seeing how Tony and his crew earn the blood money that Dr. Krakower won't touch. We have been given a very clear indication of the difference between complicity and some version of the Freudian moral stance.

The deep pessimism of that outlook comes through time and again as we see how powerful are the psychic undercurrents within the family. Far from it being “unclear to what extent there is one overall moral or thematic attitude that governs the work,” we are on a terrain of almost Victorian naturalism, in which rare moments of insight are no match for the blind play of urges that define each character.

Take, again, the example of New Age gangster moll Janice Soprano. In his book, Polan notes that she “keeps hooking up with the dysfunctional and violent heads of Mafia crews within Tony’s jurisdiction.” In spite of everything, she never learns from her mistakes.

Polan treats this as an example of “the amnesia plot” – a sly, pomo-ironic wink, perhaps, at all those times on "Gilligan’s Island" when somebody got hit on the head with a coconut.

But surely some other interpretation is possible. Outside the play of televisual signifiers, there are people who, in one crucial area or other of their lives, never learn a damned thing – or if they do, it still makes no difference, because they make the same mistakes each time a fresh opportunity presents itself. This is, perhaps, the essence of Freud’s distinction between neurotic misery and normal unhappiness.

Not that the old misogynist necessarily gives us the key to understanding Janice Soprano. But her behavior, cycling through its compulsions in spite of various therapists and gurus, is consistent with Freud’s grimmer estimates of human nature.

The virtual impossibility of changing one’s life (even when staying alive depends on it) was also the lesson of the gay mobster Vito Spatafore’s trajectory during the final season. Having fled both the closet and his murderously homophobic peers, Vito has every reason to settle down to an idyllic life in New Hampshire, where he has both a low-carbohydrate diet and a studly fireman boyfriend.

But Vito feels compelled to return to New Jersey and his old way of life, with predictable results. It all plays out like something inspired by Beyond the Pleasure Principle, in which Freud’s speculations on the repetition compulsion lead him to the concept of thanatos, the death drive.

When the screen went blank two years ago, it was, among other things, a disruption of our daydream-like engrossment in the world of the Sopranos. It was a sharp, even brutal reminder that the viewer had made an investment in Tony's life. The audience was left frustrated: we wanted him to either escape the consequences of his actions or get killed. Neither motive is exactly creditable, but daydreams often manifest truths we'd rather disavow.

Polan’s book is often insightful about the visual dimension of The Sopranos, if a bit reductive about treating its self-consciousness as generically postmodern. The program’s long shadow, he writes, “tells us something serious about the workings of popular culture in the media economies of today. Irony sells, and that matters.”

We all make different meanings out of the raw materials provided by any given cultural artifact – so in the spirit of hermeneutic laissez faire, I won’t quibble. But the realization that "irony sells" does not exhaust the show's meaning. It seems, rather, like something one of the brighter gangsters might say.

For this viewer's part, at least, the lesson of "The Sopranos" is rather different: Life is over soon enough, and it is not full of second chances – even though we tend to expect them. (We often prove really good at kidding ourselves about how many chances there are.) Be as ironic about life as you want; it doesn’t help. You end up just as dead.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

The Plug-In Syllabus

Whether or not it still makes sense to call PBS “educational television” (some of us are still bitter about “Yanni at the Acropolis”), there was once a time when didacticism was indeed its mandate. And for my part, it’s impossible not to think of KERA, the network’s affiliate in Dallas, as an alma mater of sorts.

KERA had the distinction of being the station that introduced American viewers to "Monty Python’s Flying Circus." In a more serious vein of absurdism (but still with some humor) it also broadcast “Waiting for Godot” in the late 1970s, which sent my teenage existentialism up to a new plateau. It also used to show, quite frequently in fact, "The Naked Civil Servant," which can’t have met with approval from the Southern Baptist Convention.

I remember watching a documentary about Salvador Dali at least two or three times – mind suitably boggled by its clips from Un Chien Andalou. And late at night on the weekends there was a program called "One Star Theater," a home for low-budget movies that were surrealist by default. In one of them, as I recall, the human survivors of some disaster were attacked by giant shrews, played by dogs dressed in shrew costumes. (Even calling them “costumes” may be an overstatement.)

Exposure to Samuel Beckett, art-appreciation documentaries, "Masterpiece Theatre," and grade Z film gave me the rudiments of an aesthetic education. And a good thing, too, because nobody in the local school system would have used the expression “aesthetic education,” or considered it worth offering.

But my TV curriculum was broader still. There were dueling series on economics hosted by Milton Friedman and John Kenneth Galbraith. I seem to recall a program where Henry Steele Commager talked about Alexis de Tocqueville’s Democracy in America at some length. And on each episode of a series called "Connections," the wry host, James Burke, covered the interaction of technology and culture by tracing improbable chains of cause and effect down the course of four or five centuries.

There was Dick Cavett’s program, which has migrated from network to network over the decades but called PBS home from 1977 to 1982. On it, Allen Ginsberg answered questions (sort of) and tried to sing (this was just bearable), and Truman Capote mumbled through the barbiturates. Susan Sontag stopped by to radiate the dark glamor that lets you get away with anything. Other guests talked about their films and books, and gave a glimpse of whatever serious adults in New York were serious about, back then. It was always a revelation.

Meanwhile there was "Firing Line," where William F. Buckley made conservative arguments against his guests without yelling at them. Evidently his approach was too subtle even by the standards of the day. One of my classmates was sure that Buckley must be a liberal because of the way he talked – the accent, the polysyllables, the sneer. (Not to mention the way his tongue darted out from time to time, like that of a Komodo dragon about to devour a goat.) I explained that Buckley was in fact an ardent supporter of not-yet-president Ronald Reagan. My friend decided that he would try "Firing Line" again.

Politics aside, the show was good for the vocabulary. I probably owed my National Merit Scholarship to William F. Buckley.

KERA's programming tended (apart from "One Star Theater") to be earnestly and even aggressively middlebrow in spirit. Just what happened to that spirit over the next few years is a puzzling question, and can't be divorced from the issue of what happened to the cultural apparatus at large.

Many of the changes were structural, which is another way of saying that they involved money. It was not just that funding for public broadcasting was always being trimmed and challenged. At the same time, new television channels were created by the scores and then the hundreds. Some of the fare that had distinguished educational broadcasting (pop history, talking-heads shows, book chat) was now found elsewhere, spread broadly throughout the cable universe. Which in some ways meant more thinly: the audience dispersed across the dial, the critical mass of nerd concentration harder to reach.

At the same time, PBS itself had -- in the interest of ratings and continuing support -- to take on more and more programming with no didactic intent at all: soporific smooth jazz, antiques shows, concerts in which Pavarotti and Sting joined forces, etc.

Now, to be honest, I was not paying attention while most of these changes were taking place. Educational broadcasting had done its work only too well. I spent the 1980s reading Husserl and whatnot. When the impulse to watch TV kicked in, it involved a craving for something to cool down the brain -- including late-night reruns of shows my teenage self would have scorned with all due high-mindedness.

One of my professors had commented in passing that “Love Boat” was an example of Bakhtinian carnival, albeit in debased form. This sounded plausible. By day, I thought about the epoché’s suspension of judgment regarding the ontological status of the objects within experience. At two in the morning, I suspended all judgment whatsoever and went slumming on the tube. It was a license to consume garbage.

Which all just goes to show that Bildung can take some strange turns. But over the past decade or so, I started to think back on my debt to PBS in its starchier and more strenuously uplifting era -- and started to miss it.

A program like “Meeting of the Minds” -- in which Steve Allen sat at a table with actors dressed up like famous artists, scientists, and leaders throughout history -- is something you outgrow, sooner or later. But at the right age, it gives you something that enables you to outgrow it. I’m not persuaded that even the most rigorous semiotic approach to Aaron Spelling’s oeuvre will yield anything like that benefit.

But now the point is moot, right? Public broadcasting has been on blood-thinning drugs for a long time. Even the categories of highbrow, middlebrow, and lowbrow are quaint. Ten years ago John Seabrook coined the term “nobrow” to describe the prevalent cultural mode; it still seems applicable. And you can’t go back to the old didacticism because nobody knows how to pull it off anymore.

So I assumed, anyway, until coming across an interesting development at the website of my alma mater, KERA.

It offered a podcast covering an exhibit at the student gallery of the University of Texas at Arlington devoted to Fluxus – an international avant-garde cultural movement from the early 1960s, inspired by Dadaism, but also looking ahead to conceptual and performance art. Another recent segment there (this one available on video) features an interview with the director Philip Haas about his work on a series of film installations at the Kimbell Art Museum in Fort Worth.

At first blush, this seems like a flashback to what was available on KERA 30 years ago – solid and informative, enjoyable in its way but also downright educational. At the same time, the fact that it is available on the internet gives it a much broader potential audience. The story on “Fluxus in Texas,” for example, has drawn comments from Paris and Brussels.

So what’s happening? How is it that old-school cultural earnestness has been revived in a new-media environment? We’ll look into that with next week’s column.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Brandeis Wasn't Wrong

In 2001 I donated my collection of prints by sculptors to the Block Museum of Art at Northwestern, though some of the prints still adorn the walls of my house and won’t get to Evanston until after my death. You can assume -- and you would be right -- that a collector of such works has been a lifetime “consumer” and supporter of the arts.

And yet, I said to myself “good for them” when reports first surfaced last winter that Brandeis intended to sell its collection of modern art, so that the considerable (envisaged) proceeds could support functions closer to the central goals of the university.

Understand that my print collection went to Northwestern because I had been dean of arts and sciences there for thirteen years. Understand also that regarding this issue, my experience as dean trumps my love of art and that is why I disagree with the views expressed in numerous articles in The New York Times and one this month in Inside Higher Ed called “Avoiding the Next Brandeis.”

I see a significant role for art museums on higher education campuses. But, with quite special exceptions, I see a very small pedagogic function for colleges and universities to own works of art, especially given the current cost and value of so many of them. I’d rather those museums were reclassified as galleries. To be sure, the provisions of deeds of gift must be scrupulously observed; but assuming that to be the case, let them sell their works of art if the funds thus gained will better serve the institutions’ educational mission.

The premise here is that the roles of museums on campuses are not like those of museums downtown, since the former exist to serve the specific needs and interests of a campus’s students and faculty.

This month’s article in Inside Higher Ed quotes a task force formed by arts groups to figure out ways to avoid the next Brandeis as saying that campus museums should be regarded as “essential to the academic experience and to the entire educational enterprise.”

But why should they be so regarded when, by my admittedly not systematic observations, most of those museums do nothing or very little to deserve to be so regarded? As dean, I had to bludgeon the Block Gallery to present an exhibit of the work of Northwestern’s prize painters, William Conger, Ed Paschke and James Valerio. (This was before the Gallery was transformed into a Museum and long before its current director, David Robertson, came to Northwestern.) Art history departments are mostly held at arm's length by campus museums, which prize their (inappropriate) autonomy. Mostly, the museums don’t even know how to communicate with anyone other than art faculty on campus.

It is excellent, therefore, that this cluster of issues is being looked at. In my view, however, the goals sought by the task force for campus art museums are not likely to be realized by means of works of arts owned by museums, but rather by means of exhibits brought in and often locally curated for specific pedagogic purposes.

Members of the task force, make sure, therefore, that you are not just talking to yourselves. You are looking for ways to relate A to B; there must thus be strong representation from both poles. As announced, the organizations participating in the task force are mostly from Category A: the art museum community.

I strongly recommend that it also include not only representation from the art history and studio art departments, but knowledgeable people who have thoughts about how to involve art museums in educating students who are not primarily concerned with the arts. Indeed, given the way in which so many campus museums lead existences so separate from their campus surroundings, it might even be necessary to initiate reflection about their possible wider functions. The task force might want to consider forming a committee consisting of a couple of department chairpersons, a couple of deans or associate deans, and perhaps some interested students, assigning them the task of reporting to the museum-powers-that-be how those museums might serve a broad campus constituency.

Accordingly, if the just-formed task force keeps its eye on the ball (as I see it), that Brandeis bomb will have very positive, if unintended, consequences.

Author/s: 
Rudolph H. Weingartner
Author's email: 
newsroom@insidehighered.com

Rudolph H. Weingartner is former dean of arts and sciences at Northwestern University.

The Museum at the Heart of the Academy

When I first learned last winter that the Brandeis University trustees had voted to sell the collections of the Rose Art Museum and close the museum, my reactions were many: concern for Brandeis students who were losing an important learning tool; sadness that a great university was breaking trust with many benefactors; annoyance that the museum industry would be yet again living the trauma of defending our collections as other than semi-liquid assets.

To these were added a suspicion that somehow the Rose must have failed in its campus-wide engagement and in its outreach to key campus constituencies (including its trustees), or those very trustees would never have felt they could make that particular decision, no matter how great the budget gap facing them. Even as I recognized the right of any university to shutter any program no longer deemed sufficiently important, I shuddered at such a reactive decision.

Nowhere in my response did I consider the “good for them” proclamation made by Rudolph Weingartner in his Views column of October 23 for Inside Higher Ed, arguing against both the pedagogic value of owning works of art and the effectiveness of university museums generally. Most troublingly, in reading the view of a panel of experts arguing that university museums should be regarded as “essential to the academic experience,” Dean Weingartner observes “by my admittedly not systematic observations, most of those museums do nothing or very little to deserve to be so regarded…. Mostly, the museums don’t even know how to communicate with anyone other than art faculty on campus.”

Drawing such conclusions -- and a kind of pleasure in the demise of a fine museum -- on the basis of random evidence seems not to represent the rigors of academic policy making at its best.

More dangerously, this view fails to note either the sheer range and variety of campus museums in the United States or the extent to which many have worked mightily in recent decades to make themselves central to their parent institutions. Long gone are the days when most university museums could be seen as, at best, the laboratory addendum to a department of art or art history. Seeking not merely (although importantly) to shape future art historians and museum professionals, the nation’s best university museums have long been engaged in the practice of fostering critical thinking and visual literacy, the understanding of times and cultures dramatically distinct from our own, the awareness of a common humanity, and thus, ultimately, the shaping of good citizenship.

Here at Princeton University, we have long crossed boundaries to partner our museum with disciplines and departments from the humanities to creative writing to architecture to civil engineering. The Yale Center for British Art routinely connects with fields ranging from natural history to cultural studies; their exhibition this year on the impact of Darwin’s theory of evolution on subsequent creative practice was a model for cutting-edge investigation of value to us all.

The Wolfsonian Museum at Florida International University offered an almost shockingly timely exhibition looking at the art of propaganda during last year’s presidential campaign. The new wing that opened this year at the University of Michigan Museum of Art -- which I led until recently -- was designed to architecturally embody and make possible a commitment to deep campus-wide engagement, providing a second home for programming in performance, creative writing, film, and the humanistic disciplines generally.

And many universities increasingly use their art museums in medical curriculums, having discovered that sustained close looking makes their doctors-in-training better diagnosticians. From Dartmouth, to Emory, to Wisconsin, to UCLA, great university museums have shown themselves deeply capable of being essential to the lives of their universities, even as they also often function as enormously beneficial gateways to those universities for the general public.

The argument that academic museums can do these things is no mere abstraction. They are doing these things, and are increasingly recognized as playing an essential role in a time of bottom-line driven programming at many of even our greatest civic museums. With less at stake in the battle for attendance, the university museum can often take on difficult projects whose popularity cannot be assured, advancing the cause of new knowledge presented in accessible ways that yet seek to avoid pandering or the much dreaded “dumbing down.” Many of the first thematic exhibitions -- sometimes operating in the sphere of a social history of art, the so-called “new art history” -- took place in our university museums. Increasingly, and happily, the special role of the university museum is recognized by the media: Writing in The New York Times this year, the art critic Holland Cotter observed “The august public museum gave us fabulousness. The tucked away university gallery gave us life: organic, intimate and as fresh as news.”

And why do we university museums so annoyingly feel the need to collect artworks, creating the inevitable drain on resources caused by those pesky stewardship requirements? I offer in answer a fundamental article of faith, that even in the digital age, the sustained engagement with original works of art necessary for teaching, research, and layered learning would be difficult if not impossible if we ceased to be collecting institutions and instead taught only from objects temporarily made available for exhibition.

In the way that great texts live in our libraries, available for revisiting and sustained scholarly investigation, the works of art in our museums offer the possibility of deep critical engagement, close looking, and technical analysis -- made all the deeper when brought together as collections in which dialogues arise through the conversation of objects with each other and with their scholarly interlocutors. Surely a key role of the academy -- the advancement of new knowledge and the challenging of past knowledge -- is that fruit of curatorial, faculty, and student research made possible by the sustained presence of great works of art, whose survival for the future is also thus (and not incidentally) guaranteed.

Like libraries that often also find themselves embattled in times of budget cuts (since typically neither museums nor libraries directly generate tuition streams), great university art museums are a “public good,” offering value and possibility to the whole of our university communities as well as to users from outside the walls of the ivory tower. That all university museums do not achieve this centrality of purpose -- often, I suspect, for lack of adequate resourcing by their parent institutions in the perpetual fight against the perception that art represents a “luxury” in the logo- and data-centric university -- is to be regretted. Without question much work remains to be done to make our museums central to the academic experience.

But just as any academic department desires a certain autonomy to define its foci and particular strengths within the university curriculum, no academic museum should be “bludgeoned” into showing the work of particular artists or serving as the handmaiden of narrow administrative modishness. The academic model has never, thank heavens, been one of pure utility, even as we seek to be responsible, effective, and impactful.

For me, the lesson of the Brandeis debacle is the reminder that the fight for the central role of our museums is not won. Contrary to Dean Weingartner’s views, however, that fight has long and often successfully been underway.

Author/s: 
James Christen Steward
Author's email: 
newsroom@insidehighered.com

James Christen Steward is director of the Princeton University Art Museum.

George Clooney Meets Max Weber

Spoiler alert: Max Weber’s life is an open book, thanks in part to Joachim Radkau’s wonderful new 700-page biography, so nothing to spoil there. But this essay does reveal the ending of Jason Reitman’s new film.

Thoughtful, intellectual movies are produced each year in the United States and abroad -- open texts rich with meaning, understood by critics or not. Some writers and directors begin with a premise, others stumble into one, and still others capture the zeitgeist and hit a chord, even if we cannot articulate precisely what it is. For me, not much of a moviegoer and certainly not a film critic, Up in the Air, the highly acclaimed new movie directed by Jason Reitman (he also directed Juno), and written with Sheldon Turner, resonates powerfully with some of my challenging student conversations of late.

There are no ground-breaking paradigms about human nature introduced in Up in the Air, just as we’ve not seen many of those in academic circles in recent times. But by trying to keep us engaged, Reitman manages to come face to face with the very best of 19th and early 20th century philosophy and sociology. It was during this period that the great theorists of industrialization and technology emerged with force – Marx of course, then Max Weber, Ferdinand Tönnies, and Emile Durkheim among others – exploring the relationships among rationality, morality, community, and the acceleration of technological change in all aspects of life.

By the end of the 19th century, the horrors of progress began to take hold in the sociological imagination, a theme that persisted into the 20th century through Foucault and his contemporaries. There are the cheerleaders for sure: Marshall McLuhan – brilliant as he was – could see very little dark side to the explosion of electronic media, for instance. And it is difficult to locate the downsides of advances in medicine or public health technologies without undue stretching. But Reitman is some sort of original and popular voice, struggling anew with the complex interface between rapidly evolving technology (communication, transportation, finance) and human relations. It’s not a bad addition to a syllabus.

Let's start with Weber, the wildly abbreviated version: With regard to technology, progress, and capitalism, Weber detected a linear trend toward increasing rationalization, systematization, and routinization. In all aspects of life -- from the nature of organizations to the structure of religions -- we seek efficiency and system, in order to gain profit, leisure time, and fulfillment. This drive toward increasing organization, in all its manifestations, is too powerful to fight, given its clear appeal and "scientific" grounding.

Yet, Weber notes, all of this seeming success ultimately makes us less human: With increasing rationalization, we lose our collective spirit. He said, famously, that "each man becomes a little cog in the machine and, aware of this, his one preoccupation is whether he can become a bigger cog," a set of insights that drove him to despair. There are, Weber argued, occasional charismatic leaders that shake up our tidy world of rational calculation. But charismatic movements and people succumb to the inevitability of rationalization, crushed by a culture driven to success, results, and materialism. With no way out, Weber posits, we shall find ourselves in an "iron cage" of rationality, and the human heart will be impossible to locate.

To the film: Ryan Bingham (Clooney) is a consultant who spends most of his life on planes and in airports, traveling the nation as a professional terminator. He is with a firm hired by companies to lay off workers face-to-face (so the employer doesn’t have to), hand them a packet with their severance details, and deliver banal bits of inspiration intended to contain any extreme emotional reaction on the part of the unlucky employee. It’s a perfect Weberian job: Bingham produces nothing truly meaningful, keeps the wheels of capitalism spinning, has no personal relations muddying up his work, and makes good money for the firm.

This all goes well for Bingham; he has no interest in settling down (at least at the start of the film), and being in the air keeps his adrenaline pumping. But his firm has even higher ambitions to rationalize their business model, and with the help of a naïve 20-something M.B.A. type, moves to a system where professionals like Bingham can fire people over videoconference, hence saving millions in travel costs. At the end of the film, due to some unhappy results, the new system is pulled back for more study, and Bingham and colleagues get back on the road to once again fire people in person, which has more heart than the videoconference firing.

A victory against the forces of rationalization? After all, when Bingham fires people in-person, there is something of a human touch. But the film undercuts that thesis as well, with another character, a woman professional, also a road warrior, Alex Goran (played by Vera Farmiga). Goran is attractive and warm, but at base is even more mercenary than Bingham: She too lives in the air, has impressive frequent flyer tallies, and is in all the premium classes that one can aspire to (car rental, hotel, airline, so forth).

Bingham is impressed, having finally met his female match (she quips: “I’m you with a vagina”), finds her in hotels across the country for sex appointments, falls in love with her, finds his heart, and is badly jilted in the end (Goran is married, although she had never revealed this to Bingham). And while he may be badly hurt, she is sincerely puzzled that he failed to understand their unspoken contract: Why, he was part of her rationalized system – husband and family in Chicago, fulfilling career, and Bingham for pleasure on the road.

One of the nice twists of the film is that the female character is a more highly evolved Weberian being than are the men: She has a seemingly happy life – she is content, not alienated or complaining – while Bingham struggles with the rationalization of love, the one aspect of human interaction he apparently thought could not succumb to a culture of calculation. He wasn’t paying for the sex after all; he actually liked her.

While Goran’s character -- a Weberian monster of sorts -- might worry us, she underscores a central problem with the rationalization thesis in an age of social networking, texting, and air travel. Weber and his followers did not foresee the humanization of technology that we see now, and I too have been slow to come to this. For years I taught my students about Weber’s iron cage; they understood it and they appreciated it. They understood how the ATM – for all its efficiencies – lessens human interaction (you’ll not meet anyone in a long bank line these days). They understood what is lost when poll results stand in for qualitative opinion expression, or how a phone call is essentially less human than a face-to-face interaction. The tension between progress and human connectedness – that it was a tradeoff, in fact – seemed to make good sense.

But I struggle to hold up my side of the argument these days. Students insist that their connectedness with friends and strangers, through communication technology, is real, fulfilling, fun, sincere, and intimate (e.g., “sexting”). Weber and I are dinosaurs who have no room in our theorizing for the highly social, extraordinarily human interaction that the Internet has enabled. Technology itself, the force we feared would crush the human spirit, turns out to enhance it.

Or so our students argue. We go round and round on this. And perhaps even those of us who have wrapped much of our intellectual existence around theorists like Weber will see the light, and treat those theories as important but entirely historically bound. Up in the Air passes no judgment on Goran’s lifestyle, and in fact she may be the Übermensch. She controls her destiny and she directs the rationalization of her emotional life. While world-weary (a lot of airport bars, a lot of men), she has found her happiness, while Bingham remains a pathetic, troubled amateur.

Up in the Air encourages a revision of some Weberian views, but it also takes on some of our mid-20th-century sociological giants. Robert Merton, working in the tradition of Tönnies and Weber, argued that the dominant medium of his day -- radio -- had produced what he called pseudo-Gemeinschaft, or the "feigning of personal concern and intimacy with others in order to manipulate them the better," typically for profit. Whether it was selling war bonds (he wrote on Kate Smith’s campaign) or the perpetual fake-friendly "it’s a pleasure to serve you" we hear constantly, Merton was bothered by the niceties layered atop brute business motive. Is it their pleasure or not? Do they sincerely like to serve us, or do they get points for it on a performance review?

In Up in the Air, our protagonist -- thanks to his frequent flying -- gets the special "personal" treatment from airline professionals and others. He knows it’s fake, but it is still a pleasurable and valued aspect of daily life. When I raise the old Merton argument with my students these days, they are not bothered by it at all, and Reitman sees the niceties much the same way -- as the state of nature in contemporary capitalism, not a repulsive, slavish persona designed by corporate headquarters. When Bingham finally gets his reward for traveling an extraordinary number of miles on the airline -- a personal meeting with the top pilot -- he is at a loss for words, after imagining the moment a hundred times in his fantasies. Even when we’ve survived the countless niceties and earned the real human touch, it’s not that great after all -- another puzzle for our backward hero.

It is far too generous to say that McLuhan was right, that technology has made us more human, brought us together in a global village of understanding, encouraged tolerance of difference, and connected us to our essential, spiritual, primitive and fuller selves. He slips and slides, preaches a bizarre narrative of human history, and ignores social structure and power dynamics as much as possible. But he did, and profoundly so, foresee something of the social networking of today -- how light might shine through what looks like a mechanical, calculating, and cold world of technological progress. Up in the Air sides with McLuhan and with my students: The film gives one answer to a depressed Weber, but my generation -- at least -- feels empty at the end, as we go back up in the air with Clooney.

Susan Herbst is chief academic officer of the University System of Georgia and professor of public policy at Georgia Tech.

Andy Warhol, Then and Now

In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially the limits of the brain's plasticity. The ability to absorb new impressions is not infinite.

But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.

But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.

White stood on the fault line:

"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composer, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."

This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.

White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)

It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”

It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”

Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.

This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol just put it into a new context (the art gallery) where people would otherwise pretend it did not exist.

“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”

It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.

Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”

The article opens with an account of a recent showing of Warhol’s 35-minute film “Blow Job” at another university. The titular action is all off-screen. Warhol's camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)

Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.

“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”

And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.

Scott McLemee
scott.mclemee@insidehighered.com

Parental Reality Check

"I want to major in the arts."

This coming from my son, a product of two artists, was no surprise, yet the impact of those words jolted me from artist to parent in a matter of seconds. I always prided myself on being a balanced artist and parent: in the theater on the day of delivery, back in the dance studio with baby in tow days later (after a cesarean section no less), not missing a rehearsal or a parent-teacher conference. I advocate for the importance of arts in education, and against the severe budget cuts the arts currently face, from the perspective of both art educator and parent. Why, then, do these seven words throw me into such a tailspin? Where will he work? How will he survive? The funding isn’t there now; what will it be like in four years when he graduates? Is he prepared for this ever-changing artistic world?

As I begin to breathe and justify my reaction, I am faced with a reality. My son has experienced with me the highs and lows of being an artist and the constant justifications I need to make for dance programming, the lack of funds and the frustration of the lack of support. Yet through living this life he still wants to go into the arts. Don’t misunderstand my concern; I am not disappointed by any means. I am very proud and excited for him that he has chosen this path.

Teaching at a women’s college, I speak to many parents about their daughters wanting to be dance majors, reassuring them that it will be O.K.; I advocate for a liberal arts education where a student can major in the art of her choice and be able to double with something "else." The "else" has quickly become, to me, something "solid." I understand the value of an education in the arts and the strong, positive impact the arts have on society. A college major in the arts provides an opportunity to acquire strong creative thinking skills that enhance learning across disciplines, and a comprehensive course of study that students will apply for the rest of their lives.

I am now living what I preach, and the mom in me fears that my son’s undying passion for his art may not be able to support him. On top of all that, he tells us he wants to study at an arts conservatory, not a liberal arts college. This means little to no opportunity for the double major. I put other parents’ minds at ease by telling them their daughter will find success majoring in the arts. Who is easing my mind? Is this hypocrisy? I am now on the other side of the desk.

At the risk of sounding partial, I have always been proud of my son's love of life and his desire to learn everything about everything. He never hesitates to research what he does not know, and he excitedly shares his discoveries. He and I often have conversations about how to synergize his findings with my choreography. His innate ability to think as an interdisciplinary artist is fascinating to me. Where did this come from? How can he apply this interdisciplinary thought process as a tool for his major?

I quickly discover that he thinks through the liberal arts. It is this synergy, unconsciously created within himself, that will guide his process. He is my best lesson in learning how to be an artist in a liberal arts environment. An arts education within a liberal arts setting nourishes interdisciplinary artistic opportunities. Will he achieve this at a conservatory? Art conservatories produce fabulous visual artists. I'm not quite sure that such an intense and narrowly focused program is the right fit for him.

I refer to interdisciplinary art as a collaborative method or perspective among several disciplines; my most immediate experiences combine my choreography with visual art, literature, drama, sociology and feminist studies. Interdisciplinary art, however, is not limited to specific genres of art. Teaching in a liberal arts community has provided me with an opportunity to experience an interdisciplinary approach to curriculum between my dance program and other departments on campus including but not limited to art, music, theatre, psychology, the humanities, social sciences and natural sciences. I have witnessed interdisciplinarity among other departments as well, outside of the arts. While this interdisciplinary approach provides multiple outlets for creativity for students and faculty, it also fosters a new vision of the arts, one that peers between the lines and opens communication between art forms as well as between art and academic studies.

As the wait for college letters began, my son had his heart set on a conservatory program as his first choice. Keeping the door of possibilities open, I delicately broached the subject of my realization that he was innately grounded in the liberal arts. His way of thinking and his developing artistic process appeared to crave the interaction of many disciplines. He quietly listened and did not respond. I walked away hoping he was being reflective after my mini-lecture rather than politely ignoring it. After many restless nights, after treading on eggshells around the subject, and after all the letters were received, he chose to attend a liberal arts college rather than the conservatory he had originally intended.

He shared with me that he worried this might pose some challenges for him in developing his technical processes; he was also concerned that he might not fit in. You see, I affectionately refer to him as our vagabond. He wanders, on foot or bike, throughout the area where we live, looking for opportunities to meet new people and draw fascinating things. Material possessions are low on his list of priorities. He lives each day as it comes. Will he fit into an environment that is not entirely filled with other young artists just like him? When will my internal tug-of-war end?

Why did he choose a liberal arts college? After many weeks of weighing the options, he decided that at a liberal arts college he would be exposed to many influences that allow for more subjective and contextual stimulation. His first choice was housed within a large university -- an excellent program, no doubt; however, they were not keen on him double-majoring. His love of literature and anthropology would have needed to take a backseat, and he wasn’t too sure he wanted that to happen. Now there is the opportunity for that other major in something "solid."

He is now in the middle of his first semester of freshman year and finds himself questioning the true meaning of the liberal arts. Although the college professes its liberal arts values, he has found many students to be quite stagnant, fearful of exploration across disciplines. My son is bouncing back and forth between English and anthropology as his second major (beyond his arts major). He has concluded that the decision will be based on which allows him the most room for artistic growth.

My son has given me a gift. His interdisciplinary way of thinking has provided me with an intellectual and artistic opportunity to further my development as a lifelong learning artist. Joining the forerunners in the dance field who recognize the potential of dance as an interdisciplinary art actively engages me in authentic learning and discussion, which contributes to the core competencies that new generations of dancers should have. Robert Diamond documents these core competencies as communicating, problem solving, critical thinking, interpersonal skills, the appreciation of diversity and the ability to adapt to innovation and change.

The artistic process and creation, analytical thinking, and the integration of dance into other disciplines are foremost in my philosophy in the classroom and studio. I challenge my students and encourage them to explore all dance-related avenues of learning to broaden their perspectives of dance as an intellectual art form. As a motivated artist and educator, I strive to advance my knowledge of the future of dance by continuing my education in an environment that promotes higher standards for artistic education and research.

In the ‘80s, the movement and visual art worlds grew apart. Everyone was out for themselves trying to find monies to create. Shared venues between artists that encouraged dialogue among the arts became a thing of the past. Meanwhile, dance was trying to find a solo voice that was appreciated and viewed as a respected art form. My son is now entering an artistic world that has been enduring a tug of war with politics for the past nine years. He personally experienced this after working diligently on his portfolio submission to the Pennsylvania Governor’s School for the Arts. After waiting patiently for a response, he had the rug pulled out from under him: during the week the admission letters were supposed to be sent out, he was told by his school guidance counselor that funding for the school had been cut in the budget changes.

It is time to move into the 21st century and strongly merge artistic efforts with other artistic disciplines. Text, media, art: crossing disciplinary boundaries between art forms may open up more opportunities for funding. Dance is beginning to close the gap between the performing and the visual arts, to reintroduce itself to the other creative arts. Breaking down these disciplinary categories helps those looking for funding.

My son admitted to me that had he chosen an art conservatory, the study might have been too narrow. While a conservatory may have offered him more of the technical training necessary for a student artist, he has found that at a liberal arts college he is receiving the breadth necessary for artistic, creative and personal growth. His list of new friends spans the liberal arts. He can apply everything he is learning from this new environment to his art.

Having peeled back the parental layers to reach my artistic self, I found a calming reassurance that my son will be just fine. How interesting that through all this, my son is the one who taught me the lesson. Yes, being an art major will open his eyes to the world in a way he has not viewed it before. Yes, double-majoring with something “else” will give him an opportunity to merge his thoughts from discipline to discipline and communicate his new findings to the world. It is not hypocrisy. I am not leading my son or my students astray. I will watch my students grow, along with my son, as educated artists. He will flourish as the interdisciplinary artist he is already becoming. It’s time to let go and let him experience. As he so delicately wrote to me this past Mother’s Day, "through my individual growth, isolation, stubbornness, mistakes, choices, arguments, beliefs and lifestyle, which are all going to change faster than you can keep up, just know I love you."

Robin Gerchman is assistant professor and director of dance at Cedar Crest College.

Photography and Political Violence

In Mark Twain’s bitter satire King Leopold’s Soliloquy (1905), the Belgian monarch recalls how much easier it was to control public opinion in the old days. Now all that anyone talks about are the atrocities in the Congo -- where the rubber and ivory trade has been very profitable for the king and his cronies, thanks to the absolute enslavement of the Congolese. “I have spent millions to keep the press of two hemispheres quiet,” he rants, “and still these leaks keep on occurring.”

His most vexing problem, it turns out, is a new and highly mobile bit of technology: “The Kodak has been a sore calamity to us. The most powerful enemy indeed…. The only witness I have encountered in my long experience I couldn’t bribe.” Photographs of mutilated Africans -- their hands cut off for the least infraction, and sometimes just for the hell of it -- were ruining Leopold’s good name as a humanitarian.

Trust that photojournalism gives reliable and virtually unmediated access to the truth has taken some hits over the intervening century. But in The Cruel Radiance: Photography and Political Violence (University of Chicago Press), Susie Linfield, director of the cultural reporting and criticism program at New York University, holds fast to Twain’s optimism about the power of images of suffering to create enormous moral and political effects. It was named a finalist in criticism for the National Book Critics Circle awards; my short essay on it appeared at the NBCC blog Critical Mass, which announced the winners in all categories last week.

We met briefly at the awards ceremony, and over the weekend Linfield responded to a series of questions. The following interview is drawn from that exchange.

Q: People used to write defenses of poetry. Your book opens with a defense of photography, and of photojournalism in particular -- particularly against certain strains of photography criticism. Is that really so urgent? Have polemics against photography ever had any effect on anyone? Susan Sontag's critique in On Photography may have been harsh, for example, but she collected photos, and kept on sitting for portraits.

A: Well, there are different kinds of urgency. I wouldn't say my defense of photojournalism -- and of photographic truth -- is as urgent as, say, stopping mass rape in the Congo, or as protecting Libyans from the madness of Qaddafi. But yes, I think that the attack on photojournalism -- Sontag was the most prominent exponent of this, but the critique goes back to the Frankfurt School critics and forward to the postmodernists -- has given us too many alibis, too many excuses. It's very, very easy to simply not look at certain kinds of photographs, and therefore to not consider the phenomenological experience of certain kinds of violence. And, moreover, to feel virtuous in not-looking, since we've been told over and over that photographs exploit, manipulate, seduce, mislead, oppress, commodify... Even a teenager now can glibly tell you, "All photographs lie" or "There is no such thing as truth." But neither of those statements is accurate.

Q: You define your approach, not just against certain currents in photography criticism, but in continuity with other work -- James Agee's and Pauline Kael's writing on movies, for one, and Kenneth Tynan's on theater. Would you say more about that? And is there really no "usable past" in photography criticism itself you can draw on?

A: Yes, there is a wonderful "usable past" in photography criticism: including, certainly, Sontag, John Berger, Roland Barthes, Siegfried Kracauer, Walter Benjamin, and Brecht. The fact that I have criticisms of all these writers doesn't mean that I don't also think they've done invaluable, indeed brilliant, work. But what most photography critics lack (though Benjamin is actually an exception to this) is a passion for the form itself. And it is this passion for -- this cathection to -- the form that animates critics like Agee and Kael vis-à-vis the movies, and Tynan for the theater. It was also the animating force for the young critics who came of age in the mid-1960s and began writing about rock music: Ellen Willis, Greil Marcus, Robert Christgau, James Miller. Those music critics had read a lot of theory and history and criticism, and they were all highly analytic. But they also considered themselves part of the mass audience -- and of the larger counterculture -- in ways that many photography critics simply haven't. They weren't populists, but they were democrats, and -- like Kael -- they were highly invested in the question of what a democratic culture of excellence might look like.

In his book The Company of Critics, Michael Walzer argues for the importance of the organic critic: the critic who considers herself a part of the society that she critiques. He cites a wide range of examples, from the Hebrew prophets of the Old Testament to George Orwell and Antonio Gramsci. It is this kind of organic criticism that many photography critics scorn, or at least avoid. They start from a position of suspicion toward, not love for, photography -- and, sometimes, from a position of contempt for the general audience.

This is in part why the language of photography criticism -- I am thinking of the postmoderns now -- is often so clunky, even ugly. But to read Kael or Agee is a joy. They weren't writing about "the enemy," which is, alas, the stance of some photography critics. Look at Agee's reviews of Preston Sturges's "The Miracle of Morgan's Creek," or of Olivier's version of "Henry V," and you'll see what I mean.

Q: Can a photojournalistic image of atrocity have aesthetic interest? Should it? It would be one thing if Stuart Freedman's photo on page 146 -- showing a child in Sierra Leone sitting in an otherwise empty room, looking at his father's detached prosthetic limbs -- were the work of a surrealist artist. But to find it beautiful, as I did until reading the caption, seems pretty horrifying.

A: Yes, such pictures can -- and do -- have aesthetic interest, I think. There's no getting around that: photographs are aesthetic objects. They are a documentation of something; they are not the thing-in-itself. What makes photographs so bewildering, and so bothersome, and so discomfiting, is that they record something that actually happened, and at the time it actually happened (unlike other aesthetic objects, such as paintings and sculptures).

Lots of people hate the idea that photographs of violence and suffering can be beautiful -- and by beauty I mean aesthetically compelling. But of course they can be. So, for that matter, can literature, including nonfiction literature, that documents violence and cruelty (think of Primo Levi, though one can easily come up with many other examples). Is Paul Celan's "Deathfugue" a beautiful poem? It is, although the beauty is quite terrible.

I think that people often feel guilty looking at visually powerful, formally accomplished photographs of war and atrocity; hence the vitriolic critiques of Gilles Peress, James Nachtwey, and others. But the formal power of their photographs is, precisely, part of what allows them to convey the experience of suffering; and to convey it in ways that make me, at least, think harder and deeper about what they are showing. The guilt that some viewers feel when looking at these photographs is, I think, misplaced -- and rather narcissistic to boot.

And the truth of the matter is that even in the world's worst situations, beauty -- that is, visual power, grace, dignity, formal coherence -- exists. In 1944 -- a very bad year -- Czeslaw Milosz wrote a poem in which he said that the scent of a flowering tree "is like an insult/To suffering humanity..." And so it is. But I think we just have to live with this contradiction. The alternative -- to make messy, visually incoherent photographs -- makes no sense, and would do absolutely nothing for the victims.

Q: Sometimes photography does not simply document political violence but participates in it. The Cruel Radiance discusses several examples of this -- pictures of atrocity taken by Nazis, mug shots of Khmer Rouge prisoners taken at a torture center, and the digital snapshots from Abu Ghraib, among others. At one point you contrast the photojournalist's "ethics of showing" with the "ethics of seeing" incumbent upon viewers of images of political violence. But what are the terms of such an ethics of seeing when the act of taking a photo is meant to degrade and dehumanize?

A: I think these are the most difficult photographs to contemplate -- or to know how to contemplate. There is no doubt that there are times and circumstances when photography itself becomes an act of cruelty: we see this with thousands of Nazi photographs, the Abu Ghraib photos, and many others. Among the most egregious contemporary examples are the many torture/beheading videos made by Islamic terrorist groups (the video of Daniel Pearl's murder is the most famous, but there are, alas, many others).

There is no good way, or pure way, to look at such photos or videos or films. And I think everyone has their breaking point: for some it might be some of the Nazi photos, for others the beheading films. (I myself have never looked at the latter.) On the other hand, even the most horrific photos can be, and have been, used in ways their makers never intended. During World War II, for instance, the Polish Underground, Jewish partisans, and the Soviets flooded the Western media with photographs of Nazi atrocities that had been taken by Nazi soldiers; the anti-fascists wanted the world to know what was happening, and most of the documentation of Nazi barbarism came from the Nazis themselves. Alas, few of these photos were printed by Western newspapers at the time -- they were regarded as Jewish or Soviet "propaganda," and therefore as untrustworthy. But the point is that we can subvert the intent of the perpetrators.

A recent example of this is a series of four photographs taken last year by a Somali photographer for the AP named Farah Abdi Warsameh. They show, in gruesome detail, the stoning to death -- for the crime of adultery -- of a Somali man, by the Islamist militia Hizbul Islam. The photographs are very controversial: among other things, they could not possibly have been taken without the permission of Hizbul Islam. And I have no doubt that Hizbul Islam is circulating these photos -- which are truly disgusting -- with pride: they are propaganda of the deed. But I also have no doubt that Warsameh took them with other motives in mind (I've seen other examples of his work). And I think we should look at them, hard as that may be: they show what Shariah law looks like in practice. I should add that Shariah is now legal in Somalia -- which means that what we are looking at, up close, is "justice," Islamist-style.

Q: I have to question your formulation here. Treating Shariah law as some kind of homogeneously vicious thing is simply wrong -- there are reactionary forms of Shariah, and modernizing forms. Saying this is one way to get both Islamists and Islamophobes mad at you, of course.

A: It's possible to have Shariah law that doesn't condone, or legalize, stonings. But I don't think there is such a thing as a truly modernized Shariah, because I don't believe the rule of law can be based in religious texts. (Ask women in Iran about this.) And the point is that, in the places where the introduction, or reintroduction, of Shariah is being debated (such as in Afghanistan, as part of a possible deal with the Taliban), the form that will be instituted won't be too modern, or permissive, or tolerant. Nor have I ever seen any form of Shariah that, in practice, does not discriminate against women.

My point about the Somali photos, though, is that this is what Shariah looks like in practice -- or at least in too many practices -- and we should look at it. Debates about this are often rather theoretical, or based on "could be's" (as in, "Shariah could be modernized"). What we see here is not theoretical at all, nor is it a rare exception.

Q: Is there a particular image of political violence that you've found impossible to come to terms with -- to recover from viewing?

A: I'm not sure I've "come to terms" with any of the photographs in my book; I don't think they can be "mastered" (in much the way that Adorno wrote that Germans could not possibly "master" the reality of Auschwitz). For me, the hardest photographs are not those that actually depict violence, but those that depict its preview or aftermath: that show the victims before they were victims, or at least before they were dead victims.

There's a photograph in my book taken by Mendel Grossman, a Jewish photographer who was imprisoned in the Lodz Ghetto (he died on a death march at the very end of the war). It's an "underground" photo, i.e., taken surreptitiously. It shows two women kissing on the mouth -- their lips pressed together through a mesh fence -- before one of them is deported to a death camp. I have a lot of trouble recovering from that. Similarly, the photograph of the girl on the cover of my book -- a Cambodian child, photographed before execution (and probably torture) in a Khmer Rouge "prison" -- is very hard for me to look at, and very hard for me to look away from.

I feel that I owe her -- what? life, safety, salvation -- yet I am acutely aware that I can do exactly nothing. We look at her as she looks at us: but we are way too late. Even worse: when we were not too late, we did nothing. This is a very calm, serene, sober photograph -- with no overt violence whatsoever -- but it is a very powerful J'accuse.

Scott McLemee
scott.mclemee@insidehighered.com
