Brandeis Wasn't Wrong

In 2001 I donated my collection of prints by sculptors to the Block Museum of Art at Northwestern, though some of the prints still adorn the walls of my house and won’t get to Evanston until after my death. You can assume -- and you would be right -- that a collector of such works has been a lifetime “consumer” and supporter of the arts.

And yet, I said to myself “good for them” when reports first surfaced last winter that Brandeis intended to sell its collection of modern art, so that the considerable (envisaged) proceeds could support functions closer to the central goals of the university.

Understand that my print collection went to Northwestern because I had been dean of arts and sciences there for thirteen years. Understand also that regarding this issue, my experience as dean trumps my love of art, and that is why I disagree with the views expressed in numerous articles in The New York Times and in one this month in Inside Higher Ed called “Avoiding the Next Brandeis.”

I see a significant role for art museums on higher education campuses. But, with quite special exceptions, I see a very small pedagogic function for colleges and universities to own works of art, especially given the current cost and value of so many of them. I’d rather those museums were reclassified as galleries. To be sure, the provisions of deeds of gift must be scrupulously observed; but assuming that to be the case, let them sell their works of art if the funds thus gained will better serve the institutions’ educational mission.

The premise here is that the roles of museums on campuses are not like those of museums downtown, since the former exist to serve the specific needs and interests of a campus’s students and faculty.

This month’s article in Inside Higher Ed quotes a task force formed by arts groups to figure out ways to avoid the next Brandeis as saying that campus museums should be regarded as “essential to the academic experience and to the entire educational enterprise.”

But why should they be so regarded when, by my admittedly not systematic observations, most of those museums do nothing or very little to deserve to be so regarded? As dean, I had to bludgeon the Block Gallery to present an exhibit of the work of Northwestern’s prize painters, William Conger, Ed Paschke and James Valerio. (This was before the Gallery was transformed into a Museum and long before its current director, David Robertson, came to Northwestern.) Art history departments are mostly held at arm's length by campus museums, which prize their (inappropriate) autonomy. Mostly, the museums don’t even know how to communicate with other than art faculty on campus.

It is excellent, therefore, that this cluster of issues is being looked at. In my view, however, the goals sought by the task force for campus art museums are not likely to be realized by means of works of art owned by museums, but rather by means of exhibits brought in and often locally curated for specific pedagogic purposes.

Members of the task force, make sure, therefore, that you are not just talking to yourselves. You are looking for ways to relate A to B; there must thus be strong representation from both poles. As announced, the organizations participating in the task force are mostly from Category A: the art museum community.

I strongly recommend that it also include not only representation from the art history and studio art departments, but knowledgeable people who have thoughts about how to involve art museums in educating students who are not primarily concerned with the arts. Indeed, given the way in which so many campus museums lead existences so separate from their campus surroundings, it might even be necessary to initiate reflection about their possible wider functions. The task force might want to consider forming a committee consisting of a couple of department chairpersons, a couple of deans or associate deans, and perhaps some interested students, assigning them the task of reporting to the museum-powers-that-be how those museums might serve a broad campus constituency.

Accordingly, if the just-formed task force keeps its eye on the ball (as I see it), that Brandeis bomb will have very positive, if unintended, consequences.

Rudolph H. Weingartner

Rudolph H. Weingartner is former dean of arts and sciences at Northwestern University.

The Museum at the Heart of the Academy

When I first learned last winter that the Brandeis University trustees had voted to sell the collections of the Rose Art Museum and close the museum, my reactions were many: concern for Brandeis students who were losing an important learning tool; sadness that a great university was breaking trust with many benefactors; annoyance that the museum industry would be yet again living the trauma of defending our collections as other than semi-liquid assets.

To these were added a suspicion that somehow the Rose must have failed in its campus-wide engagement and in its outreach to key campus constituencies (including its trustees), or those very trustees would never have felt they could make that particular decision, no matter how great the budget gap facing them. Even as I recognized the right of any university to shutter any program no longer deemed sufficiently important, I shuddered at such a reactive decision.

Nowhere in my response did I consider the “good for them” proclamation made by Rudolph Weingartner in his Views column of October 23 for Inside Higher Ed, arguing against both the pedagogic value of owning works of art and the effectiveness of university museums generally. Most troublingly, in reading of the view of a panel of experts arguing that university museums should be regarded as “essential to the academic experience," Dean Weingartner observes “by my admittedly not systematic observations, most of those museums do nothing or very little to deserve to be so regarded…. Mostly, the museums don’t even know how to communicate with other than art faculty on campus.”

Drawing such conclusions -- and taking a kind of pleasure in the demise of a fine museum -- on the basis of random evidence seems not to represent the rigors of academic policy making at its best.

More dangerously, this view fails to note either the sheer range and variety of campus museums in the United States or the extent to which many have worked mightily in recent decades to make themselves central to their parent institutions. Long gone are the days when most university museums could be seen as, at best, the laboratory addendum to a department of art or art history. Seeking not merely (although importantly) to shape future art historians and museum professionals, the nation’s best university museums have long been engaged in the practice of fostering critical thinking and visual literacy, the understanding of times and cultures dramatically distinct from our own, the awareness of a common humanity, and thus, ultimately, the shaping of good citizenship.

Here at Princeton University, we have long crossed boundaries to partner our museum with disciplines and departments from the humanities to creative writing to architecture to civil engineering. The Yale Center for British Art routinely connects with fields ranging from natural history to cultural studies; their exhibition this year on the impact of Darwin’s theory of evolution on subsequent creative practice was a model for cutting-edge investigation of value to us all.

The Wolfsonian Museum at Florida International University offered an almost shockingly timely exhibition looking at the art of propaganda during last year’s presidential campaign. The new wing that opened this year at the University of Michigan Museum of Art -- which I led until recently -- was designed to architecturally embody and make possible a commitment to deep campus-wide engagement, providing a second home for programming in performance, creative writing, film, and the humanistic disciplines generally.

And many universities increasingly use their art museums in medical curriculums, having discovered that sustained close looking makes their doctors-in-training better diagnosticians. From Dartmouth, to Emory, to Wisconsin, to UCLA, great university museums have shown themselves deeply capable of being essential to the lives of their universities, even as they also often function as enormously beneficial gateways to those universities for the general public.

The argument that academic museums can do these things is no mere abstraction. They are doing these things, and are increasingly recognized as playing an essential role in a time of bottom-line driven programming at many of even our greatest civic museums. With less at stake in the battle for attendance, the university museum can often take on difficult projects whose popularity cannot be assured, advancing the cause of new knowledge presented in accessible ways that yet seek to avoid pandering or the much dreaded “dumbing down." Many of the first thematic exhibitions -- sometimes operating in the sphere of a social history of art, the so-called “new art history” -- took place in our university museums. Increasingly, and happily, the special role of the university museum is recognized by the media: Writing in The New York Times this year, the art critic Holland Cotter observed “The august public museum gave us fabulousness. The tucked away university gallery gave us life: organic, intimate and as fresh as news.”

And why do we university museums so annoyingly feel the need to collect artworks, creating the inevitable drain on resources caused by those pesky stewardship requirements? I offer in answer a fundamental article of faith, that even in the digital age, the sustained engagement with original works of art necessary for teaching, research, and layered learning would be difficult if not impossible if we ceased to be collecting institutions and instead taught only from objects temporarily made available for exhibition.

In the way that great texts live in our libraries, available for revisiting and sustained scholarly investigation, the works of art in our museums offer the possibility of deep critical engagement, close looking, and technical analysis -- made all the deeper when brought together as collections in which dialogues arise through the conversation of objects with each other and with their scholarly interlocutors. Surely a key role of the academy -- the advancement of new knowledge and the challenging of past knowledge -- is that fruit of curatorial, faculty, and student research made possible by the sustained presence of great works of art, whose survival for the future is also thus (and not incidentally) guaranteed.

Like libraries that often also find themselves embattled in times of budget cuts (since typically neither museums nor libraries directly generate tuition streams), great university art museums are a “public good," offering value and possibility to the whole of our university communities as well as to users from outside the walls of the ivory tower. That not all university museums achieve this centrality of purpose -- often, I suspect, for lack of adequate resourcing by their parent institutions in the perpetual fight against the perception that art represents a “luxury” in the logo- and data-centric university -- is to be regretted. Without question much work remains to be done to make our museums central to the academic experience.

But just as any academic department desires a certain autonomy to define its foci and particular strengths within the university curriculum, no academic museum should be “bludgeoned” into showing the work of particular artists or serving as the handmaiden of narrow administrative modishness. The academic model has never, thank heavens, been one of pure utility, even as we seek to be responsible, effective, and impactful.

For me, the lesson of the Brandeis debacle is the reminder that the fight for the central role of our museums is not won. Contrary to Dean Weingartner’s views, however, that fight has long and often successfully been underway.

James Christen Steward

James Christen Steward is director of the Princeton University Art Museum.

George Clooney Meets Max Weber

Spoiler alert: Max Weber’s life is an open book, thanks in part to Joachim Radkau’s wonderful new 700-page biography, so nothing to spoil there. But this essay does reveal the ending of Jason Reitman’s new film.

Thoughtful, intellectual movies are produced each year in the United States and abroad -- open texts rich with meaning, understood by critics or not. Some writers and directors begin with a premise, others stumble into one, and still others capture the zeitgeist and hit a chord, even if we cannot articulate precisely what it is. For me, not much of a moviegoer and certainly not a film critic, Up in the Air, the highly-acclaimed new movie directed by Jason Reitman (he also directed Juno), and written with Sheldon Turner, resonates powerfully with some of my challenging student conversations of late.

There are no ground-breaking paradigms about human nature introduced in Up in the Air, just as we’ve not seen many of those in academic circles in recent times. But by trying to keep us engaged, Reitman manages to come face to face with the very best of 19th and early 20th century philosophy and sociology. It was during this period that the great theorists of industrialization and technology emerged with force – Marx of course, then Max Weber, Ferdinand Tönnies, and Emile Durkheim among others – exploring the relationships among rationality, morality, community, and the acceleration of technological change in all aspects of life.

By the end of the 19th century, the horrors of progress began to take hold in the sociological imagination, a theme that persisted into the 20th century through Foucault and his contemporaries. There are the cheerleaders for sure: Marshall McLuhan – brilliant as he was – could see very little dark side to the explosion of electronic media, for instance. And it is difficult to locate the downsides of advances in medicine or public health technologies without undue stretching. But Reitman is some sort of original and popular voice, struggling anew with the complex interface between rapidly-evolving technology (communication, transportation, finance) and human relations. It’s not a bad addition to a syllabus.

Let's start with Weber, the wildly abbreviated version: With regard to technology, progress, and capitalism, Weber detected a linear trend toward increasing rationalization, systematization, and routinization. In all aspects of life -- from the nature of organizations to the structure of religions -- we seek efficiency and system, in order to gain profit, leisure time, and fulfillment. This drive toward increasing organization, in all its manifestations, is too powerful to fight, given its clear appeal and "scientific" grounding.

Yet, Weber notes, all of this seeming success ultimately makes us less human: With increasing rationalization, we lose our collective spirit. He said, famously, that "each man becomes a little cog in the machine and, aware of this, his one preoccupation is whether he can become a bigger cog," a set of insights that drove him to despair. There are, Weber argued, occasional charismatic leaders that shake up our tidy world of rational calculation. But charismatic movements and people succumb to the inevitability of rationalization, crushed by a culture driven to success, results, and materialism. With no way out, Weber posits, we shall find ourselves in an "iron cage" of rationality, and the human heart will be impossible to locate.

To the film: Ryan Bingham (Clooney) is a consultant who spends most of his life on planes and in airports, traveling the nation as a professional terminator. He is with a firm hired by companies to lay off workers face-to-face (so the employer doesn’t have to), hand them a packet with their severance details, and deliver banal bits of inspiration intended to contain any extreme emotional reaction on the part of the unlucky employee. It’s a perfect Weberian job: Bingham produces nothing truly meaningful, keeps the wheels of capitalism spinning, has no personal relations muddying up his work, and makes good money for the firm.

This all goes well for Bingham; he has no interest in settling down (at least at the start of the film), and being in the air keeps his adrenaline pumping. But his firm has even higher ambitions to rationalize its business model, and with the help of a naïve 20-something M.B.A. type, moves to a system where professionals like Bingham can fire people over videoconference, thus saving millions in travel costs. At the end of the film, due to some unhappy results, the new system is pulled back for more study, and Bingham and colleagues get back on the road to once again fire people in person, which at least has more heart than firing by videoconference.

A victory against the forces of rationalization? After all, when Bingham fires people in person, there is something of a human touch. But the film undercuts that thesis as well with another character, Alex Goran (played by Vera Farmiga), a professional woman and fellow road warrior. Goran is attractive and warm, but at base is even more mercenary than Bingham: She too lives in the air, has impressive frequent flyer tallies, and is in all the premium classes that one can aspire to (car rental, hotel, airline, and so forth).

Bingham is impressed, having finally met his female match (she quips: “I’m you with a vagina”), finds her in hotels across the country for sex appointments, falls in love with her, finds his heart, and is badly jilted in the end (Goran is married, although she had never revealed this to Bingham). And while he may be badly hurt, she is sincerely puzzled that he failed to understand their unspoken contract: Why, he was part of her rationalized system – husband and family in Chicago, fulfilling career, and Bingham for pleasure on the road.

One of the nice twists of the film is that the female character is a more highly evolved Weberian being than are the men: She has a seemingly happy life – she is content, not alienated or complaining – while Bingham struggles with the rationalization of love, the one aspect of human interaction he apparently thought could not succumb to a culture of calculation. He wasn’t paying for the sex after all; he actually liked her.

While Goran’s character -- a Weberian monster of sorts -- might worry us, she underscores a central problem with the rationalization thesis in an age of social networking, texting, and air travel. Weber and his followers did not foresee the humanization of technology that we see now, and I too have been slow to come to this. For years I taught my students about Weber’s iron cage; they understood it and they appreciated it. They understood how the ATM – for all its efficiencies – lessens human interaction (you’ll not meet anyone in a long bank line these days). They understood what is lost when poll results stand in for qualitative opinion expression, or how a phone call is essentially less human than a face-to-face interaction. The tension between progress and human connectedness – that it was a tradeoff, in fact – seemed to make good sense.

But I struggle to hold up my side of the argument these days. Students insist that their connectedness with friends and strangers, through communication technology, is real, fulfilling, fun, sincere, and intimate (e.g., “sexting”). Weber and I are dinosaurs who have no room in our theorizing for the highly social, extraordinarily human interaction that the Internet has enabled. Technology itself, the force we feared would crush the human spirit, turns out to enhance it.

Or so our students argue. We go round and round on this. And perhaps even those of us who have wrapped much of our intellectual existence around theorists like Weber will see the light, and treat those theories as important, but entirely historically-bound. Up in the Air passes no judgment on Goran’s lifestyle, and in fact, she may be the Übermensch. She controls her destiny and she directs the rationalization of her emotional life. While world-weary (a lot of airport bars, a lot of men), she has found her happiness, while Bingham remains a pathetic, troubled amateur.

Up in the Air encourages a revision of some Weberian views, but also takes on some of our mid-20th century sociological giants as well. Robert Merton, working in the tradition of Tönnies and Weber, argued that the dominant media of his day – radio – had produced what he called pseudo-Gemeinschaft or the "feigning of personal concern and intimacy with others in order to manipulate them the better," for profit, typically. Whether it’s selling war bonds (he wrote on Kate Smith’s campaign) or the perpetual fake-friendly "it’s a pleasure to serve you" we hear constantly, Merton was bothered by the niceties layered atop brute business motive. Is it their pleasure or not? Do they sincerely like to serve us, or do they get points for it on a performance review?

In Up in the Air, our protagonist – thanks to his frequent flying – gets the special "personal" treatment from airline professionals and others. He knows it’s fake, but it is still a pleasurable and valued aspect of daily life. When I raise the old Merton argument with my students these days, they are not bothered by it at all, and Reitman sees the niceties much the same way -- as the state of nature in contemporary capitalism, not a repulsive, slavish persona designed by corporate headquarters. When Bingham finally gets his reward for travelling an extraordinary number of miles on the airline – a personal meeting with the top pilot – he is at a loss for words, after imagining the moment a hundred times in his fantasies. Even when we’ve survived the countless niceties and earned the real human touch, it’s not that great after all, another puzzle for our backward hero.

It is far too generous to say that McLuhan was right, that technology has made us more human, brought us together in a global village of understanding, encouraged tolerance of difference, and connected us to our essential, spiritual, primitive and fuller selves. He slips and slides, preaches a bizarre narrative of human history, and ignores social structure and power dynamics as much as possible. But he did, and profoundly so, foresee something of the social networking of today -- how light might shine through what looks like a mechanical, calculating, and cold world of technological progress. Up in the Air sides with McLuhan and with my students: The film gives one answer to a depressed Weber, but my generation -- at least -- feels empty at the end, as we go back up in the air with Clooney.

Susan Herbst

Susan Herbst is chief academic officer of the University System of Georgia and professor of public policy at Georgia Tech.

Andy Warhol, Then and Now

In two weeks, the National Book Critics Circle will vote on this year’s awards, and so, of late, I am reading until my eyes bleed. Well, not literally. At least, not yet. But it is a constant reminder of one's limits -- especially the limits of the brain's plasticity. The ability to absorb new impressions is not limitless.

But one passage in Edmund White’s City Boy: My Life in New York During the 1960s and ‘70s (a finalist in the memoir category, published by Bloomsbury) did leave a trace, and it seems worth passing along. The author is a prominent gay novelist who was a founding member of the New York Institute for the Humanities. One generation’s gossip is the next one’s cultural history, and White has recorded plenty that others might prefer to forget. City Boy will be remembered in particular for its chapter on Susan Sontag. White says that it is unfortunate she did not win the Nobel Prize, because then she would have been nicer to people.

But the lines that have stayed with me appear earlier in the book, as White reflects on the cultural shift underway in New York during the 1960s. The old order of modernist high seriousness was not quite over; the new era of Pop Art and Sontag's "new sensibility" had barely begun.

White stood on the fault line:

"I still idolized difficult modern poets such as Ezra Pound and Wallace Stevens," he writes, "and I listened with uncomprehending seriousness to the music of Schoenberg. Later I would learn to pick and choose my idiosyncratic way through the ranks of canonical writers, composers, artists, and filmmakers, but in my twenties I still had an unquestioning admiration for the Great -- who were Great precisely because they were Great. Only later would I begin to see the selling of high art as just one more form of commercialism. In my twenties if even a tenth reading of Mallarmé failed to yield up its treasures, the fault was mine, not his. If my eyes swooned shut while I read The Sweet Cheat Gone, Proust's pacing was never called into question, just my intelligence and dedication and sensitivity. And I still entertain those sacralizing preconceptions about high art. I still admire what is difficult, though I now recognize it's a 'period' taste and that my generation was the last to give a damn. Though we were atheists, we were, strangely enough, preparing ourselves for God's great Quiz Show; we had to know everything because we were convinced we would be tested on it -- in our next life."

This is a bit overstated. Young writers at a blog like The New Inquiry share something of that " 'period' taste," for example. Here and there, it seems, "sacralizing preconceptions about high art" have survived, despite inhospitable circumstances.

White's comments caught my bloodshot eye because I had been thinking about Arthur C. Danto's short book Andy Warhol, published late last year by Yale University Press. (It is not among the finalists for the NBCC award in criticism, which now looks, to my bloodshot eye, like an unfortunate oversight.)

It was in his article “The Artworld,” published in The Journal of Philosophy in 1964, that Danto singled out for attention the stack of Brillo boxes that Warhol had produced in his studio and displayed in a gallery in New York. Danto maintained that this was a decisive event in aesthetic history: a moment when questions about what constituted a piece of art (mimesis? beauty? uniqueness?) were posed in a new way. Danto, who is now professor emeritus of philosophy at Columbia University, has never backed down from this position. He has subsequently called Warhol “the nearest thing to a philosophical genius the history of art has produced.”

It is easy to imagine Warhol's response to this, assuming he ever saw The Journal of Philosophy: “Wow. That’s really great.”

Danto's assessment must be distinguished from other expressions of enthusiasm for Warhol's work at the time. One critic assumed that Warhol's affectlessness was inspired by a profound appreciation for Brecht’s alienation effect; others saw his paintings as a radical challenge to consumerism and mass uniformity.

This was pretty wide of the mark. The evidence suggests that Warhol’s work was far more celebratory than critical. He painted Campbell’s soup cans because he ate Campbell’s soup. He created giant images based on sensational news photos of car crashes and acts of violence -- but this was not a complaint about cultural rubbernecking. Warhol just put it into a new context (the art gallery) where people could no longer pretend it did not exist.

“He represented the world that Americans lived in,” writes Danto in his book, “by holding up a mirror to it, so that they could see themselves in its reflection. It was a world that was largely predictable through its repetitions, one day like another, but that orderliness could be dashed to pieces by crashes and outbreaks that are our nightmares: accidents and unforeseen dangers that make the evening news and then, except for those immediately affected by them, get replaced by other horrors that the newspapers are glad to illustrate with images of torn bodies and shattered lives.... In his own way, Andy did for American society what Norman Rockwell had done.”

It seems like an anomalous take on an artist whose body of work also includes films in which drag queens inject themselves with amphetamines. But I think Danto is on to something. In Warhol, he finds an artistic figure who fused conceptual experimentation with unabashed mimeticism. His work portrays a recognizable world. And Warhol’s sensibility would never think to change or challenge any of it.

Chance favors the prepared mind. While writing this column, I happened to look over a few issues of The Rag, one of the original underground newspapers of the 1960s, published in Austin by students at the University of Texas. (It lasted until 1977.) The second issue, dated October 17, 1966, has a lead article about the struggles of the Sexual Freedom League. The back cover announces that the Thirteenth Floor Elevators had just recorded their first album in Dallas the week before. And inside, there is a discussion of Andy Warhol’s cinema by one Thorne Dreyer, who is identified, on the masthead, not as the Rag’s editor but as its “funnel.”

The article opens with an account of a recent showing of Warhol's 35-minute film “Blow Job” at another university. The titular action is all off-screen. Warhol's camera records only the facial expressions of the recipient. Well before the happy ending, a member of the audience stood up and yelled, “We came to get a blow job and we ended up getting screwed.” (This anecdote seems to have passed into the Warhol lore. I have seen it repeated in various places, though Danto instead mentions the viewers who began singing “He shall never come” to the tune of the civil-rights anthem.)

Dreyer goes on to discuss the recent screening at UT of another Warhol film, which consisted of members of the artist's entourage hanging out and acting silly. The reviewer calls it “mediocrity for mediocrity’s sake.” He then provides an interpretation of Warhol that I copy into the digital record for its interest as an example of the contemporary response to his desacralizing efforts -- and for its utterly un-Danto-esque assessment of the artist's philosophical implications.

“Warhol’s message is nihilism," writes Dreyer. "Man in his social relations, when analyzed in the light of pure objectivity and cold intellectualism, is ridiculous (not absurd). And existence is chaos. But what is this ‘objectivity’? How does one obtain it? By not editing his film and thus creating ‘real time’? By boring the viewer into some sort of ‘realization’? But then, is not ‘objectivity’ just as arbitrary and artificial a category as any other? Warhol suggests there is a void. He fills it with emptiness. At least he is pure. He doesn’t cloud the issue with aesthetics.”

And so the piece ends. I doubt a copy ever reached Warhol. It is not hard to imagine how he would have responded, though: “It gives me something to do.” The line between nihilism and affirmation could be awfully thin when Warhol drew it.

Scott McLemee

Parental Reality Check

"I want to major in the arts."

This coming from my son, a product of two artists, was no surprise, yet the impact of those words jolted me from artist to parent in a matter of seconds. I always prided myself on being a balanced artist and parent: in the theater on the day of delivery, back in the dance studio with baby in tow days later (after a cesarean section no less), not missing a rehearsal or a parent-teacher conference. I advocate for the importance of arts in education and against the severe budget cuts the arts currently face, from the perspective of both art educator and parent. Why then, do these seven words throw me into such a tailspin? Where will he work? How will he survive? The funding isn’t there now; what will it be like in four years when he graduates? Is he prepared for this ever-changing artistic world?

As I begin to breathe and justify my reaction, I am faced with a reality. My son has experienced with me the highs and lows of being an artist: the constant justifications I need to make for dance programming, the lack of funds, and the frustration over the lack of support. Yet through living this life he still wants to go into the arts. Don’t misunderstand my concern; I am not disappointed by any means. I am very proud of him and excited that he has chosen this path.

Teaching at a women’s college, I speak to many parents about their daughters wanting to be dance majors, reassuring them that it will be O.K.; I advocate for a liberal arts education where a student can major in the art of her choice and be able to double-major in something "else." The "else" has quickly become, to me, something "solid." I understand the value of an education in the arts and the strong, positive impact the arts have on society. A college major in the arts provides an opportunity to acquire strong creative thinking skills that enhance learning across disciplines, and a comprehensive course of study that students will apply for the rest of their lives.

I am now living what I preach, and the mom in me fears that my son’s undying passion for his art may not be able to support him. On top of all that, he tells us he wants to study at an arts conservatory, not a liberal arts college. This means minimal to no opportunity for a double major. I put other parents’ minds at ease by telling them their daughter will find success majoring in the arts. Who is easing my mind? Is this hypocrisy? I am now on the other side of the desk.

At the risk of sounding partial, I have always been proud of my son's love of life and his desire to learn everything about everything. He never hesitates to research what he does not know, and he excitedly shares his discoveries. He and I often have conversations about how to synergize his findings with my choreography. His innate ability to think as an interdisciplinary artist is fascinating to me. Where did this come from? How can he apply this interdisciplinary thought process as a tool for his major?

I quickly discover that he thinks through the liberal arts. It is this synergy, unconsciously created within himself, that will guide his process. He is my best lesson in learning how to be an artist in a liberal arts environment. An arts education within a liberal arts setting nourishes interdisciplinary artistic opportunities. Will he achieve this at a conservatory? Art conservatories produce fabulous visual artists, but I'm not quite sure that such an intense and narrowly focused program is the right fit for him.

I refer to interdisciplinary art as a collaborative method or perspective among several disciplines; my most immediate experiences combine my choreography with visual art, literature, drama, sociology and feminist studies. Interdisciplinary art, however, is not limited to specific genres of art. Teaching in a liberal arts community has provided me with an opportunity to experience an interdisciplinary approach to curriculum between my dance program and other departments on campus including but not limited to art, music, theatre, psychology, the humanities, social sciences and natural sciences. I have witnessed interdisciplinarity among other departments as well, outside of the arts. While this interdisciplinary approach provides multiple outlets for creativity for students and faculty, it also fosters a new vision of the arts, one that peers between the lines and opens communication between art forms as well as between art and academic studies.

As the wait for college letters began, my son had his heart set on a conservatory program as his first choice. Keeping the door of possibilities open, I delicately broached the subject of my realization that he was innately grounded in the liberal arts. His way of thinking and his developing artistic process appeared to crave the interaction of many disciplines. He quietly listened and did not respond. I walked away hoping he was being reflective after my mini-lecture rather than politely ignoring me. After many restless nights, after treading on eggshells around the subject, and after all the letters were received, he chose to attend a liberal arts college rather than the conservatory he had originally intended.

He shared with me that he worried this may pose some challenges for him in developing his technical processes; he was also concerned that he may not fit in. You see, I affectionately refer to him as our vagabond. He wanders, on foot, or bike, throughout the area we live in looking for opportunities to meet new people and draw fascinating things. Material possessions are low on his list of priorities. He lives each day as it comes. Will he fit into an environment that is not entirely filled with other young artists just like him? When will my internal tug-of-war end?

Why did he choose a liberal arts college? After many weeks of weighing the options, he decided that at a liberal arts college he would be exposed to many influences that allow for more subjective and contextual stimulation. His first choice was housed within a large university -- an excellent program, no doubt; however, it was not keen on his double-majoring. His love of literature and anthropology would have had to take a backseat, and he wasn’t too sure he wanted that to happen. Now there is the opportunity for that other major in something solid.

He is now midway through the first semester of his freshman year and finds himself questioning the true meaning of the liberal arts. Although the college professes liberal arts values, he has found many students to be quite stagnant, fearful of exploration across disciplines. My son is going back and forth on his second major (beyond an arts major), between English and anthropology. He has concluded that the decision will be based on whichever allows him the most room for artistic growth.

My son has given me a gift. His interdisciplinary way of thinking has provided me with an intellectual and artistic opportunity to further my development as a lifelong learning artist. Joining the forerunners in the dance field who recognize the potential of dance as an interdisciplinary art actively engages me in authentic learning and discussion that contribute to the core competencies new generations of dancers should have. Robert Diamond documents these core competencies as communicating, problem solving, critical thinking, interpersonal skills, the appreciation of diversity and the ability to adapt to innovation and change.

The artistic process and creation, analytical thinking, and the integration of dance into other disciplines are foremost in my philosophy in the classroom and studio. I challenge and encourage my students to explore all dance-related avenues of learning to broaden their perspectives of dance as an intellectual art form. As a motivated artist and educator, I strive to advance my knowledge of the future of dance by continuing my education in an environment that promotes higher standards for artistic education and research.

In the ‘80s, the movement and visual art worlds grew apart. Everyone was out for themselves trying to find monies to create. Shared venues between artists that encouraged dialogue among the arts became a thing of the past. Meanwhile, dance was trying to find a solo voice that was appreciated and viewed as a respected art form. My son is now entering an artistic world that has been enduring a tug of war with politics for the past nine years. He personally experienced this after working diligently on his portfolio submission to the Pennsylvania Governor’s School for the Arts. After waiting patiently for a response to his submission, he had the rug pulled out from under him. During the week the admission letters were supposed to be sent out, he was told by his school guidance counselor that funding for the school had been cut with the budget changes.

It is time to move into the 21st century and strongly merge artistic efforts across artistic disciplines. Text, media, art: crossing these art forms may open up more opportunities for funding in the 21st century. Dance is beginning to close the gap between the performing and the visual, to reintroduce itself to the other creative arts. Breaking down these disciplinary categories helps those looking for funding.

My son admitted to me that had he chosen an art conservatory, the study may have been too narrow. While a conservatory may have offered him more technical aspects necessary for a student artist, he has found that at a liberal arts college he is receiving the breadth that is necessary for artistic, creative and personal growth. His list of new friends spans the liberal arts academic choices. He can apply everything he is learning from this new environment to his art.

Having peeled back the parental layers to reach my artistic self, I found a calming reassurance that my son will be just fine. How interesting that through all this, my son is the one who taught me the lesson. Yes, being an art major will open his eyes to the world in a way that he has not viewed it before. Yes, double-majoring in something “else” will give him an opportunity to merge his thoughts from discipline to discipline and communicate his new findings to the world. It is not hypocrisy. I am not leading my son or my students astray. I will watch my students grow, along with my son, as educated artists. He will be fine and will flourish as the interdisciplinary artist he is already becoming. It’s time to let go and let him experience. As he so delicately wrote me this past Mother’s Day, "through my individual growth, isolation, stubbornness, mistakes, choices, arguments, beliefs and lifestyle, which are all going to change faster than you can keep up, just know I love you."

Robin Gerchman

Robin Gerchman is assistant professor and director of dance at Cedar Crest College.

Photography and Political Violence

In Mark Twain’s bitter satire King Leopold’s Soliloquy (1905), the Belgian monarch recalls how much easier it was to control public opinion in the old days. Now all that anyone talks about are the atrocities in the Congo -- where the rubber and ivory trade have been very profitable for the king and his cronies, thanks to the absolute enslavement of the Congolese. “I have spent millions to keep the press of two hemispheres quiet,” he rants, “and still these leaks keep on occurring.”

His most vexing problem, it turns out, is a new and highly mobile bit of technology: “The Kodak has been a sore calamity to us. The most powerful enemy indeed…. The only witness I have encountered in my long experience I couldn’t bribe.” Photographs of mutilated Africans -- their hands cut off for the least infraction, and sometimes just for the hell of it -- were ruining Leopold’s good name as a humanitarian.

Trust that photojournalism gives reliable and virtually unmediated access to the truth has taken some hits over the intervening century. But in The Cruel Radiance: Photography and Political Violence (University of Chicago Press), Susie Linfield, director of the cultural reporting and criticism program at New York University, holds fast to Twain’s optimism about the power of images of suffering to create enormous moral and political effects. The book was named a finalist in criticism for the National Book Critics Circle awards; my short essay on it appeared at the NBCC blog Critical Mass, which announced the winners in all categories last week.

We met briefly at the awards ceremony, and over the weekend Linfield responded to a series of questions. The following interview is drawn from that exchange.

Q: People used to write defenses of poetry. Your book opens with a defense of photography, and of photojournalism in particular -- particularly against certain strains of photography criticism. Is that really so urgent? Have polemics against photography ever had any effect on anyone? Susan Sontag's critique in On Photography may have been harsh, for example, but she collected photos, and kept on sitting for portraits.

A: Well, there are different kinds of urgency. I wouldn't say my defense of photojournalism -- and of photographic truth -- is as urgent as, say, stopping mass rape in the Congo, or as protecting Libyans from the madness of Qaddafi. But yes, I think that the attack on photojournalism -- Sontag was the most prominent exponent of this, but the critique goes back to the Frankfurt School critics and forward to the postmodernists -- has given us too many alibis, too many excuses. It's very, very easy to simply not look at certain kinds of photographs, and therefore to not consider the phenomenological experience of certain kinds of violence. And, moreover, to feel virtuous in not-looking, since we've been told over and over that photographs exploit, manipulate, seduce, mislead, oppress, commodify... Even a teenager now can glibly tell you, "All photographs lie" or "There is no such thing as truth." But neither of those statements is accurate.

Q: You define your approach, not just against certain currents in photography criticism, but in continuity with other work -- James Agee's and Pauline Kael's writing on movies, for one, and Kenneth Tynan's on theater. Would you say more about that? And is there really no "usable past" in photography criticism itself you can draw on?

A: Yes, there is a wonderful "usable past" in photography criticism: including, certainly, Sontag, John Berger, Roland Barthes, Siegfried Kracauer, Walter Benjamin, and Brecht. The fact that I have criticisms of all these writers doesn't mean that I don't also think they've done invaluable, indeed brilliant, work. But what most photography critics lack (though Benjamin is actually an exception to this) is a passion for the form itself. And it is this passion for -- this cathection to -- the form that animates critics like Agee and Kael vis-à-vis the movies, and Tynan for the theater. It was also the animating force for the young critics who came of age in the mid-1960s and began writing about rock music: Ellen Willis, Greil Marcus, Robert Christgau, James Miller. Those music critics had read a lot of theory and history and criticism, and they were all highly analytic. But they also considered themselves part of the mass audience -- and of the larger counterculture -- in ways that many photography critics simply haven't. They weren't populists, but they were democrats, and -- like Kael -- they were highly invested in the question of what a democratic culture of excellence might look like.

In his book The Company of Critics, Michael Walzer argues for the importance of the organic critic: the critic who considers herself a part of the society that she critiques. He cites a wide range of examples, from the Hebrew prophets of the Old Testament to George Orwell and Antonio Gramsci. It is this kind of organic criticism that many photography critics scorn, or at least avoid. They start from a position of suspicion toward, not love for, photography -- and, sometimes, from a position of contempt for the general audience.

This is in part why the language of photography criticism -- I am thinking of the postmoderns now -- is often so clunky, even ugly. But to read Kael or Agee is a joy. They weren't writing about "the enemy," which is, alas, the stance of some photography critics. Look at Agee's reviews of Preston Sturges's "The Miracle of Morgan's Creek," or of Olivier's version of "Henry V," and you'll see what I mean.

Q: Can a photojournalistic image of atrocity have aesthetic interest? Should it? It would be one thing if Stuart Freedman's photo on page 146 -- showing a child in Sierra Leone sitting in an otherwise empty room, looking at his father's detached prosthetic limbs -- were the work of a surrealist artist. But to find it beautiful, as I did until reading the caption, seems pretty horrifying.

A: Yes, such pictures can -- and do -- have aesthetic interest, I think. There's no getting around that: photographs are aesthetic objects. They are a documentation of something; they are not the thing-in-itself. What makes photographs so bewildering, and so bothersome, and so discomfiting, is that they record something that actually happened, and at the time it actually happened (unlike other aesthetic objects, such as paintings and sculptures).

Lots of people hate the idea that photographs of violence and suffering can be beautiful -- and by beauty I mean aesthetically compelling. But of course they can be. So, for that matter, can literature, including nonfiction literature, that documents violence and cruelty (think of Primo Levi, though one can easily come up with many other examples). Is Paul Celan's "Deathfugue" a beautiful poem? It is, although the beauty is quite terrible.

I think that people often feel guilty looking at visually powerful, formally accomplished photographs of war and atrocity; hence the vitriolic critiques of Gilles Peress, James Nachtwey, and others. But the formal power of their photographs is, precisely, part of what allows them to convey the experience of suffering; and to convey it in ways that make me, at least, think harder and deeper about what they are showing. The guilt that some viewers feel when looking at these photographs is, I think, misplaced -- and rather narcissistic to boot.

And the truth of the matter is that even in the world's worst situations, beauty -- that is, visual power, grace, dignity, formal coherence -- exists. In 1944 -- a very bad year -- Czeslaw Milosz wrote a poem in which he said that the scent of a flowering tree "is like an insult/To suffering humanity..." And so it is. But I think we just have to live with this contradiction. The alternative -- to make messy, visually incoherent photographs -- makes no sense, and would do absolutely nothing for the victims.

Q: Sometimes photography does not simply document political violence but participates in it. The Cruel Radiance discusses several examples of this -- pictures of atrocity taken by Nazis, mug shots of Khmer Rouge prisoners taken at a torture center, and the digital snapshots from Abu Ghraib, among others. At one point you contrast the photojournalist's "ethics of showing" with the "ethics of seeing" incumbent upon viewers of images of political violence. But what are the terms of such an ethics of seeing when the act of taking a photo is meant to degrade and dehumanize?

A: I think these are the most difficult photographs to contemplate -- or to know how to contemplate. There is no doubt that there are times and circumstances when photography itself becomes an act of cruelty: we see this with thousands of Nazi photographs, the Abu Ghraib photos, and many others. Among the most egregious contemporary examples are the many torture/beheading videos made by Islamic terrorist groups (the video of Daniel Pearl's murder is the most famous, but there are, alas, many others).

There is no good way, or pure way, to look at such photos or videos or films. And I think everyone has their breaking point: for some it might be some of the Nazi photos, for others the beheading films. (I myself have never looked at the latter.) On the other hand, even the most horrific photos can be, and have been, used in ways their makers never intended. During World War II, for instance, the Polish Underground, Jewish partisans, and the Soviets flooded the Western media with photographs of Nazi atrocities that had been taken by Nazi soldiers; the anti-fascists wanted the world to know what was happening, and most of the documentation of Nazi barbarism came from the Nazis themselves. Alas, few of these photos were printed by Western newspapers at the time -- they were regarded as Jewish or Soviet "propaganda," and therefore as untrustworthy. But the point is that photographs can be used in ways their makers never intended. We can subvert the intent of the perpetrators.

A recent example of this is a series of four photographs taken last year by a Somali photographer for the AP named Farah Abdi Warsameh. They show, in gruesome detail, the stoning to death -- for the crime of adultery -- of a Somali man, by the Islamist militia Hizbul Islam. The photographs are very controversial: among other things, they could not possibly have been taken without the permission of Hizbul Islam. And I have no doubt that Hizbul Islam is circulating these photos -- which are truly disgusting -- with pride: they are propaganda of the deed. But I also have no doubt that Warsameh took them with other motives in mind (I've seen other examples of his work). And I think we should look at them, hard as that may be: they show what Shariah law looks like in practice. I should add that Shariah is now legal in Somalia -- which means that what we are looking at, up close, is "justice," Islamist-style.

Q: I have to question your formulation here. Treating Shariah law as some kind of homogeneously vicious thing is simply wrong -- there are reactionary forms of Shariah, and modernizing forms. Saying this is one way to get both Islamicists and Islamophobes mad at you, of course.

A: It's possible to have Shariah law that doesn't condone, or legalize, stonings. But I don't think there is such a thing as a truly modernized Shariah, because I don't believe the rule of law can be based in religious texts. (Ask women in Iran about this.) And the point is that, in the places where the introduction, or reintroduction, of Shariah is being debated (such as in Afghanistan, as part of a possible deal with the Taliban), the form that will be instituted won't be too modern, or permissive, or tolerant. Nor have I ever seen any form of Shariah that, in practice, does not discriminate against women.

My point about the Somali photos, though, is this: this is what Shariah looks like in practice -- or at least in too many practices -- and we should look at it. Debates about this are often rather theoretical, or based on "could be's" (as in, "Shariah could be modernized"). What we see here is not theoretical at all, nor is it a rare exception.

Q: Is there a particular image of political violence that you've found impossible to come to terms with -- to recover from viewing?

A: I'm not sure I've "come to terms" with any of the photographs in my book; I don't think they can be "mastered" (in much the way that Adorno wrote that Germans could not possibly "master" the reality of Auschwitz). For me, the hardest photographs are not those that actually depict violence, but those that depict its preview or aftermath: that show the victims before they were victims, or at least before they were dead victims.

There's a photograph in my book taken by Mendel Grossman, a Jewish photographer who was imprisoned in the Lodz Ghetto (he died on a death march at the very end of the war). It's an "underground" photo, i.e., taken surreptitiously. It shows two women kissing on the mouth -- their lips pressed together through a mesh fence -- before one of them is deported to a death camp. I have a lot of trouble recovering from that. Similarly, the photograph of the girl on the cover of my book -- a Cambodian child, photographed before execution (and probably torture) in a Khmer Rouge "prison" -- is very hard for me to look at, and very hard for me to look away from.

I feel that I owe her -- what? life, safety, salvation -- yet I am acutely aware that I can do exactly nothing. We look at her as she looks at us: but we are way too late. Even worse: when we were not too late, we did nothing. This is a very calm, serene, sober photograph -- with no overt violence whatsoever -- but it is a very powerful J'accuse.

Scott McLemee

Coming Together

Despite opposition, the 100-year-old Atlanta College of Art will become part of the Savannah College of Art and Design.

Tearing Down a Gehry

UC Irvine decision isn't about aesthetics -- building by the noted architect is falling apart.

Jeffersonian Tradition or Shoddy Imitation?

Architecture professors at U. of Virginia charge that new campus buildings are boring, inefficient and insensitive.

First Amendment Lessons

Freedom of expression tests at Wisconsin-Green Bay (art), St. Lawrence University (Web site) and Baylor (coffee cups).
