Cultural studies

Essay on the real death of the humanities

In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.

But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.

Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. The fact that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.

I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.

Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how a similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.

Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani would present Afghans as misguided believers because, she claimed, they had probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect, who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. Amid all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.

With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view toward religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value but instead analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.

An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.

I was incredibly, indescribably proud of them.

Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and, in a quiet, measured voice, confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?

In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.

And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.

But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to the test when a major scandal befell my institution. I knew that, as a member of this community and after all my posturing before students about the importance of seeking truth, I had to critique what was happening. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.

So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:

"My advice would be to leave it alone."

It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.

While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.

As my feelings about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged through campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or the material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.

Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.

After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.

Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.

Review of Michael Serazio, 'Your Ad Here: The Cool Sell of Guerrilla Marketing'

The most memorable thing about the 2002 science-fiction movie Minority Report was its depiction of advertising a few decades hence -- in particular the scene of Tom Cruise hurrying through a mall, besieged by holographic, interactive ads inviting him to have a Guinness or use American Express and asking him how he liked the tank tops he’d purchased at the Gap. The virtual shills address him by name (the character’s name, that is) thanks to retinal scanners, which are as ubiquitous in the 2050s as surveillance cameras had become in the century’s first decade.

They are pop-up ads from hell, swarming like hungry ghosts to devour everyone’s attention. (The people Tom Cruise rushes past are presumably getting their own biometrically personalized shopping advice.) The scene feels uncomfortably plausible; it’s the experience of being on the Internet, extended into public space and rendered inescapable.

How effective the film is as social criticism probably depends on what you make of the fact that a quarter of its budget came from product placement. Minority Report’s critique of advertising turns out to be, in part, critique as advertising.

Now, I have some good news and some bad news. The good news is that people have become so resistant to hard-sell advertising (dodging TV commercials with their DVRs, ignoring or mocking how ad agencies target their desires or insecurities) that such ads have lost their influence. By the 2050s, our psychic calluses should be really thick.

The bad news concerns what is taking the place of the hard sell: a range of techniques discussed at some length in Your Ad Here: The Cool Sell of Guerrilla Marketing (New York University Press) by Michael Serazio, an assistant professor of communications at Fairfield University.

“Cool” advertising, as Serazio uses the expression, does not refer only to campaigns that make a product seem hip, hot, and happening -- so that you will be, too, by buying it. The adjective is instead a nod to Marshall McLuhan’s famous if altogether dubious distinction between “hot” media, such as film or print, and the “cool” sort, chiefly meaning television.

A hot medium, goes the theory, transmits its content in high resolution, so that the recipient easily absorbs it through a single sense. A cool medium, with its low resolution, demands greater involvement from the recipient in absorbing the message. Someone reading Aristotle or watching "Citizen Kane" is more or less passively taking in what the hot medium bombards the eye with, while the “Gilligan’s Island” audience finds its senses quickened (auditory and tactile in particular, according to McLuhan) by a need to compensate for the cool medium’s low level of visual stimulation.

That makes as much sense as any of the sage of Toronto’s other ideas, which is to say not a hell of a lot. Nonetheless, Serazio gets as much value out of the distinction as seems humanly possible by adapting it to the contrast between the old-school “hot” ad campaign – with its clear, strong message that you should buy Acme brand whatchamacallits, and here’s why – and a variety of newer, “cooler” approaches that are more seductive, self-effacing, or canny about dealing with widespread cynicism about corporate hype.

A cool ad campaign, when successful, does not simply persuade people to buy something but creates a kind of spontaneous, intimate involvement with the campaign itself. The consumer’s agency is always stressed. ("Agency" in the sense of capacity to act, rather than where "Mad Men" do their business.) The Doritos "Fight for the Flavor" campaign of the mid-‘00s empowered the chip-gobbling public to determine which of two new flavors, Smokin' Cheddar BBQ or Wild White Nacho, would remain on the shelves and which would be pulled. Bloggers and tweeters are encouraged to express their authentic, unscripted enthusiasm. “Buzz agents” are given free samples of a product, chat it up with their friends, then report back how the discussions went. (With word-of-mouth campaigns, the most important thing is authenticity. Fake that and you’ve got it made.)

And at perhaps its most sophisticated level, cool advertising will cultivate the (potential) consumer’s involvement almost as an end in itself – for example, by providing an opportunity to control the behavior of a man in a chicken suit known as Subservient Chicken.

Let us return to the horrible fascination of Subservient Chicken in due course. But first, theory.

Foucault plus Gramsci equals about a third of the stuff published in cultural studies -- of which “critical industry media studies,” the subspecialty into which Serazio’s book falls, is a part. The conceptual work in Your Ad Here is done with Foucault’s line of power tools, in particular his considerations on governance, while Gramsci seems along mostly to keep him company.

Advertising as governance sounds counterintuitive, given the connotation of state power it elicits, but in Foucault’s work “government” refers to processes of guidance and control that may be more or less distant from the state’s institutions. The teacher governs a class (or tries) and a boss governs the workplace.

Overall, “management” seems like a more suitable term for most non-state modes of governance, and it has the advantage of foregrounding what Serazio wants to stress: Foucault’s point is that governance doesn’t mean giving orders and enforcing obedience but rather “structuring the possible field of action of others” in order “to arrange things in such a way that, through a certain number of means, such-and-such ends may be achieved.”

Governance (management) in this sense is a kind of effective persuasion of the governed party (the student, the fry cook, etc.) to exercise his or her agency to perform the necessary functions of the institution (school, fast-food place) without being subjected to constant external pressure. Insofar as governance is an art or a science, it consists in recognizing and anticipating resistance, and in preventing or containing disruption. (Some remarks by Gramsci on hegemony and resistance also apply here, but really just barely.)

“Cool sell” advertising counts as governance, in Serazio’s book, because it tries to neutralize public fatigue from advertising overload -- so that we’re still incited to spend money and think well of a brand. That’s the common denominator of viral marketing, crowdsourced publicity campaigns, plebiscites on snack-food availability, and so on.

It occasionally sounds like Serazio is criticizing these methods as manipulative, but I suspect that’s actually high praise, like when one horror fan tells another that a torture scene in "Hostel" gave him nightmares.

Which brings us back, as promised, to Subservient Chicken, whose role in promoting the Burger King menu remains oblique at best. But he undeniably garnered an enormous amount of attention -- 20 million distinct viewers generating half a billion hits. “By filming hundreds of video clips of a man in a chicken suit,” the author says, “and writing code for a database of terms that would respond to keyword commands for the Chicken to perform those videotaped actions, [the advertising agency] concocted something that was, in its own words, ‘so creepy, weird and well-executed that many people who visited… thought they were actually controlling this person in a chicken suit in real life.’ ” I can’t help feeling this calls for more extensive Foucauldian analysis, but I won’t be sticking around to see how that turns out.

 


Review of 'Mad Men, Mad World: Sex, Politics, Style & the 1960s'

"Mad Men" returns to cable television this coming Sunday, continuing its saga of mutable identities and creative branding at a New York advertising firm during the 1960s. Or at least one assumes it will still be set in the ‘60s. How much narrative time lapses between seasons varies unpredictably. Like everything else about the show, it remains the network’s closely guarded secret. Critics given an early look at the program must agree to an embargo on anything they publish about it. This makes perfect sense in the context of the social world of "Mad Men" itself: the network is, after all, selling the audience’s curiosity to advertisers.

A different economy of attention operates in Mad Men, Mad World: Sex, Politics, Style & the 1960s, a collection of 18 essays on the program just published by Duke University Press. It’s not that the editors and contributors, being academics, are presumably a different sort of cultural consumer from the average viewer; on the contrary, I think that presumption is exactly wrong. Serialized narrative has to generate in its audience the desire for an answer to a single, crucial question: “And then what happens?” (Think of all the readers gathered at the docks in New York to get the latest installment of a Dickens novel coming from London.)

Of course, the contributors to Mad Men, Mad World write with a host of more complex questions in mind, but I don’t doubt for a second that many of the papers were initially inspired by weekend-long diegetic binge sessions, fueled by the same desire driving other viewers. At the same time, there’s every reason to think that the wider public is just as interested in the complex questions raised by the show as any of the professors writing about it. For they are questions about race, class, gender, sexuality, politics, money, happiness, misery, and lifestyle – and about how much any configuration of these things can change, or fail to change, over time.

Many of the essays serve as replies to a backlash against "Mad Men" that began in the third or fourth season, circa 2009, as it was beginning to draw a much larger audience than it had until that point. The complaint was that the show, despite its fanatical attention to the style, dress, and décor of the period, was simple-mindedly 21st century in its attitude toward the characters. It showed a world in which blunt expressions of racism, misogyny, and homophobia were normal, and sexual harassment in the workplace was an executive perk. Men wore hats and women stayed home.  Everyone smoked like a chimney and drank like a fish, often at the same time. Child abuse was casual. So was littering.

And because all of it was presented in tones by turns ironic and horrified, viewers were implicitly invited to congratulate themselves on how enlightened they were now. Another criticism held that "Mad Men" only seemed to criticize the oppressive arrangements it portrayed, while in reality allowing the viewer to enjoy them vicariously. These complaints sound contradictory: the show either moralistically condemns its characters or inspires the audience to wallow in political incorrectness. But they aren’t mutually exclusive by any means. What E.P. Thompson called “the enormous condescension of posterity” tends to be a default setting with Americans, alternating with periods of maudlin nostalgia. There’s no reason the audience couldn’t feel both about the "Mad Men" vision of the past.

See also a comment by the late Christopher Lasch, some 20 years ago: “Nostalgia is superficially loving in its re-creation of the past, but it invokes the past only to bury it alive. It shares with the belief in progress, to which it is only superficially opposed, an eagerness to proclaim the death of the past and to deny history’s hold on the present.”

At the risk of conflating too many arguments under too narrow a heading, I’d say that the contributors to Mad Men, Mad World agree with Lasch’s assessment of progress and nostalgia while also demonstrating how little it applies to the program as a whole.

Caroline Levine’s essay “The Shock of the Banal: Mad Men's Progressive Realism” provides an especially apt description of how the show works to create a distinct relationship between past and present that’s neither simply nostalgic nor a celebration of how far we’ve come. The dynamic of "Mad Men" is, in her terms, “the play of familiarity in strangeness” that comes from seeing “our everyday assumptions just far enough removed from us to feel distant.” (Levine is a professor of English at the University of Wisconsin at Madison.)

The infamous Draper family picnic in season two is a case in point. After a pleasant afternoon with the kids in a bucolic setting, the parents pack up their gear, shake all the garbage off their picnic blanket, and drive off. The scene is funny, in the way appalling behavior can sometimes be, but it’s also disturbing. The actions are so natural and careless – so thoughtless, all across the board – that you recognize them immediately as habit. Today’s viewers might congratulate themselves for at least feeling guilty when they litter. But that’s not the only possible response, because the scene creates an uneasy awareness that once-familiar, “normal” ideas and actions came to be completely unacceptable – within, in fact, a relatively short time. The famous “Keep America Beautiful” ad from about 1970 -- the one with the crying Indian -- eventually became the butt of jokes, but it probably had a lot to do with that shift. (Such is the power of advertising.)

The show's handling of race and gender can be intriguing and frustrating. All the powerful people in it are straight white guys in ties, sublimely oblivious to even the possibility that their word might not be law. "Mad Space" by Dianne Harris, a professor of architecture and art history at the University of Illinois at Urbana-Champaign, offers a useful cognitive map of the show's world -- highlighting how the advertising firm's offices are organized to demonstrate and reinforce the power of the executives over access to the female employees' labor (and, often enough, bodies), while the staid home that Don Draper and his family occupy in the suburbs is tightly linked to the upper-middle-class WASP identity he is trying to create for himself by concealing and obliterating his rural, "white trash" origins. A handful of African-American characters appear on the margins of various storylines -- and one, the Drapers' housekeeper Carla, occupies the especially complex and fraught position best summed up in the phrase "almost part of the family." But we never see the private lives of any nonwhite character.

In "Representing the Mad Margins of the Early 1960s: Northern Civil Rights and the Blues Idiom," Clarence Lang, an associate professor of African and African-American studies at the University of Kansas, writes that "Mad Men" "indulges in a selective forgetfulness" by "presuming a black Northern quietude that did not exist" (in contrast to the show's occasional references to the civil rights movement below the Mason-Dixon line). Lang's judgment here is valid -- up to a point. As it happens, all of the essays in the collection were written before the start of the fifth season, in which black activists demonstrate outside the firm's building to protest the lack of job opportunities. Sterling Cooper Draper Pryce hires its first African-American employee, a secretary named Dawn. I think a compelling reading of "Mad Men" would recognize that the pace and extent of the appearance of nonwhite characters on screen is a matter not of the creators' refusal to portray them, but of their slow arrival on the scene of an incredibly exclusionary social world being transformed (gradually and never thoroughly) by the times in which "Mad Men" is set.   

There is much else in the book that I found interesting and useful in thinking about "Mad Men," and I think it will be stimulating to readers outside the ranks of aca fandom. I’ll return to it in a few weeks, with an eye to connecting some of the essays to new developments at Sterling Cooper Draper Pryce. (Presumably the firm will have changed its name in the new season, given the tragic aftermath of Lane Pryce’s venture in creative bookkeeping.)

When things left off, it was the summer of 1967. I have no better idea than anyone else when or how the narrative will pick up, but I really hope that Don Draper creates the ad campaign for Richard Nixon.

 


NHA speakers implore humanities scholars to fight for their fields

Advocates for the humanities search for the arguments to win federal support, and to stop having their disciplines treated "like a piñata."

Putting the black studies debate into perspective (essay)

Intellectual Affairs

For a week now, friends have been sending me links from a heated exchange over the status and value of black studies. It started among bloggers, then spilled over into Twitter, which always makes things better. I'm not going to rehash the debate, which, after all, is always the same. As with any other field, black studies (or African-American studies, or, in the most cosmopolitan variant, Africana studies) could only benefit from serious, tough-minded, and ruthlessly intelligent critique. I would be glad to live to see that happen.

But maybe the rancor will create some new readers for a book published five years ago, From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline (Johns Hopkins University Press) by Fabio Rojas, an associate professor of sociology at Indiana University. Someone glancing at the cover in a bookstore might take the subtitle to mean it's another one of those denunciations of academia as a vast liberal-fascist indoctrination camp for recruits to the New World Order Gestapo. I don't know whether that was the sales department's idea; if so, it was worth a shot. Anyway, there the resemblance ends. Rojas wrote an intelligent, informed treatment of black studies, looking at it through the lens of sociological analysis of organizational development, and with luck the anti-black-studies diatribalists will read it by mistake and accidentally learn something about the field they are so keen to destroy. (Spell-check insists that “diatribalists” is not a word, but it ought to be.)

Black studies was undeniably a product of radical activism in the late 1960s and early ‘70s. Administrators established courses only as a concession to student protesters who had a strongly politicized notion of the field’s purpose. “From 1969 to 1974,” Rojas writes, “approximately 120 degree programs were created,” along with “dozens of other black studies units, such as research centers and nondegree programs,” plus professional organizations and journals devoted to the field.

But to regard black studies as a matter of academe becoming politicized (as though the earlier state of comprehensive neglect wasn’t politicized) misses the other side of the process: “The growth of black studies,” Rojas suggests, “can be fruitfully viewed as a bureaucratic response to a social movement.” By the late 1970s, the African-American sociologist St. Clair Drake (co-author of Black Metropolis, a classic study of Chicago to which Richard Wright contributed an introduction) was writing that black studies had become institutionalized “in the sense that it had moved from the conflict phase into adjustment to the existing educational system, with some of its values accepted by that system…. A trade-off was involved. Black studies became depoliticized and deradicalized.”

That, too, is something of an overstatement -- but it is far closer to the truth than denunciations of black-studies programs, which treat them as politically volatile, yet also as well-entrenched bastions of power and privilege. As of 2007, only about 9 percent of four-year colleges and universities had a black studies unit, few of them with a graduate program. Rojas estimates that “the average black studies program employs only seven professors, many of whom are courtesy or joint appointments with limited involvement in the program” -- while in some cases a program is run by “a single professor who organizes cross-listed courses taught by professors with appointments in other departments.”

The field “has extremely porous boundaries,” with scholars who have been trained in fields “from history to religious studies to food science.” Rojas found from a survey that 88 percent of black studies instructors had doctoral degrees. Those who didn’t “are often writers, artists, and musicians who have secured a position teaching their art within a department of black studies.”

As for faculty working primarily or exclusively in black studies, Rojas writes that “the entire population of tenured and tenure-track black studies professors -- 855 individuals -- is smaller than the full-time faculty of my own institution.” In short, black studies is both a small part of higher education in the United States and a field connected by countless threads to other forms of scholarship. The impetus for its creation came from African-American social and political movements. But its continued existence and development has meant adaptation to, and hybridization with, modes of enquiry from long-established disciplines.

Such interdisciplinary research and teaching is necessary and justified because (what I am about to say will be very bold and very controversial, and you may wish to sit down before reading further) it is impossible to understand American life, or modernity itself, without a deep engagement with African-American history, music, literature, institutions, folklore, political movements, etc.

In a nice bit of paradox, that is why C.L.R. James was so dubious about black studies when it began in the 1960s. As the author of The Black Jacobins and A History of Negro Revolt, among other classic works, he was one of the figures students wanted appointed as a visiting professor when they demanded black studies courses. But when he accepted, it was only with ambivalence. "I do not believe that there is any such thing as black studies," he told an audience in 1969. "...I only know, the struggle of people against tyranny and oppression in a certain social setting, and, particularly, the last two hundred years. It's impossible for me to separate black studies and white studies in any theoretical point of view."

Clearly James's perspective has nothing in common with the usual denunciations of the field. The notion that black studies is just some kind of reverse-racist victimology, rigged up to provide employment for "kill whitey" demagogues, is the product of malice. But it also expresses a certain banality of mind -- not an inability to learn, but a refusal to do so. For some people, pride in knowing nothing about a subject will always suffice as proof that it must be worthless.

Review of Orin Starn, "The Passion of Tiger Woods"

Intellectual Affairs

On the Friday following Thanksgiving in 2009, Tiger Woods had an automobile accident. For someone who does not follow golf, the headlines that ran that weekend provided exactly as much information as it seemed necessary to have. Over the following week, I noticed a few more headlines, but they made no impression. Some part of the brain is charged with the task of filtering the torrent of signals that bombard it from the media every day. And it did its job with reasonable efficiency, at least for a while.

Some sort of frenzy was underway. It became impossible to tune this out entirely. I began to ignore it in a more deliberate way. (All due respect to the man for his talent and accomplishments, but the doings of Tiger Woods were exactly as interesting to me as mine would be to him.) There should be a word for the effort to avoid giving any attention to some kerfuffle underway in the media environment. “Fortified indifference,” perhaps. It’s like gritting your teeth, except with neurons.

But the important thing about my struggle in 2009 is that it failed. Within six weeks of the accident, I had a rough sense of the whole drama in spite of having never read a single article on the scandal, nor watched nor listened to any news broadcasts about it. The jokes, allusions, and analogies spinning off from the event made certain details inescapable. A kind of cultural saturation had occurred. Resistance was futile. The whole experience was irritating, even a little depressing, for it revealed the limits of personal autonomy in the face of an unrelenting media system, capable of imposing utterly meaningless crap on everybody’s attention, one way or another.

But perhaps that’s looking at things the wrong way. Consider the perspective offered by Orin Starn in The Passion of Tiger Woods: An Anthropologist Reports on Golf, Race, and Celebrity Scandal (Duke University Press). Starn, the chair of cultural anthropology at Duke, maintains that the events of two years back were not meaningless at all. If anything, they were supercharged with cultural significance.

The book's title alludes to the theatrical reenactments of Christ’s suffering performed at Easter during the Middle Ages, or at least to Mel Gibson’s big-screen rendition thereof. Starn interprets “Tigergate” as an early 21st-century version of the scapegoating rituals analyzed by René Girard. From what I recall of Girardian theory, the reconsolidation of social order involves the scapegoat being slaughtered, rather than paying alimony, though in some cases that may be too fine a distinction.

The scandal was certainly louder and more frenetic than the game that Woods seems to have been destined to master. The first image of him in the book shows him at the age of two, appearing on "The Mike Douglas Show" with his father. He is dressed in pint-sized golfing garb, with a little bag of clubs over his shoulder. As with a very young Michael Jackson, the performance of cuteness now reads as a bit creepy. Starn does not make the comparison, but it’s implicit, given the outcome. “This toddler was not to be one of those child prodigies who flames out under unbearable expectations,” Starn writes. “By his early thirties, he was a one-man multinational company…. Forbes magazine heralded Woods as the first athlete to earn $1 billion.”

Starn, who mentions that he is a golfer, is also a scholar of the game, which he says “has always traced the fault lines of conflict, hierarchy, and tension in America, among them the archetypal divides of race and class.” To judge by my friend Dave Zirin’s book A People’s History of Sports in the United States (The New Press) that’s true of almost any athletic pursuit, even bowling. But the salient point about Woods is that most of his career has been conducted as if no such fault lines existed. Starn presents some interesting and little-known information on how golf was integrated. But apart from his genius on the green, Woods’s “brand” has been defined by its promise of harmony: “He and his blonde-haired, blue-eyed wife, Elin Nordegren, seemed the poster couple for a shiny new postracial America with their two young children, two dogs, and the fabulous riches of Tiger’s golfing empire.”

Each of his parents had a multiracial background -- black, white, and Native American on his father’s side; Chinese, Thai, and Dutch on his mother’s. “Cablinasian,” the label Woods made up to name his blended identity, is tongue-in-cheek, but it also represents a very American tendency to mess with the established categories of racial identity by creating an ironic mask. (Ralph Ellison wrote about this in his essay “Change the Joke and Slip the Yoke.”)

But that mask flew off, so to speak, when his car hit the fire hydrant in late 2009. Starn fills out his chronicle of the scandal that followed with an examination of the conversation and vituperation that took place online, often in the comments sections of news articles -- with numerous representative samples, in all their epithet-spewing, semiliterate glory. The one-drop rule remains in full effect, it seems, even for Cablinasians.

“For all the ostensible variety of opinion,” Starn writes about the cyberchatter, “there was something limited and predictable about the complaints, stereotypes, and arguments and counterarguments, as if we were watching a movie we’d already seen many times before. Whether [coming from] the black woman aggrieved with Tiger about being with white women or the white man bitter about supposed black privilege, we already knew the lines, or at least most of them.… We are all players, like it or not, in a modern American kabuki theater of race, where our masks too often seem to be frozen into a limited set of expressions.”

Same as it ever was, then. But this is where the comparison to a scapegoating ritual falls apart. (Not that it’s developed very much in any case.) At least in Girard’s analysis, the ritual is an effort to channel and purge the conflicts within a society – reducing its tensions, restoring its sense of cohesion and unity, displacing the potential for violence by administering a homeopathic dose of it. Nothing like that can be said to have happened with Tigergate. It involved no catharsis. For that matter, it ran -- by Starn’s own account -- in exactly the opposite direction: the golfer himself symbolized harmony and success and the vision of historical violence transcended with all the sublime perfection of a hole-in-one. The furor of late 2009 negated all of that. The roar was so loud that it couldn’t be ignored, even if you plugged your ears and looked away.

The latest headlines indicate that Tiger Woods is going to play the Pebble Beach Pro-Am tournament next month, for the first time in a decade. Meanwhile, his ex-wife has purchased a mansion for $12 million and is going to tear it down. She is doing so because of termites, or so go the reports. Hard to tell what symbolic significance that may have. But under the circumstances, wiping out termites might not be her primary motivation for destroying something incredibly expensive.

Ryan Gosling pick-up line meme reaches academe


Satirical blogs explore whether a Hollywood sex symbol can make academic pick-up lines seem smooth.

Not So Foreign Languages

Citing demographic and pedagogic trends, a growing number of colleges rename their departments "world" or "modern" languages.

John Hughes's Lessons

"Bueller. Bueller. Bueller."

As an untenured professor I live in constant dread that my voice will (like Ben Stein's in "Ferris Bueller's Day Off") morph into an endless monotone that will meet an equally endless silence, and that things will get so desperate that only a choreographed rendition of “Twist and Shout” during a German American Day parade in Chicago will shake me and my students out of our stupor.

As the generational distance between me and my students grows (they’ve probably only seen these Gen-X-defining scenes on DVD or YouTube, if at all), it seems as if Bueller moments are unavoidable.

But for all of the examples of generational disconnect in the movies of the late director John Hughes -- particularly those produced when my junior colleagues and I came of age in the mid-1980s -- Hughes (who died this month) also offers cues for avoiding the Bueller Triangle where meaningful interaction among adults and youth simply vanishes. In this light, Hughes’s films are revelatory for educators.

For example, “Ferris Bueller’s Day Off” affirms the pedagogical strategies of effective teachers. Students want to take ownership of their learning. Like Ferris, they don’t want to be passive receptors of information but active creators of meaningful knowledge.

They don’t just want to study the historical, economic, political, psycho-sexual, and post-colonial contours of the red Ferrari. They want to drive it. We’ve got to enable them to go where their passions and curiosities lead them, and learn to teach them the significance of our “ologies” and “isms” from the passenger’s seat.

Living up to expectations is what landed the popular girl, the weirdo, the geek, the jock, and the rebel in Saturday detention in “The Breakfast Club.” Ironically, that detention provided a safe space for conversation without which these otherwise disparate characters would not have discovered the right blend of commonality and individuality needed to resist life-threatening pressures.

Professors who provide safe spaces in and outside of the classroom for discerning conversation successfully bridge the gap between our expectations of students and students’ expectations of us. Free of ridicule and judgment, students are liberated to ask themselves the eternal question on the road to adulthood: “Who do I want to become?” For further reading, see “She’s Having a Baby.”

“That’s why they call them crushes,” Samantha Baker’s dad explains in a rare Hughes moment of adult clarity and compassion in “Sixteen Candles.” “If they were easy they’d call them something else.” More than just re-telling a tale of teenage crushes, Hughes illuminates the struggle for authenticity when it comes to romance, dating and sex. What was glaringly absent in 1984 is also missing today, especially in the collegiate “hook up” culture. We need more open-minded adults willing to listen to students before pragmatically proposing a list of dos and don’ts.

And adults like Andy Walsh’s broken-hearted father, Jack, or her eclectic boss, Iona, in “Pretty in Pink,” who teach young people by demonstrating what learning looks like -- neither relating to them as peers nor hovering to try to protect them from life’s inevitable failures -- provide the materials students need to make their own prom gowns, a now classic metaphor for navigating the drama of adolescence.

How many times did Hughes depict the power of privilege and the misuse of teenage social capital? Millennials have to navigate social differences, many of which may be more divisive than they were 20 years ago in “Some Kind of Wonderful” because they are more subtle. While it is true that we “can’t tell a book by its cover,” to quote the protagonist Keith Nelson, relational power plays continue, to use Watts’s retort, to reveal “how much it’s gonna cost you.”

Taking responsibility for privilege so that we might use it wisely involves understanding and owning our particular contexts rather than simply rejecting them. In fact, Hughes’s films provide ample fodder for unpacking Peggy McIntosh’s “invisible knapsack of privilege,” given his preference for white suburbia and demeaning portrayals of ethnic minorities.

So if we don’t want to forget about Hughes we should not only reminisce about the way his characters spoke directly to our various adolescent selves. We might also remember how not to behave as adults when it comes to engaging our successors.

After all, we’re no longer writing the papers for Mr. Bender in detention. We’re grading them.


***

Maureen H. O’Connell is an assistant professor of theology at Fordham University and a participant in the 2009-10 pre-tenure teaching workshop at Wabash College's Wabash Center for Teaching and Learning in Theology and Religion.

Major Move Ahead?

Cal State Northridge could become the first college in the country to offer a Central American studies major.
