Cultural studies

Round-up on lobbying over American studies vote on Israel boycott

Round-up of statements, charges and counter-charges as American Studies Association prepares to finish vote on whether to back boycott of Israeli universities.

Review of Karine Nahon and Jeff Hemsley's "Going Viral" and Limor Shifman's "Memes in Digital Culture"

Intellectual Affairs

For some reason I have become aware that it is possible to take photographs of bass guitar players in mid-performance and, by digital means, to replace their instruments with dogs, so that it then appears the musicians (who very often wear facial expressions suggesting rapture or deep concentration) are tickling the dogs. Yes, yes it is.

I am not proud of this knowledge and did not seek it out, and would have forgotten about it almost immediately if not for something else occupying my attention in the past few days: a couple of new books treating the phenomenon with great and methodical seriousness. Not, of course, the dog-tickling bass player phenomenon as such, but rather, the kind of online artifact indicated by the titles of Karine Nahon and Jeff Hemsley’s Going Viral (Polity) and Limor Shifman’s Memes in Digital Culture (MIT Press).

The authors differentiate between the topics of the two volumes. Despite a common tendency to equate them, memes don’t always “go viral.” Things that do (say, video shot during a typhoon, uploaded while the disaster is still under way) are not always memes. The distinction will be clarified shortly -- and there is indeed some value in defining the contrast. It corresponds to different kinds of behavior or, if you prefer, different ways of mediating social and cultural life by means of our all-but-inescapable digital devices.

Still, the line can be drawn only so sharply. It seems bright and clear when the authors bring their different methods (one more quantitative than qualitative, the other the reverse) to the job. I don’t mean that the difference between viral and memetic communication is simply one of perspective. It seems to exist in real life. But so does their tendency to blur.

“Virality,” write Nahon and Hemsley in a definition unlikely to be improved upon, “is a social information flow process where many people simultaneously forward a specific information item, over a short period of time, within their social networks, and where the message spreads beyond their own (social) networks to different, often distant networks, resulting in a sharp acceleration in the number of people who are exposed to the message.” (Nahon is an associate professor, and Hemsley a Ph.D. candidate, at the Information School of the University of Washington.)

Here the term “information item” is used very broadly, to cover just about any packet of bytes: texts, photographs, video, sound files, etc. It also includes links taking you to such material. But unlike a computer virus -- an unwanted, often destructive packet of that sort -- a message that has “gone viral” doesn’t just forward itself. It propagates through numerous, dispersed, and repeated decisions to pay attention to something and then circulate it.

The process has a shape. Charting on a graph the number of times a message is forwarded over time, we find that the curve for a news item appearing at a site with a great deal of traffic (or a movie trailer advertised on a number of sites) shoots up at high speed, then falls just about as rapidly. The arc is rapid and smooth.

By contrast, the curve for an item going viral is a bit more drawn-out -- and a lot rougher. It may show little or no motion at first before starting to trend upwards (possibly with a plateau or downturn or two along the way) until reaching a certain point at which the acceleration becomes extremely sharp, heading to a peak, whereupon the number of forwards begins to fall off, more or less rapidly -- with an occasional bounce upwards perhaps, but nothing so dramatic as before.

So the prominently featured news item or blockbuster ad campaign on YouTube shoots straight up, like a model rocket on a windless day, until the fuel (newsworthiness, dollars) runs out, whereupon it stops, then begins to accelerate in the opposite direction. But when something goes viral, more vectors are involved. It circulates within and between clusters of people -- individuals with strong connections to one another. It circulates through the networks, formal or informal, in which those clusters are embedded.

And from there, onward and outward -- whether with a push (when somebody with a million Twitter followers takes notice), or a pull (it begins to rank among top search-engine results on a certain topic), or both. The authors itemize factors in play in decisions about whether or not to share something: salience, emotional response, congruence with the person’s values, etc. And their definition of virality as “a social information flow process” takes into account both the horizontal dimension of exchange (material circulating spontaneously among people familiar with one another) and the roles of filtering and broadcasting exercised by individuals and online venues with a lot of social capital.

None of which makes virality something that can be planned, however. “Content that we create can remain stubbornly obscure even when we apply our best efforts to promote it,” they write. “It can also grow and spread with an apparent life and momentum of its own, destroying some people’s lives and bringing fame and fortune to others, sometimes in a matter of days.”

An Internet meme, as Limor Shifman sums things up, is “(a) a group of digital items sharing common characteristics of content, form, and/or stance; (b) that were created with awareness of each other; and (c) were circulated, imitated, and/or transformed via the Internet by many users.”

Shifman (a senior lecturer in communication and journalism at Hebrew University of Jerusalem) calls it “the best concept to encapsulate some of the most fundamental aspects of the Internet in general,” and especially now, when the tools for creating and modifying digital content are so readily available. The sort of cartoon video exemplified by “Reading and Time: A dialectic between academic expectation and academic frustration” also comes to mind, as does the Pepper Spraying Cop meme, now something of a classic.

As with virality, the concept rests on a biological metaphor. Coined by Richard Dawkins in 1976, “meme” began in a quasi-scientific effort to identify the gene-like elements of behavior, cultural patterns, and belief systems that caused them to persist, expand, and reproduce themselves over very long periods of time. As reincarnated within cyberculture, the meme is a thing of slighter consequence: a matter of endless variation on extremely tenacious inside jokes, occupying and replicating within the brains of bored people in offices.

Shifman's point that memetic communication (which for the most part involves mimicry of existing digital artifacts with parodic intent and/or "remixing" them with new content) is an exemplary case of Web 2.0 culture seems to me sound, which probably also explains why much in the book may seem familiar even to someone not up on LOLcats studies. Yes, memes are a form of active participation in digital communication. Yes, they can carry content that (whether the meme goes viral or not) questions or challenges existing power structures. I have seen my share of Downfall parody videos, and am glad to know that Bruno Ganz is okay with the whole thing. But every so often that line from Thoreau comes to mind -- "as if we could kill time without injuring eternity" -- and it seems like a good idea to go off the grid for a while.

 


Essay on what's missing in discussion of the humanities

Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.

America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.

Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”

Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers -- specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies -- especially rigorous experimentation -- could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.

In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.

Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.

By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:

To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.

This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.

In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.

Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.


Essay on the real death of the humanities

In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.

But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.

Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.

I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.

Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.

Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani would present Afghans as misguided believers because — she claimed — they probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, it seemed that my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.

With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view toward religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value, but analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.

An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.

I was incredibly, indescribably proud of them.

Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and quietly but measuredly confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?

In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.

And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.

But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.

So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:

"My advice would be to leave it alone."

It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.

While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.

As my feeling about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged across campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed to meet the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.

Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.

After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.

Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.

Review of Michael Serazio, 'Your Ad Here: The Cool Sell of Guerrilla Marketing'

The most memorable thing about the 2002 science-fiction movie Minority Report was its depiction of advertising a few decades from now -- in particular the scene of Tom Cruise hurrying through a mall, besieged by holographic, interactive invitations to have a Guinness or use American Express, and asked how he liked the tank tops he’d purchased at the Gap. The virtual shills address him by name (the character’s name, that is) thanks to retinal scanners, which are as ubiquitous in the 2050s as surveillance cameras had become in the century’s first decade.

They are pop-up ads from hell, swarming like hungry ghosts to devour everyone’s attention. (The people Tom Cruise rushes past are presumably getting their own biometrically personalized shopping advice.) The scene feels uncomfortably plausible; it’s the experience of being on the Internet, extended into public space and rendered inescapable.

How effective the film is as social criticism probably depends on what you make of the fact that a quarter of its budget came from product placement. Minority Report’s critique of advertising turns out to be, in part, critique as advertising.

Now, I have some good news and some bad news. The good news is that people have become so resistant to hard-sell advertising (dodging TV commercials with their DVRs, ignoring or mocking how ad agencies target their desires or insecurities) that such ads have lost much of their influence. By the 2050s, our psychic calluses should be really thick.

The bad news concerns what is taking the place of the hard sell: a range of techniques discussed at some length in Your Ad Here: The Cool Sell of Guerrilla Marketing (New York University Press) by Michael Serazio, an assistant professor of communications at Fairfield University.

“Cool” advertising, as Serazio uses the expression, does not refer only to campaigns that make a product seem hip, hot, and happening -- so that you will be, too, by buying it. The adjective is instead a nod to Marshall McLuhan’s famous if altogether dubious distinction between “hot” media, such as film or print, and the “cool” sort, chiefly meaning television.

A hot medium, goes the theory, transmits its content in high resolution, so that the recipient easily absorbs it through a single sense. A cool medium, with its low resolution, demands greater involvement from the recipient in absorbing the message. Someone reading Aristotle or watching "Citizen Kane" is more or less passively taking in what the hot medium bombards the eye with, while the “Gilligan’s Island” audience finds its senses quickened (auditory and tactile in particular, according to McLuhan) by a need to compensate for the cool medium’s low level of visual stimulation.

That makes as much sense as any of the sage of Toronto’s other ideas, which is to say not a hell of a lot. Nonetheless, Serazio gets as much value out of the distinction as seems humanly possible by adapting it to the contrast between the old-school “hot” ad campaign -- with its clear, strong message that you should buy Acme brand whatchamacallits, and here’s why -- and a variety of newer, “cooler” approaches that are more seductive, self-effacing, or canny about dealing with widespread cynicism about corporate hype.

A cool ad campaign, when successful, does not simply persuade people to buy something but creates a kind of spontaneous, intimate involvement with the campaign itself. The consumer’s agency is always stressed. ("Agency" in the sense of capacity to act, rather than where "Mad Men" do their business.) The Doritos "Fight for the Flavor" campaign of the mid-’00s empowered the chip-gobbling public to determine which of two new flavors, Smokin' Cheddar BBQ or Wild White Nacho, would remain on the shelves and which would be pulled. Bloggers and tweeters are encouraged to express their authentic, unscripted enthusiasm. “Buzz agents” are given free samples of a product, chat it up with their friends, then report back how the discussions went. (With word-of-mouth campaigns, the most important thing is authenticity. Fake that and you’ve got it made.)

And at perhaps its most sophisticated level, cool advertising will cultivate the (potential) consumer’s involvement almost as an end in itself -- for example, by providing an opportunity to control the behavior of a man in a chicken suit known as Subservient Chicken.

Let us return to the horrible fascination of Subservient Chicken in due course. But first, theory.

Foucault plus Gramsci equals about a third of the stuff published in cultural studies -- of which “critical media industry studies,” the subspecialty into which Serazio’s book falls, is a part. The conceptual work in Your Ad Here is done with Foucault’s line of power tools, in particular his considerations on governance, while Gramsci seems to be along mostly to keep him company.

Advertising as governance sounds counterintuitive, given the connotations of state power the term carries, but in Foucault’s work “government” refers to processes of guidance and control that may be more or less distant from the state’s institutions. The teacher governs a class (or tries to), and a boss governs the workplace.

Overall, “management” seems like a more suitable term for most non-state modes of governance, and it has the advantage of foregrounding what Serazio wants to stress: Foucault’s point is that governance doesn’t mean giving orders and enforcing obedience but rather “structuring the possible field of action of others” in order “to arrange things in such a way that, through a certain number of means, such-and-such ends may be achieved.”

Governance (management) in this sense is a kind of effective persuasion of the governed party (the student, the fry cook, etc.) to exercise his or her agency to perform the necessary functions of the institution (school, fast-food place) without being subjected to constant external pressure. Insofar as governance is an art or a science, it works by recognizing and anticipating resistance, and by preventing or containing disruption. (Some remarks by Gramsci on hegemony and resistance also apply here, but really just barely.)

“Cool sell” advertising counts as governance, in Serazio’s book, because it tries to neutralize public fatigue from advertisement overload -- so that we’re still incited to spend money and think well of a brand. That’s the common denominator of viral marketing, crowdsourced publicity campaigns, plebiscites on snack-food availability, and so on.

It occasionally sounds like Serazio is criticizing these methods as manipulative, but I suspect that’s actually high praise, like when one horror fan tells another that a torture scene in "Hostel" gave him nightmares.

Which brings us back, as promised, to Subservient Chicken, whose role in promoting the Burger King menu remains oblique at best. But he undeniably garnered an enormous amount of attention -- 20 million distinct viewers generating half a billion hits. “By filming hundreds of video clips of a man in a chicken suit,” the author says, “and writing code for a database of terms that would respond to keyword commands for the Chicken to perform those videotaped actions, [the advertising agency] concocted something that was, in its own words, ‘so creepy, weird and well-executed that many people who visited… thought they were actually controlling this person in a chicken suit in real life.’ ” I can’t help feeling this calls for more extensive Foucauldian analysis, but I won’t be sticking around to see how that turns out.

 


Review of 'Mad Men, Mad World: Sex, Politics, Style & the 1960s'

"Mad Men" returns to cable television this coming Sunday, continuing its saga of mutable identities and creative branding at a New York advertising firm during the 1960s. Or at least one assumes it will still be set in the ‘60s. How much narrative time lapses between seasons varies unpredictably. Like everything else about the show, it remains the network’s closely guarded secret. Critics given an early look at the program must agree to an embargo on anything they publish about it. This makes perfect sense in the context of the social world of "Mad Men" itself: the network is, after all, selling the audience’s curiosity to advertisers.

A different economy of attention operates in Mad Men, Mad World: Sex, Politics, Style & the 1960s, a collection of 18 essays on the program just published by Duke University Press. That is not because the editors and contributors, all being academics, are presumably a different sort of cultural consumer from the average viewer. On the contrary, I think that assumption is exactly wrong. Serialized narrative has to generate in its audience the desire for an answer to a single, crucial question: “And then what happens?” (Think of all the readers gathered at the docks in New York to get the latest installment of a Dickens novel coming from London.)

Of course, the contributors to Mad Men, Mad World write with a host of more complex questions in mind, but I don’t doubt for a second that many of the papers were initially inspired by weekend-long diegetic binge sessions, fueled by the same desire driving other viewers. At the same time, there’s every reason to think that the wider public is just as interested in the complex questions raised by the show as any of the professors writing about it. For they are questions about race, class, gender, sexuality, politics, money, happiness, misery, and lifestyle -- and about how much any configuration of these things can change, or fail to change, over time.

Many of the essays serve as replies to a backlash against "Mad Men" that began in the third or fourth season, circa 2009, as it was beginning to draw a much larger audience than it had until that point. The complaint was that the show, despite its fanatical attention to the style, dress, and décor of the period, was simple-mindedly 21st century in its attitude toward the characters. It showed a world in which blunt expressions of racism, misogyny, and homophobia were normal, and sexual harassment in the workplace was an executive perk. Men wore hats and women stayed home.  Everyone smoked like a chimney and drank like a fish, often at the same time. Child abuse was casual. So was littering.

And because all of it was presented in tones by turns ironic and horrified, viewers were implicitly invited to congratulate themselves on how enlightened they were now. Another criticism held that "Mad Men" only seemed to criticize the oppressive arrangements it portrayed, while in reality allowing the viewer to enjoy them vicariously. These complaints sound contradictory: the show either moralistically condemns its characters or inspires the audience to wallow in political incorrectness. But they aren’t mutually exclusive by any means. What E.P. Thompson called “the enormous condescension of posterity” tends to be a default setting with Americans, alternating with periods of maudlin nostalgia. There’s no reason the audience couldn’t feel both about the "Mad Men" vision of the past.

See also a comment by the late Christopher Lasch, some 20 years ago: “Nostalgia is superficially loving in its re-creation of the past, but it invokes the past only to bury it alive. It shares with the belief in progress, to which it is only superficially opposed, an eagerness to proclaim the death of the past and to deny history’s hold on the present.”

At the risk of conflating too many arguments under too narrow a heading, I’d say that the contributors to Mad Men, Mad World agree with Lasch’s assessment of progress and nostalgia while also demonstrating how little it applies to the program as a whole.

Caroline Levine’s essay “The Shock of the Banal: Mad Men's Progressive Realism” provides an especially apt description of how the show works to create a distinct relationship between past and present that’s neither simply nostalgic nor a celebration of how far we’ve come. The dynamic of "Mad Men" is, in her terms, “the play of familiarity in strangeness” that comes from seeing “our everyday assumptions just far enough removed from us to feel distant.” (Levine is a professor of English at the University of Wisconsin at Madison.)

The infamous Draper family picnic in season two is a case in point. After a pleasant afternoon with the kids in a bucolic setting, the parents pack up their gear, shake all the garbage off their picnic blanket, and drive off. The scene is funny, in the way appalling behavior can sometimes be, but it’s also disturbing. The actions are so natural and careless -- so thoughtless, all across the board -- that you recognize them immediately as habit. Today’s viewers might congratulate themselves for at least feeling guilty when they litter. But that’s not the only possible response, because the scene creates an uneasy awareness that once-familiar, “normal” ideas and actions came to be completely unacceptable -- within, in fact, a relatively short time. The famous “Keep America Beautiful” ad from about 1970 -- the one with the crying Indian -- eventually became the butt of jokes, but it probably had a lot to do with that change in attitude. (Such is the power of advertising.)

The show's handling of race and gender can be intriguing and frustrating. All the powerful people in it are straight white guys in ties, sublimely oblivious to even the possibility that their word might not be law. "Mad Space" by Dianne Harris, a professor of architecture and art history at the University of Illinois at Urbana-Champaign, offers a useful cognitive map of the show's world -- highlighting how the advertising firm's offices are organized to demonstrate and reinforce the power of the executives over access to the female employees' labor (and, often enough, bodies), while the staid home that Don Draper and his family occupy in the suburbs is tightly linked to the upper-middle-class WASP identity he is trying to create for himself by concealing and obliterating his rural, "white trash" origins. A handful of African-American characters appear on the margins of various storylines -- and one, the Drapers' housekeeper Carla, occupies the especially complex and fraught position best summed up in the phrase "almost part of the family." But we never see the private lives of any nonwhite character.

In "Representing the Mad Margins of the Early 1960s: Northern Civil Rights and the Blues Idiom," Clarence Lang, an associate professor of African and African-American studies at the University of Kansas, writes that "Mad Men" "indulges in a selective forgetfulness" by "presuming a black Northern quietude that did not exist" (in contrast to the show's occasional references to the civil rights movement below the Mason-Dixon line). Lang's judgment here is valid -- up to a point. As it happens, all of the essays in the collection were written before the start of the fifth season, in which black activists demonstrate outside the firm's building to protest the lack of job opportunities. Sterling Cooper Draper Pryce hires its first African-American employee, a secretary named Dawn. I think a compelling reading of "Mad Men" would recognize that the pace and extent of the appearance of nonwhite characters on screen is a matter not of the creators' refusal to portray them, but of their slow arrival on the scene of an incredibly exclusionary social world being transformed (gradually and never thoroughly) by the times in which "Mad Men" is set.   

There is much else in the book that I found interesting and useful in thinking about "Mad Men," and I think it will be stimulating to readers outside the ranks of aca fandom. I’ll return to it in a few weeks, with an eye to connecting some of the essays to new developments at Sterling Cooper Draper Pryce. (Presumably the firm will have changed its name in the new season, given the tragic aftermath of Lane Pryce’s venture in creative bookkeeping.)

When things left off, it was the summer of 1967. I have no better idea than anyone else when or how the narrative will pick up, but I really hope that Don Draper creates the ad campaign for Richard Nixon.

 


NHA speakers implore humanities scholars to fight for their fields

Advocates for the humanities search for the arguments to win federal support, and to stop having their disciplines treated "like a piñata."

Putting the black studies debate into perspective (essay)

Intellectual Affairs

For a week now, friends have been sending me links from a heated exchange over the status and value of black studies. It started among bloggers, then spilled over into Twitter, which always makes things better. I'm not going to rehash the debate, which, after all, is always the same. As with any other field, black studies (or African-American studies, or, in the most cosmopolitan variant, Africana studies) could only benefit from serious, tough-minded, and ruthlessly intelligent critique. I would be glad to live to see that happen.

But maybe the rancor will create some new readers for a book published five years ago, From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline (Johns Hopkins University Press) by Fabio Rojas, an associate professor of sociology at Indiana University. Someone glancing at the cover in a bookstore might take the subtitle to mean it's another one of those denunciations of academia as a vast liberal-fascist indoctrination camp for recruits to the New World Order Gestapo. I don't know whether that was the sales department's idea; if so, it was worth a shot. Anyway, there the resemblance ends. Rojas wrote an intelligent, informed treatment of black studies, looking at it through the lens of sociological analysis of organizational development, and with luck the anti-black-studies diatribalists will read it by mistake and accidentally learn something about the field they are so keen to destroy. (Spell-check insists that “diatribalists” is not a word, but it ought to be.)

Black studies was undeniably a product of radical activism in the late 1960s and early ‘70s. Administrators established courses only as a concession to student protesters who had a strongly politicized notion of the field’s purpose. “From 1969 to 1974,” Rojas writes, “approximately 120 degree programs were created,” along with “dozens of other black studies units, such as research centers and nondegree programs,” plus professional organizations and journals devoted to the field.

But to regard black studies as a matter of academe becoming politicized (as though the earlier state of comprehensive neglect wasn’t politicized) misses the other side of the process: “The growth of black studies,” Rojas suggests, “can be fruitfully viewed as a bureaucratic response to a social movement.” By the late 1970s, the African-American sociologist St. Clair Drake (co-author of Black Metropolis, a classic study of Chicago to which Richard Wright contributed an introduction) was writing that black studies had become institutionalized “in the sense that it had moved from the conflict phase into adjustment to the existing educational system, with some of its values accepted by that system…. A trade-off was involved. Black studies became depoliticized and deradicalized.”

That, too, is something of an overstatement -- but it is far closer to the truth than denunciations of black-studies programs, which treat them as politically volatile, yet also as well-entrenched bastions of power and privilege. As of 2007, only about 9 percent of four-year colleges and universities had a black studies unit, few of them with a graduate program. Rojas estimates that “the average black studies program employs only seven professors, many of whom are courtesy or joint appointments with limited involvement in the program” -- while in some cases a program is run by “a single professor who organizes cross-listed courses taught by professors with appointments in other departments.”

The field “has extremely porous boundaries,” with scholars who have been trained in fields “from history to religious studies to food science.” Rojas found from a survey that 88 percent of black studies instructors had doctoral degrees. Those who didn’t “are often writers, artists, and musicians who have secured a position teaching their art within a department of black studies.”

As for faculty working primarily or exclusively in black studies, Rojas writes that “the entire population of tenured and tenure-track black studies professors -- 855 individuals -- is smaller than the full-time faculty of my own institution.” In short, black studies is both a small part of higher education in the United States and a field connected by countless threads to other forms of scholarship. The impetus for its creation came from African-American social and political movements. But its continued existence and development has meant adaptation to, and hybridization with, modes of enquiry from long-established disciplines.

Such interdisciplinary research and teaching is necessary and justified because (what I am about to say will be very bold and very controversial, and you may wish to sit down before reading further) it is impossible to understand American life, or modernity itself, without a deep engagement with African-American history, music, literature, institutions, folklore, political movements, etc.

In a nice bit of paradox, that is why C.L.R. James was so dubious about black studies when it began in the 1960s. As author of The Black Jacobins and A History of Negro Revolt, among other classic works, he was one of the figures whom students wanted appointed as a visiting professor when they demanded black studies courses. But when he accepted, it was only with ambivalence. "I do not believe that there is any such thing as black studies," he told an audience in 1969. "...I only know, the struggle of people against tyranny and oppression in a certain social setting, and, particularly, the last two hundred years. It's impossible for me to separate black studies and white studies in any theoretical point of view."

Clearly James's perspective has nothing in common with the usual denunciations of the field. The notion that black studies is just some kind of reverse-racist victimology, rigged up to provide employment for "kill whitey" demagogues, is the product of malice. But it also expresses a certain banality of mind -- not an inability to learn, but a refusal to do so. For some people, pride in knowing nothing about a subject will always suffice as proof that it must be worthless.

Review of Orin Starn, "The Passion of Tiger Woods"

Intellectual Affairs

On the Friday following Thanksgiving in 2009, Tiger Woods had an automobile accident. For someone who does not follow golf, the headlines that ran that weekend provided exactly as much information as it seemed necessary to have. Over the following week, I noticed a few more headlines, but they made no impression. Some part of the brain is charged with the task of filtering the torrent of signals that bombard it from the media every day. And it did its job with reasonable efficiency, at least for a while.

Some sort of frenzy was underway. It became impossible to tune this out entirely. I began to ignore it in a more deliberate way. (All due respect to the man for his talent and accomplishments, but the doings of Tiger Woods were exactly as interesting to me as mine would be to him.) There should be a word for the effort to avoid giving any attention to some kerfuffle underway in the media environment. “Fortified indifference,” perhaps. It’s like gritting your teeth, except with neurons.

But the important thing about my struggle in 2009 is that it failed. Within six weeks of the accident, I had a rough sense of the whole drama in spite of having never read a single article on the scandal, nor watched nor listened to any news broadcasts about it. The jokes, allusions, and analogies spinning off from the event made certain details inescapable. A kind of cultural saturation had occurred. Resistance was futile. The whole experience was irritating, even a little depressing, for it revealed the limits of personal autonomy in the face of an unrelenting media system, capable of imposing utterly meaningless crap on everybody’s attention, one way or another.

But perhaps that’s looking at things the wrong way. Consider the perspective offered by Orin Starn in The Passion of Tiger Woods: An Anthropologist Reports on Golf, Race, and Celebrity Scandal (Duke University Press). Starn, the chair of cultural anthropology at Duke, maintains that the events of two years back were not meaningless at all. If anything, they were supercharged with cultural significance.

The book's title alludes to the theatrical reenactments of Christ’s suffering performed at Easter during the Middle Ages, or at least to Mel Gibson’s big-screen rendition thereof. Starn interprets “Tigergate” as an early 21st-century version of the scapegoating rituals analyzed by René Girard. From what I recall of Girardian theory, the reconsolidation of social order involves the scapegoat being slaughtered, rather than paying alimony, though in some cases that may be too fine a distinction.

The scandal was certainly louder and more frenetic than the game that Woods seems to have been destined to master. The first image of him in the book shows him at the age of two, appearing on "The Mike Douglas Show" with his father. He is dressed in pint-sized golfing garb, with a little bag of clubs over his shoulder. As with a very young Michael Jackson, the performance of cuteness now reads as a bit creepy. Starn does not make the comparison, but it’s implicit, given the outcome. “This toddler was not to be one of those child prodigies who flames out under unbearable expectations,” Starn writes. “By his early thirties, he was a one-man multinational company…. Forbes magazine heralded Woods as the first athlete to earn $1 billion.”

Starn, who mentions that he is a golfer, is also a scholar of the game, which he says “has always traced the fault lines of conflict, hierarchy, and tension in America, among them the archetypal divides of race and class.” To judge by my friend Dave Zirin’s book A People’s History of Sports in the United States (The New Press), that’s true of almost any athletic pursuit, even bowling. But the salient point about Woods is that most of his career has been conducted as if no such fault lines existed. Starn presents some interesting and little-known information on how golf was integrated. But apart from his genius on the green, Woods’s “brand” has been defined by its promise of harmony: “He and his blonde-haired, blue-eyed wife, Elin Nordegren, seemed the poster couple for a shiny new postracial America with their two young children, two dogs, and the fabulous riches of Tiger’s golfing empire.”

Each of his parents had a multiracial background -- black, white, and Native American on his father’s side; Chinese, Thai, and Dutch on his mother’s. “Cablinasian,” the label Woods made up to name his blended identity, is tongue-in-cheek, but it also represents a very American tendency to mess with the established categories of racial identity by creating an ironic mask. (Ralph Ellison wrote about it in his essay “Change the Joke and Slip the Yoke.”)

But that mask flew off, so to speak, when his car hit the fire hydrant in late 2009. Starn fills out his chronicle of the scandal that followed with an examination of the conversation and vituperation that took place online, often in the comments sections of news articles -- with numerous representative samples, in all their epithet-spewing, semiliterate glory. The one-drop rule remains in full effect, it seems, even for Cablinasians.

“For all the ostensible variety of opinion,” Starn writes about the cyberchatter, “there was something limited and predictable about the complaints, stereotypes, and arguments and counterarguments, as if we were watching a movie we’d already seen many times before. Whether [coming from] the black woman aggrieved with Tiger about being with white women or the white man bitter about supposed black privilege, we already knew the lines, or at least most of them.… We are all players, like it or not, in a modern American kabuki theater of race, where our masks too often seem to be frozen into a limited set of expressions.”

Same as it ever was, then. But this is where the comparison to a scapegoating ritual falls apart. (Not that it’s developed very much in any case.) At least in Girard’s analysis, the ritual is an effort to channel and purge the conflicts within a society -- reducing its tensions, restoring its sense of cohesion and unity, displacing the potential for violence by administering a homeopathic dose of it. Nothing like that can be said to have happened with Tigergate. It involved no catharsis. For that matter, it ran -- by Starn’s own account -- in exactly the opposite direction: the golfer himself symbolized harmony and success and the vision of historical violence transcended with all the sublime perfection of a hole-in-one. The furor of late 2009 negated all of that. The roar was so loud that it couldn’t be ignored, even if you plugged your ears and looked away.

The latest headlines indicate that Tiger Woods is going to play the Pebble Beach Pro-Am tournament next month, for the first time in a decade. Meanwhile, his ex-wife has purchased a mansion for $12 million and is going to tear it down. She is doing so because of termites, or so go the reports. Hard to tell what symbolic significance that may have. But under the circumstances, wiping out termites might not be her primary motivation for destroying something incredibly expensive.

Ryan Gosling pick-up line meme reaches academe

Satirical blogs explore whether a Hollywood sex symbol can make academic pick-up lines seem smooth.

