During a late and tense scene in Hitchcock (2012) -- the biopic with Anthony Hopkins in the title role, centering on the troubled making of Psycho -- we see the director’s agent suggest one way to avert the disaster of being stuck with a film that neither the studio nor the censor will approve: edit it to run as a two-part episode of "Alfred Hitchcock Presents," his successful and lucrative television series.
The director brushes off the proposal irritably, and that’s that. I find no reference to the incident in Stephen Rebello’s comprehensive Alfred Hitchcock and the Making of 'Psycho', so it is likely to be a screenwriter’s liberty for dramatic effect. The very idea of butchering one of the director’s most carefully constructed works for TV is as horrifying as any of his own stories involving mutilation or cannibalism.
But the cultural critic Dwight Macdonald, writing for Esquire in 1960, considered the film and the program all too similar: “Psycho is merely one of those television shows padded out to two hours by adding pointless subplots and realistic detail…. All in all, a nasty little film.” That judgment seems to have been typical of Hitchcock’s critical reputation at the time, at least in the United States. His talent, while indisputable, was in decline, and if his silly TV spots were not the cause, they certainly weren’t helping.
No such invidious comparisons occurred to the critics and filmmakers in France around the journal Cahiers du Cinema, which regarded Hitchcock as the consummate film artist, with Psycho as one more masterpiece. Questions of taste can never be settled definitively, but the opinions of cineastes and ordinary viewers alike have tended to skew overwhelmingly Cahiers-ward, with "Alfred Hitchcock Presents" now seeming about as relevant to Psycho’s status in film history as the bear-baiting pit at the Globe Theatre does to interpreting Hamlet.
So there’s something offbeat about Jan Olsson’s Hitchcock à la Carte (Duke University Press), a study that disregards not just the differences between film and video but also those between the director’s creative work and his public persona. Olsson, a professor of cinema studies at Stockholm University, insists on examining Hitchcock’s body of work through -- or at least around -- his body proper.
What swims, whalelike, into view in Olsson’s study is Hitchcock’s massive cultural presence. And yes, those were fat jokes. Stupid ones, too, but par for the course, given that critic-speak abstractions regarding “the body” here give way to considerations of a real body that went from more than 300 pounds to under 200 in a single year, before bouncing back up and plunging back down, repeatedly. However extraneous the director’s girth may seem to his art, Olsson treats the two as combining in the public eye to establish the composite phenomenon we know as Hitchcock.
Many viewers become aware of his body and his corpus at almost the same time, by keeping an eye out for the walk-on parts in his films, where the director appears as a figure glimpsed off to the side or in the background. His appearances can be taken as both an inside joke and a personal signature, but they also reinforce a tendency going back at least to 1937, when he arrived in New York for a “gastronomic holiday” while en route to meetings in Hollywood. Embarked on a new phase in his career, Hitchcock turned his heft into a kind of social capital, something to joke about. Eating became a major part of his self-branding, as it’s put nowadays.
Hitch (the nickname was part of the brand) gave interviews to reporters while eating a steak or two, followed by ice cream. He blunted the barbs about his weight by joking about it himself: “I’m not really a heavy eater, unless you mean that I’m heavy, and I eat.” Flaunting his gluttony in the face of American puritanism, he also served up quips about the entertainment value of murder. Audiences learned to connect the mordant tone of his films to a personality that was, in all respects, bigger than life.
By the time Hitchcock made the transition to television in the 1950s, his persona was well established and, quite literally, scripted, with the comedy writer James B. Allardice turning out scores of skits and monologues in which Hitch poked fun at the sponsors while introducing the week’s episode. The gags often turned on his girth or his appetite, while the stories themselves often incorporated food or meals as a macabre plot point: a dining club whose delicacies include meat dishes prepared from recently murdered members, for example, or a frozen leg of lamb used as a murder weapon, then cooked and served to the policemen investigating the crime.
That blend of morbidity and sly humor is a large part of what we mean in calling something “Hitchcockian,” although Olsson also regards it as Bakhtinian: the films and escapades being examples of the “carnivalized discourse” that Mikhail Bakhtin analyzed in his study of Rabelais.
Olsson has turned up an extraordinary array of photographs, interviews and publicity events that ran parallel to Hitchcock's output as a director, including work in a long-forgotten genre called “photocrime,” in which a crime story was told using a series of staged photographs. (Created in England and popularized in the United States by Life magazine, it seems to have been a prototype of the Latin-American fotonovela.) He takes all of this highly miscellaneous material as instances of paratext -- the layers of material surrounding the author’s (in this case, director’s) work, through which the reader/viewer passes in gaining access.
The echoes and cross-references between paratext and the films are interesting if, in many cases, likely to be broadly familiar to the Hitchcock viewer. The director’s name, Olsson writes, “just like ‘Salvador Dali’ and ‘Andy Warhol,’ represents intangibles beyond the oeuvre; it is a convoluted bricolage of art, commerce, marketing and celebrity indicative of 20th-century media culture at large.”
The author’s approach more or less precludes judgments of quality or a devotee’s attention to the particulars of artistry. And that’s okay -- the book is eye-opening on its own terms. But there’s a reason why some of us will watch Psycho a hundred times before we die, and I suspect it has less to do with Hitchcock’s body, as such, than with one part of it: his eye.
Ask anyone professing the humanities today and you come to understand that a medieval dimness looms. If this is the end-times for the ice sheets at our poles — and it is — many of us also understand that the melt can be found closer to home, in the elimination of language and classics departments, for instance, and in the philistinism represented by governors such as Rick Scott of Florida and Pat McCrory of North Carolina, who apparently see in the humanities a waste of time and taxpayer subsidies. In the name of efficiency and job creation, according to their logic, taxpayers can no longer afford to support bleary-eyed poets, Latin history radicals, and brie-nibbling Francophiles.
That there is a general and widespread acceptance in the United States that what is good for corporate America is good for the country is perhaps inarguable, and this is why men like Governors Scott and McCrory are dangerous. They merely invoke a longstanding and not-so-ugly stereotype: the pointy-headed humanist whose work, if you can call it that, is irrelevant. Among the many easy targets, English departments and their ilk are convenient and mostly defenseless. Few will rise to rush the barricades with us, least of all the hard-headed realists who understand the difficulties of running a business, which is what the university is, anyway.
I wish, therefore, to propose a solution that will save money, save the humanities, and perhaps make the world a better place: Close the business schools.
The Market Argument
We are told that something called “the market” is responsible for the great disparities in pay between humanities professors and business professors. To a humanist, however, this market is the great mystifier; we find no evidence of an “invisible hand” that magically allocates resources within the university. The market argument for pay differentials between business professors and historians (average pay in 2014 for full professors at all institutions: $123,233 and $86,636, respectively, a difference of almost 30 percent; average at research institutions is $160,705 and $102,981, a difference of 36 percent), for instance, fails to convince that a market is operating. This is because administrators and trustees who set salaries based upon what the market can bear, or what it calls for, or what it demands, are actually subsidizing those of us who are manifestly out of the market.
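The percentage gaps quoted above are easy to verify; as a quick sketch (my calculation, computing each gap as a share of the business salary, which is how the essay's figures work out):

```python
# Check of the salary gaps cited above (2014 averages for full professors:
# business vs. history, at all institutions and at research institutions).

def gap_percent(business: int, humanities: int) -> int:
    """Gap as a share of the business salary, rounded to a whole percent."""
    return round(100 * (business - humanities) / business)

print(gap_percent(123_233, 86_636))   # 30 -- "almost 30 percent"
print(gap_percent(160_705, 102_981))  # 36
```

Read the other way, as a premium over the historian's salary, the same figures come to roughly 42 and 56 percent more for the business professor, which makes the later mention of "20, 30, or 40 percent more" a conservative range.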
Your average finance professor, for instance, is not a part of this market; indeed, she is a member of the artificial market created by colleges and universities themselves, the same institutions that tout the importance of critical thinking and of creating the well-rounded individual whose liberal arts study will ostensibly make her into a productive member of our democracy. But the administrators who buy the argument that the market allocates upward of 20, 30, or 40 percent more for the business professor than it does for her colleague in the humanities have failed to be the example they tout: they are not thinking.
The higher education market for business professors and legal scholars, for instance, is one in which the professor is paid as if she took her services and sold them on what is commonly called the market. Which is where she, and her talents, manifestly are not. She is here, in the building next to ours, teaching our students and doing the same work we are. If my daughter cuts our lawn, she does not get paid as if she were cutting the neighbor’s lawn.
The business professor has sacrificed the blandishments of the other market for those of the university, where she can work softer hours, have her December/January vacation, go to London during the summer on a fellowship or university grant, and generally live something approaching the good life — which is what being employed by a college or university allows the lucky who earn tenure. She avoids the other market — eschews the long hours in the office, the demands of travel, the oppressive corporate state — so that she can pick up her kids from school on occasion, sleep in on a Saturday, and turn off her smartphone. She may be part of a machine, but it is a university machine, and as machines go she could do worse. This “market” is better than the other one.
But does she bring more value to the university? Does she generate more student hours? These are questions that administrators and business professors do not ask. Why? Because they wouldn’t like the answers. They would find that she is an expensive acquisition. Unless she is one of the Wharton superstars and appears on CNN Money and is quoted in The Wall Street Journal, there’s a good chance that the university isn’t getting its money’s worth.
The Moral Argument
There is another argument for wishing our business professor adieu. She is ostensibly training the next crop of financiers and M.B.A.s whose machinations have arguably had no salutary effects on this democracy. I understand that I am casting a wide net here, grouping the good with the bad, blaming the recent implosion of the world economy on business schools. One could, perhaps, lay equal blame on the mathematicians and quantitative analysts who created the derivative algorithms and mortgage packages that even the M.B.A.s themselves don’t understand, though there’s a good chance that business school graduates hired these alpha number crunchers.
Our investment bankers and their ilk will have to take the fall because, well, they should have known better. If only because, at bottom, they are responsible — with their easy cash and credit, their drive-through mortgages, and, worst of all, their betting against the very system they knew was hopelessly constructed. And they were trained at our universities, many of them, probably at our best universities, the Harvards and Princetons and Dartmouths, where — it is increasingly apparent — the brightest students go to learn how to destroy the world.
I am not arguing that students shouldn’t take classes in accounting, marketing, and economics. An understanding of these subjects holds value. They are honorable subjects often horribly applied. In the wrong hands they become tools less of enlightenment and liberation than ruthless self-interest. And when you have groups of like-minded economic pirates banding together in the name of self-interest, they form a corporation, that is, a person. That person, it is now apparent, cannot be relied upon to do the right thing; that person cannot be held accountable.
It’s not as if this is news. Over 150 years ago, Charles Dickens saw this problem, and he wrote A Christmas Carol to address it. The hero of Dickens’s novella is Jacob Marley, who returns from the grave to warn his tightfisted partner Ebenezer Scrooge that he might want to change his ways. When Scrooge tells Marley that he was always a “good man of business,” Marley brings down the thunder: “Mankind was my business. The common welfare was my business; charity, mercy, forbearance, and benevolence, were, all, my business. The dealings of my trade were but a drop of water in the comprehensive ocean of my business!”
In closing the business schools, may the former professors of finance bring to the market a more human side (or, apropos of Dickens, a more ghostly side). Whether or not they do, though, closing the business schools is a necessary first step in righting the social and economic injustices perpetuated not by capitalism but by those who have used it to rend the very social fabric that nourishes them. By planting the seeds of corporate and financial tyranny, our business schools, operating as so many of them do in collusion with a too-big-to-fail mentality, have become the enemy of democracy. They must be closed, since, as Jacob Marley reminds us, we all live in the business world.
II. Save the Humanities
Closing the business schools will allow us to turn our attention more fully to the state of the humanities and their apparent demise. The 2013 report released by the American Academy of Arts and Sciences asserts that “the humanities and social sciences are not merely elective, nor are they elite or elitist. They go beyond the immediate and instrumental to help us understand the past and the future.” As if that’s going to sell.
In the wake of the academy’s report, The New York Times dutifully ran three columns on the humanities — by David Brooks, Verlyn Klinkenborg, and Stanley Fish — which dove into the wreck and surveyed the damage in fairly predictable ways (excepting Fish, whose unpredictability is predictable). Brooks remembers when they used to teach Seneca and Catullus, and Klinkenborg looks back on the good old days when everyone treasured literature and literary study. Those days are gone, he argues, because “the humanities often do a bad job of teaching the humanities,” and because “writing well used to be a fundamental principle of the humanities,” though it apparently is not anymore. Why writing well isn’t a fundamental principle of life is perhaps a better question.
We might therefore ask: Aside from the typical obeisance to something called “critical thinking,” what are the humanities supposed to do?
I propose that one of the beauties of the liberal arts degree is that it is meant to do nothing. I would like to think, therefore, that the typical humanities major reads because she is interested in knowledge for purposes outside of the pervasive instrumentalism now fouling higher education. She does not read philosophy because she wants, necessarily, to become a philosopher; she does not read poetry to become a poet, though she may dream of it; she does not study art history, usually, to become an art historian, though she may one day take this road.
She may be in the minority, but she studies these subjects because of the pleasure they give her. Reading literature, or studying philosophy, or viewing art, or watching films — and thinking about them — are pleasurable things. What a delight to subsidize something that gives her immediate and future joy instead of spending capital on a course of study that might someday allow her to make more money so that she can do the things she wants to do at some distant time. Henry David Thoreau said it best: “This spending of the best part of one's life earning money in order to enjoy a questionable liberty during the least valuable part of it reminds me of the Englishman who went to India to make a fortune first, in order that he might return to England and live the life of a poet. He should have gone up garret at once.” If you want to be a poet, be done with it.
Does she suffer for this pleasure?
It is an unfortunate fact of our political and cultural economy that she probably does. Her parents wonder helplessly what she is up to and they threaten to cut off her tuition unless she comes to her senses. The governor and legislature of her state tell her that she is wasting her time and that she is unemployable. She goes to her advisers, who, if they are in the humanities, tell her that the companies her parents revere love to hire our kind, that we know how to think critically and write clearly and solve problems.
And it isn’t that they are lying, exactly (except to themselves). They simply aren’t telling her the whole truth: that she will almost surely never have the kind of financial success that her peers in business or engineering or medicine will have; that she will have enormous regrets barely ameliorated by the thought that she carries the fire; that the digital humanities will not save her, either, though they may help make her life slightly more interesting.
It is with this problem in mind that I argue for a vision of the university as a place where the humanities are more than tolerated, where they are celebrated as intrinsic to something other than vocationalism, as a place in which the ideology that inheres to the industrial model in all things can and ought to be dismantled and its various parts put back together into something resembling a university and not a factory floor.
Instead of making the case that the humanities give students the skills to “succeed in a rapidly changing world,” I want to invoke the wisdom of Walt Whitman, one of the great philosophers of seeming inactivity, who wrote: “I lean and loafe at my ease observing a spear of summer grass.”
What does it mean to loafe? Whitman is reclining and relaxing, but he is also active: he “invites” his soul and “observes” the world around him. This conjunction of observation and contemplation with an invitation to the soul is the key here; using our time, energy, and intellectual faculties to attend to our world is the root of successful living. A world of contemplative loafers is one that can potentially make clear-eyed moral and ethical judgments of the sort that we need, judgments that deny the conflation of economic value with other notions of value.
Whitman would rather hang out with the men who brought in the catch than listen to the disputations of science or catch the fish himself: “You should have been with us that day round the chowder-kettle.” While I am not necessarily advocating a life of sloth, I’m not arguing against it, either. I respect the art of study for its own sake and revere the thinker who does nothing worthwhile, if by worthwhile we mean something like growing the economy. Making a living rather than living is the sign of desperation.
William Major is professor of English at Hillyer College of the University of Hartford. He is author of Grounded Vision: New Agrarianism and the Academy (University of Alabama Press, 2011).
Perhaps you’ve heard of Rule 34. It expresses one of the core imperatives of 21st-century culture: “If something exists, there is porn about it. If no porn is found at the moment, it will be made. There are no exceptions.”
Consider, for example, the subculture devoted to eroticizing the My Little Pony cartoon characters. More people are into this than you might imagine. They have conventions. It seems likely that even more specialized niches exist -- catering to tastes that “vanilla” My Little Pony fetishists regard as kinky -- although I refuse to investigate the matter.
Consider it a topic for future issues of Porn Studies, a new journal published by Routledge. “Just as there are specialist journals, conferences, book series, and collections enabling consideration of other areas of media and cultural production,” says the introductory note for the inaugural double issue, “so pornography needs a dedicated space for research and debate.” (Last year, many people disagreed: news of the journal inspired much protest, as Inside Higher Ed reported.)
The most interesting thing about that sentence from the journal's editors is that “pornography” functions in it as an active subject. Porn is figured almost as an institution or a conscious entity — one capable of desiring, even demanding, scholarly recognition. The satirical Rule 34 comes very near to claiming agency for porn. With Porn Studies, there is no such ambiguity about the sheer world-making power of pornography.
It’s not just that the journal acknowledges the porn industry as an extremely profitable and expanding sector of the economy, or as a cultural force with an influence spreading all over the map. That much is a commonplace, and Porn Studies takes it as a given. But something more happens in the pages of Porn Studies: academic discourse about porn turns into one more manifestation of its power.
One recent call for papers refers to “the emerging field of porn studies” — a piece of academic-entrepreneurial boilerplate that proves significant without actually being true.
It’s now a solid 10 years since Duke University Press published a volume of some 500 pages, also called Porn Studies, edited by Linda Williams, whose Hard Core: Power, Pleasure, and the “Frenzy of the Visible” (University of California Press, 1989) is by far the most-cited book in the new journal.
She wrote it amid the drawn-out and exhausting battles of the 1980s, when an uneasy alliance formed between radical feminists, rallying under the slogan “pornography is the theory, rape is the practice,” and the religious right, which wanted to enforce the sexual “Thou shalt nots” by law. On the other side of the barricades were the “sex-positive” feminists and civil libertarians, who were not necessarily pro-porn so much as anti-censorship.
Hard Core went beyond the polemics, or around them. Williams approached the X-rated films of the 1970s and ‘80s with as much critical sophistication and command of the history of film as other scholars more typically brought to the cinematography of Eisenstein or Hitchcock. She didn’t deny the misogyny that appeared on screen but saw other forces at work as well -- including scenarios in which women were sexually exploratory or assertive in ways that the old phallic order couldn’t always predict or satisfy.
To anti-porn activists, whether feminist or fundamentalist, it went without saying that the market for pornography consisted of heterosexual men. Likewise, the heterosexual-male nature of the “gaze” in cinema was virtually an axiom of feminist film theory. Williams challenged both suppositions. Women became an ever more substantial share of the audience, especially after videotape made it possible to watch at home.
The status of Williams’s work as foundational suggests that porn studies began “emerging” at least a quarter century ago. Recent volumes of papers such as C'Lick Me: A Netporn Studies Reader (Institute for Network Cultures, 2007) and Hard to Swallow: Hard Core Pornography on Screen (Wallflower, distributed by Columbia University Press, 2012) take the field as growing but established.
For that matter, porn-studies scholars would have every right to claim ancestors working long before the Motion Picture Association of America invented the X rating. In The Horn Book: Studies in Erotic Folklore and Bibliography (1963), Gershon Legman surveys about two centuries’ worth of secondary literature, in several languages. The contributors launching the new journal do not cite Legman, much less any of the figures he discusses, even once. Nor does a single paper discuss any form of pornography that existed prior to the advent of video and digital forms of distribution.
The body of commentary and analysis predating Hard Core includes psycho- and sociological research, legal debate, and humanistic work in a variety of fields. It is seldom mentioned, except when dismissed as simplistic, under-theorized, or hopelessly in thrall to moralistic or ideological assumptions rendering its questions, let alone its arguments, highly suspect.
“Porn studies,” in other words, is not synonymous with scholarship about pornography, as such. It is its own demarcated zone of discussion, one that is present-minded and digital media-oriented to an extreme. (The full double issue is freely available online.)
An exemplary case is the paper called "Gonzo, trannys, and teens – current trends in U.S. adult content production, distribution, and consumption” by Chauntelle Anne Tibbals, who is identified as an independent scholar from Los Angeles. It is perhaps more sociological in perspective than an article from a porn-industry trade journal, and like other papers in Porn Studies it shies away from generalization even when inching in that direction:
"Some performers, producers, and others call for ‘feminist porn’ – a self-identified genre and social movement with no one articulated definition. At the same time, many producers and performers reject the attribution while creating content that seems decidedly feminist. At the center of every one of these debates are porn performers themselves, each of whom are impacted by individual choice, market concerns, and representation. ... Even if one was to focus only on the images contained in ‘pornographic’ representations, with no consideration of production processes or variations in reception, we would still be left with a vast and diverse body of work that is constantly shifting. Consequently, there is no way to say ‘pornography is this’ or ‘pornography is that’ – as I have done in this essay, all one can really do is attempt to describe and contextualize existing patterns as they currently resonate (in this case, with me).”
You cannot step into the same porno river twice. Even so, Lynn Comella’s “Studying Porn Cultures” calls for researchers to spend more time studying the performers, marketers, and fans in their native element, such as the Adult Entertainment Expo. (Other “data-rich field sites” range “from erotic film festivals to feminist porn sets to adult video stores.”)
Comella, an assistant professor of women’s studies at the University of Nevada at Las Vegas, writes that she has attended the Expo “every year since 2008, as a researcher, a credentialed member of the media, and an invited participant in the Expo’s popular seminar series,” twice serving as moderator for the session devoted to women and the adult entertainment market.
The practice of “porn-studies-in-action” she advocates is “accountable to cultural plurality, specificity, and nuance,” she writes, and “rejects sweeping generalizations and foregone conclusions that rely on preconceived notions about pornography’s inherent ’truths’ and effects."
The problem is that nobody ever intentionally accepts “sweeping generalizations and foregone conclusions that rely on preconceived notions.” If only it were that easy. One era's critical perspective can become the next's tacit presupposition. In Hard Core, Linda Williams challenged the assumption that the pornographic film was one big homogenous set of images and messages created to stroke the egos and stoke the libidos of straight white guys. By contrast, the papers in Porn Studies all take Williams’s interpretive stance as a given: the audience, product, meanings, and effects of porn are intrinsically heterogeneous and in flux. Any general statement beyond “More research is needed” thus becomes highly problematic.
A BBC documentary from a few years ago included a segment on one of the better-known subgenres of recent times. In it, the porn director (who is also the star) has a performer who is young, but of legal age, dress up as if she were a schoolgirl. He then brutalizes her at length with slapping, gagging, abusive penetration, and a running stream of verbal humiliation, after which he and other men urinate on her.
The documentary crew follows an actress to the set, with the camera focusing in very closely when the male performer begins chatting up the actress as the scenario begins. The expression on his face is chilling. Ted Bundy must have gotten that look in his eyes once the victim was handcuffed.
Given that his video product, too, is part of the diverse and polymorphous carnival that is the adult entertainment industry — and not the least profitable part, by any means — I would have liked to see a paper in Porn Studies that asked about damage. So many of the contributors celebrate the feminist and LGBT-positive aspect of the industry that a naive reader would think nothing else sold, and that it exists solely to increase the sum of happiness in the world. This may be doubted; indeed, it must be. At times, the journal seems not just to analyze the world of porn but to be part of it. Not in the way the performers are, by any means, but perhaps as a sort of conceptual catering service.
For some reason I have become aware that it is possible to take photographs of bass guitar players in mid-performance and, by digital means, to replace their instruments with dogs, so that it then appears the musicians (who very often wear facial expressions suggesting rapture or deep concentration) are tickling the dogs. Yes, yes it is.
I am not proud of this knowledge and did not seek it out, and would have forgotten about it almost immediately if not for something else occupying my attention in the past few days: a couple of new books treating the phenomenon with great and methodical seriousness. Not, of course, the dog-tickling bass player phenomenon as such, but rather, the kind of online artifact indicated by the titles of Karine Nahon and Jeff Hemsley’s Going Viral (Polity) and Limor Shifman’s Memes in Digital Culture (MIT Press).
The authors differentiate between the topics of the two volumes. Despite a common tendency to equate them, memes don’t always “go viral.” Things that do (say, video shot during a typhoon, uploaded while the disaster is still under way) are not always memes. The distinction will be clarified shortly -- and there is indeed some value in defining the contrast. It corresponds to different kinds of behavior or, if you prefer, different ways of mediating social and cultural life by means of our all-but-inescapable digital devices.
Still, the line should be drawn only just so sharply. It seems bright and clear when the authors bring their different methods (one more quantitative than qualitative and vice versa) to the job. I don’t mean that the difference between viral and memetic communication is simply one of perspective. It seems to exist in real life. But so does their tendency to blur.
“Virality,” write Nahon and Hemsley in a definition unlikely to be improved upon, “is a social information flow process where many people simultaneously forward a specific information item, over a short period of time, within their social networks, and where the message spreads beyond their own (social) networks to different, often distant networks, resulting in a sharp acceleration in the number of people who are exposed to the message.” (Nahon is an associate professor, and Hemsley a Ph.D. candidate, at the Information School of the University of Washington.)
Here the term “information item” is used very broadly, to cover just about any packet of bytes: texts, photographs, video, sound files, etc. It also includes links taking you to such material. But unlike a computer virus -- an unwanted and often destructive packet -- a message that has “gone viral” doesn’t just forward itself. It propagates through numerous, dispersed, and repeated decisions to pay attention to something and then circulate it.
The process has a shape. Charting on a graph the number of times a message is forwarded over time, we find that the curve for a news item appearing at a site with a great deal of traffic (or a movie trailer advertised on a number of sites) shoots up at high speed, then falls just about as rapidly. The arc is rapid and smooth.
By contrast, the curve for an item going viral is more drawn-out -- and a lot rougher. It may show little or no motion for a while before starting to trend upward (possibly with a plateau or downturn or two) until reaching a point at which the acceleration becomes extremely sharp, heading to a peak, whereupon the number of forwards begins to fall off, more or less rapidly -- with an occasional bounce upward, perhaps, but nothing so dramatic as before.
So the prominently featured news item or blockbuster ad campaign on YouTube shoots straight up, like a model rocket on a windless day, until the fuel (newsworthiness, dollars) runs out, whereupon it stops, then begins to accelerate in the opposite direction. But when something goes viral, more vectors are involved. It circulates within and between clusters of people -- individuals with strong mutual connections. It circulates through the networks, formal or informal, in which those clusters are embedded.
And from there, onward and outward -- whether with a push (when somebody with a million Twitter followers takes notice), or a pull (it begins to rank among top search-engine results on a certain topic), or both. The authors itemize factors in play in decisions about whether or not to share something: salience, emotional response, congruence with the person’s values, etc. And their definition of virality as “a social information flow process” takes into account both the horizontal dimension of exchange (material circulating spontaneously among people familiar with one another) and the roles of filtering and broadcasting exercised by individuals and online venues with a lot of social capital.
None of which makes virality something that can be planned, however. “Content that we create can remain stubbornly obscure even when we apply our best efforts to promote it,” they write. “It can also grow and spread with an apparent life and momentum of its own, destroying some people’s lives and bringing fame and fortune to others, sometimes in a matter of days.”
An Internet meme, as Limor Shifman sums things up, is “(a) a group of digital items sharing common characteristics of content, form, and/or stance; (b) that were created with awareness of each other; and (c) were circulated, imitated, and/or transformed via the Internet by many users.”
As with virality, the concept rests on a biological metaphor. Coined by Richard Dawkins in 1976, “meme” began in a quasi-scientific effort to identify the gene-like elements of behavior, cultural patterns, and belief systems that caused them to persist, expand, and reproduce themselves over very long periods of time. As reincarnated within cyberculture, the meme is a thing of slighter consequence: a matter of endless variation on extremely tenacious inside jokes, occupying and replicating within the brains of bored people in offices.
Shifman's point that memetic communication (which for the most part involves mimicry of existing digital artifacts with parodic intent and/or "remixing" them with new content) is an exemplary case of Web 2.0 culture seems to me sound; it probably also explains why much in the book may seem familiar even to someone not up on LOLcats studies. Yes, memes are a form of active participation in digital communication. Yes, they can carry content that (whether the meme goes viral or not) questions or challenges existing power structures. I have seen my share of Downfall parody videos, and am glad to know that Bruno Ganz is okay with the whole thing. But every so often that line from Thoreau comes to mind -- "as if we could kill time without injuring eternity" -- and it seems like a good idea to go off the grid for a while.
Over the last year there has been a steady stream of articles about the “crisis in the humanities,” fostering a sense that students are stampeding from liberal education toward more vocationally oriented studies. In fact, the decline in humanities enrollments, as some have pointed out, is wildly overstated, and much of that decline occurred in the 1970s and 1980s. Still, the press is filled with tales about parents riding herd on their offspring lest they be attracted to literature or history rather than to courses that teach them to develop new apps for the next, smarter phone.
America has long been ambivalent about learning for its own sake, at times investing heavily in free inquiry and lifelong learning, and at other times worrying that we need more specialized training to be economically competitive. A century ago these worries were intense, and then, as now, pundits talked about a flight from the humanities toward the hard sciences.
Liberal education was a core American value in the first half of the 20th century, but a value under enormous pressure from demographic expansion and the development of more consistent public schooling. The increase in the population considering postsecondary education was dramatic. In 1910 only 9 percent of students received a high school diploma; by 1940 it was 50 percent. For the great majority of those who went on to college, that education would be primarily vocational, whether in agriculture, business, or the mechanical arts. But even vocationally oriented programs usually included a liberal curriculum -- a curriculum that would provide an educational base on which one could continue to learn -- rather than just skills for the next job. Still, there were some then (as now) who worried that the lower classes were getting “too much education.”
Within the academy, between the World Wars, the sciences assumed greater and greater importance. Discoveries in physics, chemistry, and biology did not seem to depend on the moral, political, or cultural education of the researchers – specialization seemed to trump broad humanistic learning. These discoveries had a powerful impact on industry, the military, and health care; they created jobs! Specialized scientific research at universities produced tangible results, and its methodologies – especially rigorous experimentation – could be exported to transform private industry and the public sphere. Science was seen to be racing into the future, and some questioned whether the traditional ideas of liberal learning were merely archaic vestiges of a mode of education that should be left behind.
In reaction to this ascendancy of the sciences, many literature departments reimagined themselves as realms of value and heightened subjectivity, as opposed to so-called value-free, objective work. These “new humanists” of the 1920s portrayed the study of literature as an antidote to the spiritual vacuum left by hyperspecialization. They saw the study of literature as leading to a greater appreciation of cultural significance and a personal search for meaning, and these notions quickly spilled over into other areas of humanistic study. Historians and philosophers emphasized the synthetic dimensions of their endeavors, pointing out how they were able to bring ideas and facts together to help students create meaning. And arts instruction was reimagined as part of the development of a student’s ability to explore great works that expressed the highest values of a civilization. Artists were brought to campuses to inspire students rather than to teach them the nuances of their craft. During this interwar period a liberal education surely included the sciences, but many educators insisted that it not be reduced to them. The critical development of values and meaning was a core function of education.
Thus, despite the pressures of social change and of the compelling results of specialized scientific research, there remained strong support for the notion that liberal education and learning for its own sake were essential for an educated citizenry. And rather than restrict a nonvocational education to established elites, many saw this broad teaching as a vehicle for ensuring commonality in a country of immigrants. Free inquiry would model basic democratic values, and young people would be socialized to American civil society by learning to think for themselves.
By the 1930s, an era in which ideological indoctrination and fanaticism were recognized as antithetical to American civil society, liberal education was acclaimed as key to the development of free citizens. Totalitarian regimes embraced technological development, but they could not tolerate the free discussion that led to a critical appraisal of civic values. Here is the president of Harvard, James Bryant Conant, speaking to undergraduates just two years after Hitler had come to power in Germany:
To my mind, one of the most important aspects of a college education is that it provides a vigorous stimulus to independent thinking.... The desire to know more about the different sides of a question, a craving to understand something of the opinions of other peoples and other times mark the educated man. Education should not put the mind in a straitjacket of conventional formulas but should provide it with the nourishment on which it may unceasingly expand and grow. Think for yourselves! Absorb knowledge wherever possible and listen to the opinions of those more experienced than yourself, but don’t let any one do your thinking for you.
This was the 1930s version of liberal learning, and in it you can hear echoes of Thomas Jefferson’s idea of autonomy and Ralph Waldo Emerson’s thoughts on self-reliance.
In the interwar period the emphasis on science did not, in fact, lead to a rejection of broad humanistic education. Science was a facet of this education. Today, we must not let our embrace of STEM fields undermine our well-founded faith in the capacity of the humanities to help us resist “the straitjackets of conventional formulas.” Our independence, our freedom, has depended on not letting anyone else do our thinking for us. And that has demanded learning for its own sake; it has demanded a liberal education. It still does.
Michael Roth is president of Wesleyan University. His new book, Beyond the University: Why Liberal Education Matters, will be published next year by Yale University Press. His Twitter handle is @mroth78.
In all those years I was pursuing a Ph.D. in religious studies, the question of what my profession really stood for rarely came up in conversation with fellow academics, save for occasional moments when the position of the humanities in higher education came under criticism in public discourse. When such moments passed, it was again simply assumed that anyone entering a doctoral program in the humanities knowingly signed on to a traditional career of specialized research and teaching.
But the closer I got to receiving that doctorate, the less certain I became that this was a meaningful goal. I was surrounded by undergraduates who were rich, well-meaning, and largely apathetic to what I learned and taught. I saw my teachers and peers struggle against the tide of general indifference aimed at our discipline and succumb to unhappiness or cynicism. It was heartbreaking.
Fearing that I no longer knew why I studied religion or the humanities at large, I left sunny California for a teaching job at the Asian University for Women, in Chittagong, Bangladesh. My new students came from 12 different countries, and many of them had been brought up in deeply religious households, representing nearly all traditions practiced throughout Asia. They, however, knew about religion only what they had heard from priests, monks, or imams, and did not understand what it meant to study religion from an academic point of view. And that so many of them came from disadvantaged backgrounds convinced me that this position would give me a sense of purpose.
I arrived in Bangladesh prepared to teach an introductory course on the history of Asian religions. But what was meant to be a straightforward comparison of religious traditions around the region quickly slipped from my control and morphed into a terrible mess. I remember an early lesson: When I suggested during a class on religious pilgrimage that a visit to a Muslim saint’s shrine had the potential to constitute worship, it incited a near-riot.
Several Muslim students immediately protested that I was suggesting heresy, citing a Quranic injunction that only Allah should be revered. What I had intended was to point out how similar tension existed in Buddhism over circumambulation of a stupa — an earthen mound containing the relics of an eminent religious figure — since that act could be seen as both remembrance of the deceased’s worthy deeds and veneration of the person. But instead of provoking a thoughtful discussion, my idea of comparative religious studies seemed only to strike students as blasphemous.
Even more memorable, and comical in hindsight, was being urged by the same Muslims in my class to choose one version of Islam among all its sectarian and national variations and declare it the best. Whereas Palestinians pointed to the "bad Arabic" used in the signage of one local site as evidence of Islam’s degeneration in South Asia, a Pakistani would present Afghans as misguided believers because -- she claimed -- they probably never read the entire Quran. While Bangladeshis counseled me to ignore Pakistanis from the minority Ismaili sect who claim that God is accessible through all religions, Bangladeshis themselves were ridiculed by other students for not knowing whether they were Sunni or Shi’a, the two main branches of Islam. In the midst of all this, my call to accept these various manifestations of Islam as intriguing theological propositions went unheeded.
With my early enthusiasm and amusement depleted, I was ready to declare neutral instruction of religion in Bangladesh impossible. But over the course of the semester I could discern one positive effect of our classroom exercise: students’ increasing skepticism toward received wisdom. In becoming comfortable with challenging my explanations and debating competing religious ideas, students came to perceive any view of religion as more an argument than an indisputable fact. They no longer accepted a truth claim at face value but analyzed its underlying logic in order to evaluate the merit of the argument. They expressed confidence in the notion that a religion could be understood in multiple ways. And all the more remarkable was their implicit decision over time to position themselves as rational thinkers and to define their religions for themselves.
An illustrative encounter took place at the shrine of the city’s most prominent Muslim saint. I, being a man, was the only one among our group to be allowed into the space. My students, the keeper of the door said, could be "impure" — menstruating — and were forbidden to enter. Instead of backing down as the local custom expected, the students ganged up on the sole guard and began a lengthy exposition on the meaning of female impurity in Islam. First they argued that a woman was impure only when she was menstruating and not at other times; they then invoked Allah as the sole witness to their cyclical impurity, a fact the guard could not be privy to and thus should not be able to use against them; and finally they made the case that if other Muslim countries left it up to individual women to decide whether to visit a mosque, it was not up to a Bangladeshi guard to create a different rule concerning entry. Besieged by a half-dozen self-styled female theologians of Islam, the man cowered, and withdrew his ban.
I was incredibly, indescribably proud of them.
Equally poignant was coming face to face with a student who asked me to interpret the will of Allah. Emanating the kind of glow only the truly faithful seem to possess, she sat herself down in my office, fixed the hijab around her round alabaster face, and in a quiet but measured voice confessed her crime: She had taken to praying at a Hindu temple because most local mosques did not have space for women, and she was both puzzled and elated that even in a non-Islamic space she could still sense the same divine presence she had been familiar with all her life as Allah. She asked for my guidance in resolving her crisis of faith. If other Muslims knew about her routine excursions to a Hindu temple, she would be branded an apostate, but did I think that her instinct was right, and that perhaps it was possible for Allah to communicate his existence through a temple belonging to another religion?
In the privacy of my office, I felt honored by her question. I had lectured on that very topic just before this meeting, arguing that sacred space was not the monopoly of any one religion, but could be seen as a construct contingent upon the presence of several key characteristics. This simple idea, which scholars often take for granted, had struck her as a novel but convincing explanation for her visceral experience of the Islamic divine inside a Hindu holy space. Though she had come asking for my approval of her newly found conviction, it was clear that she did not need anyone’s blessing to claim redemption. Humanistic learning had already provided her with a framework under which her religious experience could be made meaningful and righteous, regardless of what others might say.
And thanks to her and other students, I could at last define my own discipline with confidence I had until then lacked: The humanities is not just about disseminating facts or teaching interpretive skills or making a living; it is about taking a very public stance that above the specifics of widely divergent human ideas exist more important, universally applicable ideals of truth and freedom. In acknowledging this I was supremely grateful for the rare privilege I enjoyed as a teacher, having heard friends and colleagues elsewhere bemoan the difficulty of finding a meaningful career as humanists in a world constantly questioning the value of our discipline. I was humbled to be able to see, by moving to Bangladesh, that humanistic learning was not as dispensable as many charge.
But before I could fully savor the discovery that what I did actually mattered, my faith in the humanities was again put to a test when a major scandal befell my institution. I knew that as a member of this community I had to critique what was happening after all my posturing before students about the importance of seeking truth. If I remained silent, it would amount to a betrayal of my students and a discredit to my recent conclusion that humanistic endeavor is meant to make us not only better thinkers, but also more empowered and virtuous human beings.
So it was all the more crushing to be told to say nothing by the people in my very profession, whose purpose I thought I had finally ascertained. In private chats my friends and mentors in academe saw only the urgent need for me to extricate myself for the sake of my career, but had little to say about how to address the situation. Several of my colleagues on the faculty, though wonderful as individuals, demurred from taking a stance for fear of being targeted by the administration for retribution or losing the professional and financial benefits they enjoyed. And the worst blow, more so than the scandal itself, was consulting the one man I respected more than anybody else, a brilliant tenured scholar who chairs his own department at a research university in North America, and receiving this one-liner:
"My advice would be to leave it alone."
It was simultaneously flummoxing and devastating to hear a humanist say that when called to think about the real-life implications of our discipline, we should resort to inaction. And soon it enraged me that the same people who decry the dismantling of traditional academe under market pressure and changing attitudes toward higher education could be so indifferent, thereby silently but surely contributing to the collapse of humanists’ already tenuous legitimacy as public intellectuals.
While my kind did nothing of consequence, it was the students — the same students whom I had once dismissed as incapable of intellectual growth — who tried to speak up at the risk of jeopardizing the only educational opportunity they had. They approached the governing boards, the administration, and the faculty to hold an official dialogue. They considered staging a street protest. And finally, they gave up and succumbed to cynicism about higher education and the world, seeing many of their professors do nothing to live by the principles taught in class, and recognizing the humanities as exquisitely crafted words utterly devoid of substance.
As my feeling about my discipline shifted from profound grief to ecstatic revelation to acute disappointment, I was able to recall a sentiment expressed by one of my professors, who himself might not remember it after all these years. Once upon a time we sat sipping espresso on a verdant lawn not far from the main library, and he mused that he never understood why young people no longer seemed to feel outrage at the sight of injustice. He is a product of a generation that once rampaged across campuses and braved oppression by the Man. On first hearing his indictment, I was embarrassed to have failed the moral standard established by the older generation of scholars like him. But now I see that it is not just young people but much of our discipline, both young and old, that at present suffers from moral inertia. With only a few exceptions, humanists I know do not consider enactment of virtue to be their primary professional objective, whether because of the more important business of knowledge production or the material exigencies of life. And I can only conclude, with no small amount of sadness, that most humanists are not, nor do they care to be, exemplary human beings.
Maybe I should move on, as did a friend and former academic who believes that the only people we can trust to stand on principle are "holy men, artists, poets, and hobos," because yes, it is true that humanists should not be confused with saints. But the humanities will always appear irrelevant as long as its practitioners refrain from demonstrating a tangible link between what they preach and how they behave. In light of the current academic penchant for blaming others for undoing the humanities, it must be said that humanists as a collective should look at themselves first, and feel shame that there is so much they can say — need to say — about the world, but that they say so little at their own expense.
After a year and a half in Bangladesh, I do not doubt any longer that the humanities matters, but now I know that the discipline’s raison d’être dies at the hands of those humanists who do not deserve their name.
Se-Woong Koo earned his Ph.D. in religious studies from Stanford University in 2011. He currently serves as Rice Family Foundation Visiting Fellow and Lecturer at Yale University.
The most memorable thing about the 2002 science-fiction movie Minority Report was its depiction of advertising a few decades hence -- in particular the scene of Tom Cruise hurrying through a mall, besieged by holographic, interactive ads inviting him to have a Guinness or use American Express, and asking how he liked the tank tops he’d purchased at the Gap. The virtual shills address him by name (the character’s name, that is) thanks to retinal scanners, which are as ubiquitous in the 2050s as surveillance cameras had become in the century’s first decade.
They are pop-up ads from hell, swarming like hungry ghosts to devour everyone’s attention. (The people Tom Cruise rushes past are presumably getting their own biometrically personalized shopping advice.) The scene feels uncomfortably plausible; it’s the experience of being on the Internet, extended into public space and rendered inescapable.
How effective the film is as social criticism probably depends on what you make of the fact that a quarter of its budget came from product placement. Minority Report’s critique of advertising turns out to be, in part, critique as advertising.
Now, I have some good news and some bad news. The good news is that people have become so resistant to hard-sell advertising (dodging TV commercials with their DVRs, ignoring or mocking how ad agencies target their desires or insecurities) that such ads have lost much of their influence. By the 2050s, our psychic calluses should be really thick.
The bad news concerns what is taking the place of the hard sell: a range of techniques discussed at some length in Your Ad Here: The Cool Sell of Guerrilla Advertising (New York University Press) by Michael Serazio, an assistant professor of communications at Fairfield University.
“Cool” advertising, as Serazio uses the expression, does not refer only to campaigns that make a product seem hip, hot, and happening -- so that you will be, too, by buying it. The adjective is instead a nod to the contrast between Marshall McLuhan’s famous if altogether dubious categorizations of “hot” media, such as film or print, and the “cool” sort, chiefly meaning television.
A hot medium, goes the theory, transmits its content in high resolution, so that the recipient easily absorbs it through a single sense. A cool medium, with its low resolution, demands greater involvement from the recipient in absorbing the message. Someone reading Aristotle or watching "Citizen Kane" is more or less passively taking in what the hot medium bombards the eye with, while the “Gilligan’s Island” audience finds its senses quickened (auditory and tactile in particular, according to McLuhan) by a need to compensate for the cool medium’s low level of visual stimulation.
That makes as much sense as any of the sage of Toronto’s other ideas, which is to say not a hell of a lot. Nonetheless, Serazio gets as much value out of the distinction as seems humanly possible by adapting it to the contrast between the old-school “hot” ad campaign – with its clear, strong message that you should buy Acme brand whatchamacallits, and here’s why – and a variety of newer, “cooler” approaches that are more seductive, self-effacing, or canny about dealing with widespread cynicism about corporate hype.
A cool ad campaign, when successful, does not simply persuade people to buy something but creates a kind of spontaneous, intimate involvement with the campaign itself. The consumer’s agency is always stressed. ("Agency" in the sense of capacity to act, rather than where "Mad Men" do their business.) The Doritos "Fight for the Flavor" campaign of the mid-‘00s empowered the chip-gobbling public to determine which of two new flavors, Smokin' Cheddar BBQ or Wild White Nacho, would remain on the shelves and which would be pulled. Bloggers and tweeters are encouraged to express their authentic, unscripted enthusiasm. “Buzz agents” are given free samples of a product, chat it up with their friends, then report back how the discussions went. (With word-of-mouth campaigns, the most important thing is authenticity. Fake that and you’ve got it made.)
And at perhaps its most sophisticated level, cool advertising will cultivate the (potential) consumer’s involvement almost as an end in itself – for example, by providing an opportunity to control the behavior of a man in a chicken suit known as Subservient Chicken.
Let us return to the horrible fascination of Subservient Chicken in due course. But first, theory.
Foucault plus Gramsci equals about a third of the stuff published in cultural studies -- of which “critical industry media studies,” the subspecialty into which Serazio’s book falls, is a part. The conceptual work in Your Ad Here is done with Foucault’s line of power tools, in particular his considerations on governance, while Gramsci seems along mostly to keep him company.
Advertising as governance sounds counterintuitive, given the connotation of state power it elicits, but in Foucault’s work “government” refers to processes of guidance and control that may be more or less distant from the state’s institutions. The teacher governs a class (or tries) and a boss governs the workplace.
Over all, “management” seems like a more suitable term for most non-state modes of governance, and it has the advantage of foregrounding what Serazio wants to stress: Foucault’s point is that governance doesn’t mean giving orders and enforcing obedience but rather “structuring the possible field of action of others” in order “to arrange things in such a way that, through a certain number of means, such-and-such ends may be achieved.”
Governance (management) in this sense is a kind of effective persuasion of the governed party (the student, the fry cook, etc.) to exercise his or her agency to perform the necessary functions of the institution (school, fast-food place) without being subjected to constant external pressure. Insofar as governance is an art or a science, it consists in recognizing and anticipating resistance, and in preventing or containing disruption. (Some remarks by Gramsci on hegemony and resistance also apply here, but really just barely.)
“Cool sell” advertising counts as governance, in Serazio’s book, because it tries to neutralize public fatigue from advertisement overload -- so that we’re still incited to spend money and think well of a brand. That’s the common denominator of viral marketing, crowdsourced publicity campaigns, plebiscites on snack-food availability, and so on.
It occasionally sounds like Serazio is criticizing these methods as manipulative, but I suspect that’s actually high praise, like when one horror fan tells another that a torture scene in "Hostel" gave him nightmares.
Which brings us back, as promised, to Subservient Chicken, whose role in promoting the Burger King menu remains oblique at best. But he undeniably garnered an enormous amount of attention -- 20 million distinct viewers generating half a billion hits. “By filming hundreds of video clips of a man in a chicken suit,” the author says, “and writing code for a database of terms that would respond to keyword commands for the Chicken to perform those videotaped actions, [the advertising agency] concocted something that was, in its own words, ‘so creepy, weird and well-executed that many people who visited… thought they were actually controlling this person in a chicken suit in real life.’ ” I can’t help feeling this calls for more extensive Foucauldian analysis, but I won’t be sticking around to see how that turns out.
"Mad Men" returns to cable television this coming Sunday, continuing its saga of mutable identities and creative branding at a New York advertising firm during the 1960s. Or at least one assumes it will still be set in the ‘60s. How much narrative time lapses between seasons varies unpredictably. Like everything else about the show, it remains the network’s closely guarded secret. Critics given an early look at the program must agree to an embargo on anything they publish about it. This makes perfect sense in the context of the social world of "Mad Men" itself: the network is, after all, selling the audience’s curiosity to advertisers.
A different economy of attention operates in Mad Men, Mad World: Sex, Politics, Style & the 1960s, a collection of 18 essays on the program just published by Duke University Press. It’s not a matter of the editors and contributors all being academics, and hence presumably a different sort of cultural consumer from the average viewer -- on the contrary, I think that assumption is exactly wrong. Serialized narrative has to generate in its audience the desire for an answer to a single, crucial question: “And then what happens?” (Think of all the readers gathered at the docks in New York to get the latest installment of a Dickens novel coming from London.)
Of course, the contributors to Mad Men, Mad World write with a host of more complex questions in mind, but I don’t doubt for a second that many of the papers were initially inspired by weekend-long diegetic binge sessions, fueled by the same desire driving other viewers. At the same time, there’s every reason to think that the wider public is just as interested in the complex questions raised by the show as any of the professors writing about it. For they are questions about race, class, gender, sexuality, politics, money, happiness, misery, and lifestyle – and about how much any configuration of these things can change, or fail to change, over time.
Many of the essays serve as replies to a backlash against "Mad Men" that began in the third or fourth season, circa 2009, as it was beginning to draw a much larger audience than it had until that point. The complaint was that the show, despite its fanatical attention to the style, dress, and décor of the period, was simple-mindedly 21st century in its attitude toward the characters. It showed a world in which blunt expressions of racism, misogyny, and homophobia were normal, and sexual harassment in the workplace was an executive perk. Men wore hats and women stayed home. Everyone smoked like a chimney and drank like a fish, often at the same time. Child abuse was casual. So was littering.
And because all of it was presented in tones by turns ironic and horrified, viewers were implicitly invited to congratulate themselves on how enlightened they were now. Another criticism held that "Mad Men" only seemed to criticize the oppressive arrangements it portrayed, while in reality allowing the viewer to enjoy them vicariously. These complaints sound contradictory: the show either moralistically condemns its characters or inspires the audience to wallow in political incorrectness. But they aren’t mutually exclusive by any means. What E.P. Thompson called “the enormous condescension of posterity” tends to be a default setting with Americans, alternating with periods of maudlin nostalgia. There’s no reason the audience couldn’t feel both about the "Mad Men" vision of the past.
See also a comment by the late Christopher Lasch, some 20 years ago: “Nostalgia is superficially loving in its re-creation of the past, but it invokes the past only to bury it alive. It shares with the belief in progress, to which it is only superficially opposed, an eagerness to proclaim the death of the past and to deny history’s hold on the present.”
At the risk of conflating too many arguments under too narrow a heading, I’d say that the contributors to Mad Men, Mad World agree with Lasch’s assessment of progress and nostalgia while also demonstrating how little it applies to the program as a whole.
Caroline Levine’s essay “The Shock of the Banal: Mad Men's Progressive Realism” provides an especially apt description of how the show works to create a distinct relationship between past and present that’s neither simply nostalgic nor a celebration of how far we’ve come. The dynamic of "Mad Men" is, in her terms, “the play of familiarity in strangeness” that comes from seeing “our everyday assumptions just far enough removed from us to feel distant.” (Levine is a professor of English at the University of Wisconsin at Madison.)
The infamous Draper family picnic in season two is a case in point. After a pleasant afternoon with the kids in a bucolic setting, the parents pack up their gear, shake all the garbage off their picnic blanket, and drive off. The scene is funny, in the way appalling behavior can sometimes be, but it’s also disturbing. The actions are so natural and careless – so thoughtless, all across the board – that you recognize them immediately as habit. Today’s viewers might congratulate themselves for at least feeling guilty when they litter. But that’s not the only possible response, because the scene creates an uneasy awareness that once-familiar, “normal” ideas and actions came to be completely unacceptable – within, in fact, a relatively short time. The famous “Keep America Beautiful” ad from about 1970 -- the one with the crying Indian -- probably had a lot to do with that shift, even if it eventually became the butt of jokes. (Such is the power of advertising.)
The show's handling of race and gender can be intriguing and frustrating. All the powerful people in it are straight white guys in ties, sublimely oblivious to even the possibility that their word might not be law. "Mad Space" by Dianne Harris, a professor of architecture and art history at the University of Illinois at Urbana-Champaign, offers a useful cognitive map of the show's world -- highlighting how the advertising firm's offices are organized to demonstrate and reinforce the power of the executives over access to the female employees' labor (and, often enough, bodies), while the staid home that Don Draper and his family occupy in the suburbs is tightly linked to the upper-middle-class WASP identity he is trying to create for himself by concealing and obliterating his rural, "white trash" origins. A handful of African-American characters appear on the margins of various storylines -- and one, the Drapers' housekeeper Carla, occupies the especially complex and fraught position best summed up in the phrase "almost part of the family." But we never see the private lives of any nonwhite character.
In "Representing the Mad Margins of the Early 1960s: Northern Civil Rights and the Blues Idiom," Clarence Lang, an associate professor of African and African-American studies at the University of Kansas, writes that "Mad Men" "indulges in a selective forgetfulness" by "presuming a black Northern quietude that did not exist" (in contrast to the show's occasional references to the civil rights movement below the Mason-Dixon line). Lang's judgment here is valid -- up to a point. As it happens, all of the essays in the collection were written before the start of the fifth season, in which black activists demonstrate outside the firm's building to protest the lack of job opportunities. Sterling Cooper Draper Pryce hires its first African-American employee, a secretary named Dawn. I think a compelling reading of "Mad Men" would recognize that the pace and extent of the appearance of nonwhite characters on screen is a matter not of the creators' refusal to portray them, but of their slow arrival on the scene of an incredibly exclusionary social world being transformed (gradually and never thoroughly) by the times in which "Mad Men" is set.
There is much else in the book that I found interesting and useful in thinking about "Mad Men," and I think it will be stimulating to readers outside the ranks of aca fandom. I’ll return to it in a few weeks, with an eye to connecting some of the essays to new developments at Sterling Cooper Draper Pryce. (Presumably the firm will have changed its name in the new season, given the tragic aftermath of Lane Pryce’s venture in creative bookkeeping.)
When things left off, it was the summer of 1967. I have no better idea than anyone else when or how the narrative will pick up, but I really hope that Don Draper creates the ad campaign for Richard Nixon.