Cultural studies

Review of 'Mad Men, Mad World: Sex, Politics, Style & the 1960s'

"Mad Men" returns to cable television this coming Sunday, continuing its saga of mutable identities and creative branding at a New York advertising firm during the 1960s. Or at least one assumes it will still be set in the ‘60s. How much narrative time lapses between seasons varies unpredictably. Like everything else about the show, it remains the network’s closely guarded secret. Critics given an early look at the program must agree to an embargo on anything they publish about it. This makes perfect sense in the context of the social world of "Mad Men" itself: the network is, after all, selling the audience’s curiosity to advertisers.

A different economy of attention operates in Mad Men, Mad World: Sex, Politics, Style & the 1960s, a collection of 18 essays on the program just published by Duke University Press. It’s not that the editors and contributors, all academics, are therefore a different sort of cultural consumer from the average viewer; I think that assumption is exactly wrong. Serialized narrative has to generate in its audience the desire for an answer to a single, crucial question: “And then what happens?” (Think of all the readers gathered at the docks in New York to get the latest installment of a Dickens novel coming from London.)

Of course, the contributors to Mad Men, Mad World write with a host of more complex questions in mind, but I don’t doubt for a second that many of the papers were initially inspired by weekend-long diegetic binge sessions, fueled by the same desire driving other viewers. At the same time, there’s every reason to think that the wider public is just as interested in the complex questions raised by the show as any of the professors writing about it. For they are questions about race, class, gender, sexuality, politics, money, happiness, misery, and lifestyle – and about how much any configuration of these things can change, or fail to change, over time.

Many of the essays serve as replies to a backlash against "Mad Men" that began in the third or fourth season, circa 2009, as it was beginning to draw a much larger audience than it had until that point. The complaint was that the show, despite its fanatical attention to the style, dress, and décor of the period, was simple-mindedly 21st century in its attitude toward the characters. It showed a world in which blunt expressions of racism, misogyny, and homophobia were normal, and sexual harassment in the workplace was an executive perk. Men wore hats and women stayed home.  Everyone smoked like a chimney and drank like a fish, often at the same time. Child abuse was casual. So was littering.

And because all of it was presented in tones by turns ironic and horrified, viewers were implicitly invited to congratulate themselves on how enlightened they were now. Another criticism held that "Mad Men" only seemed to criticize the oppressive arrangements it portrayed, while in reality allowing the viewer to enjoy them vicariously. These complaints sound contradictory: the show either moralistically condemns its characters or inspires the audience to wallow in political incorrectness. But they aren’t mutually exclusive by any means. What E.P. Thompson called “the enormous condescension of posterity” tends to be a default setting with Americans, alternating with periods of maudlin nostalgia. There’s no reason the audience couldn’t feel both ways about the "Mad Men" vision of the past.

See also a comment the late Christopher Lasch made some 20 years ago: “Nostalgia is superficially loving in its re-creation of the past, but it invokes the past only to bury it alive. It shares with the belief in progress, to which it is only superficially opposed, an eagerness to proclaim the death of the past and to deny history’s hold on the present.”

At the risk of conflating too many arguments under too narrow a heading, I’d say that the contributors to Mad Men, Mad World agree with Lasch’s assessment of progress and nostalgia while also demonstrating how little it applies to the program as a whole.

Caroline Levine’s essay “The Shock of the Banal: Mad Men's Progressive Realism” provides an especially apt description of how the show works to create a distinct relationship between past and present that’s neither simply nostalgic nor a celebration of how far we’ve come. The dynamic of "Mad Men" is, in her terms, “the play of familiarity in strangeness” that comes from seeing “our everyday assumptions just far enough removed from us to feel distant.” (Levine is a professor of English at the University of Wisconsin at Madison.)

The infamous Draper family picnic in season two is a case in point. After a pleasant afternoon with the kids in a bucolic setting, the parents pack up their gear, shake all the garbage off their picnic blanket, and drive off. The scene is funny, in the way appalling behavior can sometimes be, but it’s also disturbing. The actions are so natural and careless – so thoughtless, all across the board – that you recognize them immediately as habit. Today’s viewers might congratulate themselves for at least feeling guilty when they litter. But that’s not the only possible response, because the scene creates an uneasy awareness that once-familiar, “normal” ideas and actions came to be completely unacceptable – within, in fact, a relatively short time. The famous “Keep America Beautiful” ad from about 1970 -- the one with the crying Indian -- probably had a lot to do with that shift, even if it eventually became the butt of jokes. (Such is the power of advertising.)

The show's handling of race and gender can be intriguing and frustrating. All the powerful people in it are straight white guys in ties, sublimely oblivious to even the possibility that their word might not be law. "Mad Space" by Dianne Harris, a professor of architecture and art history at the University of Illinois at Urbana-Champaign, offers a useful cognitive map of the show's world -- highlighting how the advertising firm's offices are organized to demonstrate and reinforce the power of the executives over access to the female employees' labor (and, often enough, bodies), while the staid home that Don Draper and his family occupy in the suburbs is tightly linked to the upper-middle-class WASP identity he is trying to create for himself by concealing and obliterating his rural, "white trash" origins. A handful of African-American characters appear on the margins of various storylines -- and one, the Drapers' housekeeper Carla, occupies the especially complex and fraught position best summed up in the phrase "almost part of the family." But we never see the private lives of any nonwhite character.

In "Representing the Mad Margins of the Early 1960s: Northern Civil Rights and the Blues Idiom," Clarence Lang, an associate professor of African and African-American studies at the University of Kansas, writes that "Mad Men" "indulges in a selective forgetfulness" by "presuming a black Northern quietude that did not exist" (in contrast to the show's occasional references to the civil rights movement below the Mason-Dixon line). Lang's judgment here is valid -- up to a point. As it happens, all of the essays in the collection were written before the start of the fifth season, in which black activists demonstrate outside the firm's building to protest the lack of job opportunities. Sterling Cooper Draper Pryce hires its first African-American employee, a secretary named Dawn. I think a compelling reading of "Mad Men" would recognize that the pace and extent of the appearance of nonwhite characters on screen is a matter not of the creators' refusal to portray them, but of their slow arrival on the scene of an incredibly exclusionary social world being transformed (gradually and never thoroughly) by the times in which "Mad Men" is set.   

There is much else in the book that I found interesting and useful in thinking about "Mad Men," and I think it will be stimulating to readers outside the ranks of aca-fandom. I’ll return to it in a few weeks, with an eye to connecting some of the essays to new developments at Sterling Cooper Draper Pryce. (Presumably the firm will have changed its name in the new season, given the tragic aftermath of Lane Pryce’s venture in creative bookkeeping.)

When things left off, it was the summer of 1967. I have no better idea than anyone else when or how the narrative will pick up, but I really hope that Don Draper creates the ad campaign for Richard Nixon.

 


Putting the black studies debate into perspective

Intellectual Affairs

For a week now, friends have been sending me links from a heated exchange over the status and value of black studies. It started among bloggers, then spilled over into Twitter, which always makes things better. I'm not going to rehash the debate, which, after all, is always the same. As with any other field, black studies (or African-American studies, or, in the most cosmopolitan variant, Africana studies) could only benefit from serious, tough-minded, and ruthlessly intelligent critique. I would be glad to live to see that happen.

But maybe the rancor will create some new readers for a book published five years ago, From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline (Johns Hopkins University Press) by Fabio Rojas, an associate professor of sociology at Indiana University. Someone glancing at the cover in a bookstore might take the subtitle to mean it's another one of those denunciations of academia as a vast liberal-fascist indoctrination camp for recruits to the New World Order Gestapo. I don't know whether that was the sales department's idea; if so, it was worth a shot. Anyway, there the resemblance ends. Rojas wrote an intelligent, informed treatment of black studies, looking at it through the lens of sociological analysis of organizational development, and with luck the anti-black-studies diatribalists will read it by mistake and accidentally learn something about the field they are so keen to destroy. (Spell-check insists that “diatribalists” is not a word, but it ought to be.)

Black studies was undeniably a product of radical activism in the late 1960s and early ‘70s. Administrators established courses only as a concession to student protesters who had a strongly politicized notion of the field’s purpose. “From 1969 to 1974,” Rojas writes, “approximately 120 degree programs were created,” along with “dozens of other black studies units, such as research centers and nondegree programs,” plus professional organizations and journals devoted to the field.

But to regard black studies as a matter of academe becoming politicized (as though the earlier state of comprehensive neglect wasn’t politicized) misses the other side of the process: “The growth of black studies,” Rojas suggests, “can be fruitfully viewed as a bureaucratic response to a social movement.” By the late 1970s, the African-American sociologist St. Clair Drake (co-author of Black Metropolis, a classic study of Chicago to which Richard Wright contributed an introduction) was writing that black studies had become institutionalized “in the sense that it had moved from the conflict phase into adjustment to the existing educational system, with some of its values accepted by that system…. A trade-off was involved. Black studies became depoliticized and deradicalized.”

That, too, is something of an overstatement -- but it is far closer to the truth than denunciations of black-studies programs, which treat them as politically volatile, yet also as well-entrenched bastions of power and privilege. As of 2007, only about 9 percent of four-year colleges and universities had a black studies unit, few of them with a graduate program. Rojas estimates that “the average black studies program employs only seven professors, many of whom are courtesy or joint appointments with limited involvement in the program” -- while in some cases a program is run by “a single professor who organizes cross-listed courses taught by professors with appointments in other departments.”

The field “has extremely porous boundaries,” with scholars who have been trained in fields “from history to religious studies to food science.” Rojas found from a survey that 88 percent of black studies instructors had doctoral degrees. Those who didn’t “are often writers, artists, and musicians who have secured a position teaching their art within a department of black studies.”

As for faculty working primarily or exclusively in black studies, Rojas writes that “the entire population of tenured and tenure-track black studies professors -- 855 individuals -- is smaller than the full-time faculty of my own institution.” In short, black studies is both a small part of higher education in the United States and a field connected by countless threads to other forms of scholarship. The impetus for its creation came from African-American social and political movements. But its continued existence and development have meant adaptation to, and hybridization with, modes of inquiry from long-established disciplines.

Such interdisciplinary research and teaching is necessary and justified because (what I am about to say will be very bold and very controversial, and you may wish to sit down before reading further) it is impossible to understand American life, or modernity itself, without a deep engagement with African-American history, music, literature, institutions, folklore, political movements, etc.

In a nice bit of paradox, that is why C.L.R. James was so dubious about black studies when it began in the 1960s. As author of The Black Jacobins and A History of Negro Revolt, among other classic works, he was one of the figures whom students demanded be appointed visiting professor when they called for black studies courses. But when he accepted, it was only with ambivalence. "I do not believe that there is any such thing as black studies," he told an audience in 1969. "...I only know, the struggle of people against tyranny and oppression in a certain social setting, and, particularly, the last two hundred years. It's impossible for me to separate black studies and white studies in any theoretical point of view."

Clearly James's perspective has nothing in common with the usual denunciations of the field. The notion that black studies is just some kind of reverse-racist victimology, rigged up to provide employment for "kill whitey" demagogues, is the product of malice. But it also expresses a certain banality of mind -- not an inability to learn, but a refusal to do so. For some people, pride in knowing nothing about a subject will always suffice as proof that it must be worthless.

Review of Orin Starn, "The Passion of Tiger Woods"

Intellectual Affairs

On the Friday following Thanksgiving in 2009, Tiger Woods had an automobile accident. For someone who does not follow golf, the headlines that ran that weekend provided exactly as much information as it seemed necessary to have. Over the following week, I noticed a few more headlines, but they made no impression. Some part of the brain is charged with the task of filtering the torrent of signals that bombard it from the media every day. And it did its job with reasonable efficiency, at least for a while.

Some sort of frenzy was underway. It became impossible to tune this out entirely. I began to ignore it in a more deliberate way. (All due respect to the man for his talent and accomplishments, but the doings of Tiger Woods were exactly as interesting to me as mine would be to him.) There should be a word for the effort to avoid giving any attention to some kerfuffle underway in the media environment. “Fortified indifference,” perhaps. It’s like gritting your teeth, except with neurons.

But the important thing about my struggle in 2009 is that it failed. Within six weeks of the accident, I had a rough sense of the whole drama in spite of having never read a single article on the scandal, nor watched nor listened to any news broadcasts about it. The jokes, allusions, and analogies spinning off from the event made certain details inescapable. A kind of cultural saturation had occurred. Resistance was futile. The whole experience was irritating, even a little depressing, for it revealed the limits of personal autonomy in the face of an unrelenting media system, capable of imposing utterly meaningless crap on everybody’s attention, one way or another.

But perhaps that’s looking at things the wrong way. Consider the perspective offered by Orin Starn in The Passion of Tiger Woods: An Anthropologist Reports on Golf, Race, and Celebrity Scandal (Duke University Press). Starn, the chair of cultural anthropology at Duke, maintains that the events of two years back were not meaningless at all. If anything, they were supercharged with cultural significance.

The book's title alludes to the theatrical reenactments of Christ’s suffering performed at Easter during the Middle Ages, or at least to Mel Gibson’s big-screen rendition thereof. Starn interprets “Tigergate” as an early 21st-century version of the scapegoating rituals analyzed by René Girard. From what I recall of Girardian theory, the reconsolidation of social order involves the scapegoat being slaughtered, rather than paying alimony, though in some cases that may be too fine a distinction.

The scandal was certainly louder and more frenetic than the game that Woods seems to have been destined to master. The first image of him in the book shows him at the age of two, appearing on "The Mike Douglas Show" with his father. He is dressed in pint-sized golfing garb, with a little bag of clubs over his shoulder. As with a very young Michael Jackson, the performance of cuteness now reads as a bit creepy. Starn does not make the comparison, but it’s implicit, given the outcome. “This toddler was not to be one of those child prodigies who flames out under unbearable expectations,” Starn writes. “By his early thirties, he was a one-man multinational company…. Forbes magazine heralded Woods as the first athlete to earn $1 billion.”

Starn, who mentions that he is a golfer, is also a scholar of the game, which he says “has always traced the fault lines of conflict, hierarchy, and tension in America, among them the archetypal divides of race and class.” To judge by my friend Dave Zirin’s book A People’s History of Sports in the United States (The New Press), that’s true of almost any athletic pursuit, even bowling. But the salient point about Woods is that most of his career has been conducted as if no such fault lines existed. Starn presents some interesting and little-known information on how golf was integrated. But apart from his genius on the green, Woods’s “brand” has been defined by its promise of harmony: “He and his blonde-haired, blue-eyed wife, Elin Nordegren, seemed the poster couple for a shiny new postracial America with their two young children, two dogs, and the fabulous riches of Tiger’s golfing empire.”

Each of his parents had a multiracial background -- black, white, and Native American on his father’s side; Chinese, Thai, and Dutch on his mother’s. “Cablinasian,” the label Woods made up to name his blended identity, is tongue-in-cheek, but it also represents a very American tendency to mess with the established categories of racial identity by creating an ironic mask. (Ralph Ellison wrote about it in his essay “Change the Joke and Slip the Yoke.”)

But that mask flew off, so to speak, when his car hit the fire hydrant in late 2009. Starn fills out his chronicle of the scandal that followed with an examination of the conversation and vituperation that took place online, often in the comments sections of news articles -- with numerous representative samples, in all their epithet-spewing, semiliterate glory. The one-drop rule remains in full effect, it seems, even for Cablinasians.

“For all the ostensible variety of opinion,” Starn writes about the cyberchatter, “there was something limited and predictable about the complaints, stereotypes, and arguments and counterarguments, as if we were watching a movie we’d already seen many times before. Whether [coming from] the black woman aggrieved with Tiger about being with white women or the white man bitter about supposed black privilege, we already knew the lines, or at least most of them.… We are all players, like it or not, in a modern American kabuki theater of race, where our masks too often seem to be frozen into a limited set of expressions.”

Same as it ever was, then. But this is where the comparison to a scapegoating ritual falls apart. (Not that it’s developed very much in any case.) At least in Girard’s analysis, the ritual is an effort to channel and purge the conflicts within a society – reducing its tensions, restoring its sense of cohesion and unity, displacing the potential for violence by administering a homeopathic dose of it. Nothing like that can be said to have happened with Tigergate. It involved no catharsis. For that matter, it ran -- by Starn’s own account -- in exactly the opposite direction: the golfer himself symbolized harmony and success and the vision of historical violence transcended with all the sublime perfection of a hole-in-one. The furor of late 2009 negated all of that. The roar was so loud that it couldn’t be ignored, even if you plugged your ears and looked away.

The latest headlines indicate that Tiger Woods is going to play the Pebble Beach Pro-Am tournament next month, for the first time in a decade. Meanwhile, his ex-wife has purchased a mansion for $12 million and is going to tear it down. She is doing so because of termites, or so go the reports. Hard to tell what symbolic significance that may have. But under the circumstances, wiping out termites might not be her primary motivation for destroying something incredibly expensive.


Approach and Avoid

In 1939, the French anthropologist Michel Leiris published a memoir called Manhood in which he undertook an inventory of his own failures, incapacities, physical defects, bad habits, and psychosexual quirks. It is a triumph of abject self-consciousness. And the subtitle, “A Journey from Childhood into the Fierce Order of Virility,” seems to heighten the cruelty of the author’s self-mockery. Leiris portrays himself as a wretched specimen: machismo’s negation.

But in fact the title was not ironic, or at least not merely ironic. It was a claim to victory. “Whoever despises himself, still respects himself as one who despises,” as Nietzsche put it. In an essay Leiris wrote when the book was reissued after World War II, he described it as an effort to turn writing into a sort of bullfight: “To expose certain obsessions of an emotional or sexual nature, to admit publicly to certain deficiencies or dismays was, for the author, the means – crude, no doubt, but which he entrusts to others, hoping to see it improved – of introducing even the shadow of the bull’s horn into a literary work.”

By that standard, Leiris made the most broodingly taciturn character in Hemingway look like a total wuss.

The comment about passing along a technique to others -- “hoping to see it improved” -- now seems cringe-making in its own way. Leiris was addressing a small audience consisting mainly of other writers. The prospect of reality TV, online confessionals, or the industrialized production of memoirs would never have crossed his mind. He hoped his literary method -- a kind of systematic violation of the author's own privacy -- would develop as others experimented with it. Instead, the delivery systems have improved. They form part of the landscape Wayne Koestenbaum surveys in Humiliation, the latest volume in Picador’s Big Ideas/Small Books series.

Koestenbaum, a poet and essayist, is a professor of English at the City University of New York Graduate Center and a visiting professor in the painting department of the Yale School of Art. The book is an assemblage of aphoristic fragments, notes on American popular culture and its cult of celebrity, and reflections on the psychological and social dynamics of humiliation – with a few glances at how writing, or even language itself, can expose the self to disgrace. It’s unsystematic, but in a good way. Just because the author never quotes Erving Goffman or William Ian Miller is no reason to think they aren’t on his mind. “I’m writing this book,” he says early on, “in order to figure out – for my own life’s sake – why humiliation is, for me, an engine, a catalyst, a cautionary tale, a numinous scene, producing sparks and showers…. Any topic, however distressing, can become an intellectual romance. Gradually approach it. Back away. Tentatively return.”

The experience of humiliation is inevitable, short of a life spent in solitary confinement, and I suppose everyone ends up dishing it out as well as taking it, sooner or later. But that does not make the topic universally interesting. The idea of reading (let alone writing) almost two hundred pages on the subject will strike many people as strange or revolting. William James distinguished between “healthy mindedness” (the temperament inclined to “settl[ing] scores with the more evil aspects of the universe by systematically declining to lay them to heart or make much of them…. or even, on occasion, by denying outright that they exist”) and “sick souls” (which “cannot so swiftly throw off the burden of the consciousness of evil, but are congenitally fated to suffer from its presence”). Koestenbaum’s readers are going to come from just one side of that divide.

But then, one of James’s points is that the sick soul tends to see things more clearly than the robust cluelessness of the healthy-minded ever permits. As a gay writer -- and one who, moreover, was taken to be a girl when he was young, and told that he looked like Woody Allen as an adult -- Koestenbaum has a kind of sonar for detecting plumes of humiliation beneath the surface of ordinary life.

He coins an expression to name “the somberness, or deadness, that appears on the human face when it has ceased to entertain the possibility that another person exists.” He calls it the Jim Crow gaze – the look in the eyes of a lynching party in group photos from the early 20th century, for example. But racial hatred is secondary to “the willingness to desubjectify the other person” – or, as Koestenbaum puts it more sharply, “to treat someone else as garbage.” What makes this gaze especially horrific is that the person wearing it can also be smiling. (The soldier giving her thumbs-up gesture while standing next to naked, hooded prisoners at Abu Ghraib.) The smile “attests to deadness ... you are humiliated by the refusal, evident in the aggressor’s eyes, to see you as sympathetic, to see you as a worthy, equal subject.”

Deliberate and violent degradation is the extreme case. But the dead-eyed look, the smirk of contempt, are common enough to make humiliation a kind of background radiation of everyday social existence, one intensified through digital communication “by virtue of its impersonality…its stealth attack.” An embarrassing moment in private becomes a humiliating experience forever if it goes viral on YouTube.

“The Internet is the highway of humiliation,” Koestenbaum writes. “Its purpose is to humiliate time, to turn information (and the pursuit of information) into humiliation.” This seems overstated, but true. The thought of Google owning everyone’s search histories is deeply unsettling. The sense of privacy may die off completely one day, but for now the mass media, and reality TV most of all, work to document its final twitches of agony. “Many forms of entertainment harbor this ungenerous wish: to humiliate the audience and to humiliate the performer, all of us lowered into the same (supposedly pleasurable) mosh pit.”

A study of humiliation containing no element of confession would be a nerveless book indeed. Koestenbaum is, like Leiris, a brave writer. The autobiographical portions of the book are unflinching, though flinch-inducing. There are certain pages here that, once read, cannot be unread, including one that involves amputee porn. No disrespect to amputees intended, and the human capacity to eroticize is probably boundless; but Koestenbaum describes a practice whose possibility had never occurred to me. In hindsight, I was completely O.K. with not knowing, but it’s too late to get the image out of my head now.

Humiliation counts on “shame’s power to undo boundaries between individuals,” which is also something creativity does. That phrase comes from Koestenbaum’s tribute to the late Eve Kosofsky Sedgwick towards the end of the book. He evokes the memory of her friendship at least as much as the importance of her foundational work in queer theory – though on reflection, I’m not so sure it makes sense to counterpose them. Sedgwick’s ideas permeate the book; she, like Koestenbaum, was also a poet; and Humiliation may owe something to A Dialogue on Love, the most intimate of her writings.

But it’s more reckless and disturbing, because the author plays off of his audience's own recollections of humiliation, and even with the reader's capacity for disgust. There’s a kind of crazy grace to Koestenbaum’s writing. He moves like a matador working the bull into ever greater rage -- then stepping out of the path of danger in the shortest possible distance, at the last possible moment, with a flourish.

Scott McLemee

It's a Jersey Thing

No one would think of the call for papers as a literary genre. But the CFP can be distinguished from the usual run of academic memoranda by its appeal to the reader’s curiosity, ambition, and capacity to daydream -- and occasionally by its test of one’s power to suspend disbelief.

A few days ago, I came across the Facebook page for the University of Chicago Conference on Jersey Shore Studies. It appealed for abstracts of 500 to 600 words for “the first conference to interrogate the landmark MTV reality television show ‘Jersey Shore,’ ” to be held in October.

The program, which debuted in late 2009, follows one of the standard templates of reality TV, “young people living in a group house.” Video cameras document the usual inebriation, hot-tub sex, personal conflicts, and arias of bleepable language. What sets the show apart, I understand, is its exploration of “the guido lifestyle,” in which hair gel and year-round full-body tanning play an important part. Female guidos call themselves “guidettes.” The National Italian-American Foundation is not amused, not one little bit. Be that as it may, “Jersey Shore” is MTV’s highest-rated show. Its fourth season begins in August.

“The fact that this conference is occurring may very well be a sign of the downfall of Western civilization,” said one Facebook commentator. Another just wrote, “oh dear god why.” Then again, 706 users have indicated that they plan to attend. A Facebook commitment is not one of society’s stronger bonds; still, this suggests rather more visibility than most academic conferences receive. And at least three people have chimed in to say that they were already engaged in "Jersey Shore" scholarship and are glad to know about the conference. Clearly the field is making great strides.

The idea of a conference on "Jersey Shore" being held at the very institution where Allan Bloom wrote The Closing of the American Mind seems just a little too good to be true. (See also Jürgen Habermas’s Twitter account.) To find out how serious the whole thing might be, I got in touch with David Showalter, whose email address appeared on the CFP.

We spoke by phone. The short answer is, perfectly serious. Showalter has just finished his junior year as an undergraduate in the tutorial studies program, which is described by the University of Chicago as “an alternative for students who propose a coherent course of studies that clearly will not fit within a regular major.” When he came up with the idea for the conference about a year ago, he says, friends thought he was joking or being eccentric. But he has secured roughly $3,000 in funding, and has received about 10 abstracts so far.

Before anyone gets too excited, let me make clear that Showalter’s pursuit of “a coherent course of studies that clearly will not fit within a regular major” does not mean that the University of Chicago is giving him credit for watching MTV.

“I don't study popular culture in my normal academic program,” Showalter told me. “My course is on issues of crime and punishment, particularly criminal law surrounding vice activities and sex offenses. I've come to an awareness of the literature on reality television almost wholly through my fascination with 'Jersey Shore' and the books I've found in the University of Chicago library system and through interlibrary loan. So I can't claim any sort of authoritative knowledge about the state of the discipline of television studies, or any expertise on the existing literature.”

Please note the earnestness. Before saying anything more about the conference, or about "Jersey Shore" itself for that matter, it bears stressing that at no point in our exchanges by phone or e-mail did Showalter seem to manifest any of the so-called “pop culture irony” that has become such a prevalent mode of self-protecting self-constitution in an era of almost unbearably dense mass-media saturation. It comes in many finely graded variants. And after 20 years of it, all of them make me tired. Showalter enjoys the show and wants to think about it -- he doesn’t merely “enjoy” the show and want to “think” about it.

Demurrals notwithstanding, Showalter quickly shows an extensive familiarity with the media-studies and social-science literature on reality television. "Many criticize 'Teen Mom' (another MTV show) for glamorizing teenage motherhood," he notes in an e-mail message, "and thereby encouraging teenagers to become pregnant. But a report by the Public Religion Research Institute claims that people who watch shows like 'Teen Mom' are actually more supportive of abortion rights and believe abortion to be morally acceptable at higher rates than non-viewers. The relationship between reality television and its viewers is much more complicated than simple approbation of the content of the shows, and so viewer response data can be quite useful in adding nuance to that picture."

Now, to be honest, I had never even heard of "Teen Mom," let alone considered its social impact. But somebody needs to do it. The possibility that "Jersey Shore" merits careful thought seems rather counterintuitive, but Showalter is clearly someone to make the case. His conference will be serious, not a festival of agnostic hipness.

But what is there to be serious about? It turns out that a few sprouts of "Jersey Shore" studies had already appeared before Showalter first circulated his CFP. The earliest entry in some future bibliography of the field will probably be “Sailing Away from The Jersey Shore: Ethnic Nullification and Sights of the Italian American Female Body from Connie Francis to Lady Gaga,” a paper by Roseanne Giannini Quinn, a lecturer in English at Santa Clara University, delivered at the National Women’s Studies Association conference in Denver in November.

Seriousness in this case meant disapproval. The paper has not been published, nor was I able to obtain a copy from Quinn, but her abstract in the conference program says it “takes as its starting point the degrading representation of Italian American women in the current popular television reality show 'The Jersey Shore,' ” using this as a point of departure to consider various “feminist and gay cultural icons” who both challenged “destructive stereotypes as well as often participated in the mass media reinforcement of them.”

And in May, the University of Oklahoma offered an online intersession course called “Jersey Shore-GRC: Depictions of gender, race and class on the shore,” which will be repeated in August. The instructor is Sarah E. Barry, a graduate teaching assistant for first-year English composition. The catalog description, while useful as a survey of likely topics in “Jersey Shore” studies, is altogether horrifying as a piece of prose.

Here it is in full, and minus any [sic]s: “We will look at European, specifically the Italian diaspora and how American’s response to the nations globalization and subsequent cultural contact constructed the image of the Italian-American, beginning in the 19th century and how that compares to images and personalities of the Jersey Shore cast. Additionally we will explore how aspects of critical theory, specifically gender studies, understanding of the self and the ‘Other’, class conflict and racial issues come together to reflect how popular culture views and interprets socio-economic and socio-historic conditions and how the youth is responding to these conditions. Finally, we will look at the impact this phenomenon is having on society and youth identity formation.”

Oh well, cohesive syntax isn’t everything. While trying repeatedly and unsuccessfully to contact Barry to find out how the course had gone, I did manage to get in touch with one of the featured speakers now confirmed for the University of Chicago conference. Alison Hearn, an associate professor of information and media studies at the University of Western Ontario, is at work on a book called Real Incorporated: Explorations in Reality Television and Contemporary Visual Culture.

“I have not written about ‘Jersey Shore,’ per se,” she told me by e-mail, “but will for this conference.” She described her area of interest as “the relationship of reality television to broader political, cultural and economic concerns -- specifically the changing world of work and its impact on processes [of] self-making, or, more aptly in a world marked by promotional concerns, self-branding.”

Certainly the denizens of “Jersey Shore” have developed some expertise in the commodification of lifestyle and personal identity. They endorse various products (alcohol, clothing, tanning methods) and have book deals. In papers from the International Journal of Media and Cultural Politics and the Journal of Consumer Culture, Hearn writes about “the spectacularization of self” that is both fostered and manifested by reality TV, among other media forms.

The audience participates in the “spectacularization” just as much as the “stars.” (You, too, can be a guido.) In one of her papers, Hearn describes meeting with a group of teenagers in Boston who show themselves eager to explain just how suitable their personalities make them as potential cast members for a reality TV program. Reflecting on this encounter, she cites a passage from one of Jean Baudrillard’s later essays: “We are no longer alienated and passive spectators, but interactive extras; we are the meek, lyophilized members of this huge ‘reality show.’ ”

Here, a gloss on Baudrillard's more obscure word-choice proves illuminating: “Lyophilized, meaning ‘freeze-dried,’ seems an apt description of the responses I receive that day in Boston,” writes Hearn; “they are pre-set, freeze-dried presentations of self, molded by prior knowledge of the dictates of the reality television genre and deployed strategically to garner attention, and potentially, profit.”

Abstracts for the Chicago conference are welcome through August 1. Showalter tells me he is receiving no academic credit for the undertaking, which has a shoestring budget. He received $2,580 from The Uncommon Fund, a student-run initiative at U of C to “support creative ideas that may otherwise not be implemented at all.” Various academic departments have made verbal commitments to lend modest support this fall, though the paperwork remains to be done.

Has anyone from “Jersey Shore” – whether in the cast or on the production crew – expressed any interest in the conference so far?

“I wish!” he answers. “It would be fascinating to get their perspectives on the conception and development of the show. I’d also like to hear their answers to some of the criticisms from Italian-American groups and from officials in New Jersey who complain that the cast members aren’t even from the area.” It turns out most of them are actually New Yorkers.

The show often generates an intense, even visceral, response. (I have never gotten through more than a few minutes of it, but did watch as the residents of South Park formed an alliance with Al Qaeda to drive out the Jerseyites who were invading their town.) Then again, any cultural phenomenon capable of generating both strong negative affect and a tremendous revenue stream may prove “good to think with,” to borrow Claude Levi-Strauss’s phrase.

“If anything,” Showalter told me, “the vehemence aimed at ‘Jersey Shore’ has only made me more interested in watching the show closely. I've also enjoyed observing the cast members strike out beyond the series into other markets and products. I think Snooki's novel, A Shore Thing, is a great example of this; what appears at first to be a purely empty money-maker actually contains a rather complex and frenetic plot line, not to mention all kinds of subliminal autocriticism from Snooki herself. The universe of endorsements and branded products that has grown up around ‘Jersey Shore’ has made it a much more rich and engaging phenomenon.”

During an interview with the Maroon, U of C’s student paper, Showalter noted “the danger of people just taking Pop-Culture Phenomenon X and Obscure Author Y and trying to combine them together.” So far, abstracts for papers have been submitted by scholars working in English, media studies, sociology, and gender studies.

“There’s been nothing on issues of ethnicity and race,” he told me, “which is really surprising.” It certainly is. If anything, it seems like the topic for a whole panel. In an oft-cited remark, “Jersey Shore” cast member and littérateur Snooki has stated, “I’m not white…[I’m] tan.” Discuss.

Scott McLemee

End Large Conferences

I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that mega humanities conferences are way past their sell-by date. For senior faculty the only reason to go is to schmooze with old friends; for junior faculty they are an onerous duty, and for graduate students they are a rip-off for which professional organizations ought to be collectively ashamed.

First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.

Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.

Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet, but more satisfaction. But I didn’t listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling like Ponce de Leon, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.

Problem One: Outmoded Presentations

We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper sessions. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.

I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule is that I was running on fumes by the end of day one. It would have helped if presenters took advantage of new technology, but things seldom got flashier than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?

The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.

I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.

Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.

Problem Two: Expense

Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.

Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math -- four nights’ lodging alone comes to $600, and fees, airfare, taxis, and Internet access push the total past $1,400 before you’ve eaten a bite -- and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)

An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference locates to a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub, after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.

Problem Three: Victimized Grad Students

I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Junior colleagues, unless they have to be there, hardly attend at all: they’re busy getting material into publication, and they can meet presentation expectations at cheaper regional meetings, or save their dough and go to prestigious(-sounding) international gatherings.

So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now networking opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.

Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. It was barbaric back in the days when there were jobs; now it is simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?

The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.

As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much greater than buying lottery tickets. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the savings to the association to lobby for more tenure-track faculty.

Problem Four: No-Shows

You spend lots of money, you sit through desultory talks, and head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It’s been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?

As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.

Problem Five: Urban Sprawl

What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.

In Praise of Small Conferences

There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.

Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give these the cachet they deserve. The big conference is like a one-size-fits-all T-shirt; it simply doesn’t fit most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)


Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.

Why I Am Not Radical Enough

As a teacher of rhetorical studies, I've been trained to think about the differences between audiences and how to adapt one's messages to address those differences. Of course, having earned one's credentials in "the art of persuasion" and (presumably) possessing the intellectual tools of audience adaptation doesn't necessarily mean one can do it well, and last fall I really stepped in it. What have I learned? Sometimes it is permissible to retreat from a more straightforward -- if not radical -- introduction to queer theory to a classic, liberal politics of toleration or humanism when teaching undergraduates, because we no longer live in an environment that protects academic freedom. Although Kurt Cobain did once sing, "what else should I say/everyone is gay," sometimes students are not ready to interrogate what that means, and if you try to teach them, they'll get their parents to call deans and chairs in an attempt to have you fired.

Here's the set-up: For three years I worked as an assistant professor at Louisiana State University in Baton Rouge. Having moved there from the University of Minnesota (where I did my graduate work), I took some time to adapt to Louisiana students, and the culture shock I experienced was intense. Gradually I acclimated to the sight of public, drunken nudity and that charming, Southern hostility toward my so-called Midwestern political correctness. My experiences in Louisiana taught me that although the students claimed a conservative, religious politics, they were quite familiar with, and accepting of, "alternative lifestyles," and I often had to resort to pretty wild examples in the classroom to keep their attention and to get them to engage queer theory beyond the level of "whatever!" and "so what?"

Friends and colleagues were often surprised when I told them that my students took to "controversial" theoretical perspectives, such as the critical work of Judith Butler on gender and sexual identity, quite well. One semester -- as I was teaching the Kinsey scale to supplement Laura Mulvey's theory of cinematic pleasure -- I just asked my students: "Y'all don't seem too bothered by this material; why is that?" One of my repeat students said in a sardonic tone, "Dude: Mardi Gras?" My Louisiana students had "seen it all," and probably from a very young age many of them learned how to hang up their hetero-hang-ups, at least for a week or two before Ash Wednesday and Lent so that they could properly enjoy all the parades and street parties.

Obviously, I had a lot of adapting to do when I took my second job, at the University of Texas at Austin. I still do not have a good "feel" for the students at my new university, but I think in general it is fair to describe the students here as more right-leaning politically and more conservative in their thinking about lifestyle. Regardless, to my delight and horror, as I began teaching the queer theory unit of my Rhetoric and Popular Music course I heard the same wild examples exiting my mouth in seemingly automatic fits of charismatic teach-o-mania. I still assigned readings like Cynthia Fuchs' fabulous essay on queercore, "If I Had a Dick: Queers, Punks, and Alternative Acts." But I quickly learned that when one combines reading material that attempts to unravel binaries and my own ambiguously (and strategically) queer teaching persona in a "Bush Country" classroom, one should expect a little hostility. I expected it, really I did. I simply did not expect to catch hell from a parent.

The day after I lectured on heterosexist norms in heavy metal music videos, I was summoned to the principal's office to get a talking-to. Apparently a student's mother was among the sea of faces in my large lecture class that day, and was expressly appalled at my queer "agenda." In an e-mail that my chair shared with me, the mom said that it was obvious I was attracted to both men and women and therefore "no one is safe." For the class I had developed a field-trip ethnography project at a well-known Austin 18-and-up punk club. This parent said that I forced my students to go to a "gay bar." Ultimately, I was characterized as unprofessional, as teaching filth, and as trying to recruit students for the "gay cause."

Needless to say, my meeting with the chair was painful and left me fearful, although I couldn't blame him. He gave the best "you have academic freedom, but" talk I've ever heard. Even so, I was told the story about "that professor" who was fired from such and such a department for "creating a hostile classroom environment." I was told to de-personalize my teaching and reminded that I did not yet have tenure and that teaching evaluations were very important to the tenure review process.

Since that meeting I have changed my teaching a bit and am more mindful of the power students and parents have to take out an assistant professor whom they do not like, especially under the aegis of sexual harassment. We juniors should also remember that many of our deans are (necessarily) insulated from the classroom and by force of situation are often more sympathetic to students and parents in our age of the "culture wars" and "zero tolerance."

Immediately after the incident, I was worried about protecting my teaching assistants. One of them was slated to deliver a lecture on the interchangeability of sex organs in the music and art of Peaches, a controversial and polyamorous figure who had an underground dance hit with "Shake Yer Dix (Shake Yer Tits)." Although I knew I was a bit oversensitive after the talking-to with my chair, I decided to send a preemptive e-mail message to the 130-student class in an effort to spare us more grief. Here is the text of that message, edited to protect the innocent and please the legal eagles:

Greetings Class,

Your resident instructor here with some background commentary on your readings for Tuesday, as they directly challenge cultural assumptions of “normalcy.” We will be discussing the field of “queer theory,” which grew out of the heated discussions of feminism in the 1980s and 1990s regarding sexual desire and the relationship between social identity and biology. We’ll spend some time discussing the term “queer” itself -- which is confusing -- but for the moment let us simplify a lot of the concerns of queer theory to a series of questions: to what extent do biology and genetics form a materialist basis for gender and sexual identity? In other words, are we born gay, straight, or somewhere between those two poles? Where do the chemicals and biological predispositions end and where does culture begin? Why is sexual identity such an obsession in the United States (e.g., what’s the big deal about the proposed Texas amendment to ban gay marriage)? Finally, why are we so interested as a culture in these questions?

The latter question may resonate somewhat. To put it like my own granny does, “who gives a d*&! what you do in the privacy of your own home?” Or to reduce it to a question I received some years ago from a student, “who cares?” The answer to the last question is this: if you identify as traditionally masculine or feminine or “straight,” for whatever the reason, you have a much easier time in our society than if you do not. Sometimes having someone broadcast their sexual identity in your face gets tiresome. My point, though, is this: if you are deemed socially “abnormal,” it hurts, and it can be empowering to say, unabashedly and unashamedly, “this is me!”

Indeed, not being “normal” in any respect leads first to torment (think back to your own experiences in middle school, hey?), and then to ridicule and rejection. The big problem is that being different can get you killed (e.g., Matthew Shepard, Brandon Teena, hundreds of thousands of folks without white skin, Jews ... Jesus, alas, we do not want for examples in history). So the answer to the question “who cares?” is “those folks who are more likely to suffer,” as well as the people who love them. Although you might think you are pained reading this stuff, feminism and queer theory are really about ending human suffering. That’s really what it comes down to, folks: people suffer and die because they are “different.” If there is a tacit ethical teaching to this literature, it is the lesson of tolerance.

Feminism and queer theory concern thinking about ways to keep people from getting hurt because they are not what society deems “normal” in regard to their gender and their sexual desire. Millions of folks live realities that are fraught with pain and hardship, and only because they harbor a preference for someone of the same gender or sex (or of a different race, and so on). As we saw with heavy metal, popular music practices are a central way in which these issues are expressed and negotiated in our culture. For reasons we discussed with Attali and Adorno (the irreducible humanness of music, that “noise” factor), as a powerful form of human expression, music can be used to create a kind of force field for expressing, deconstructing, constructing, and establishing a gamut of identities. Music, in other words, can unsettle our gendered and sexual identities (e.g., glam rock; queercore) as much as it can reestablish or reinforce them (e.g., Enya; Nas).

As we tread into this territory I need to underscore a few things about the ultimate purpose for assigning this material. Although it may appear at times your goodly instructor is endorsing or promoting this or that approach, requiring readings and lecturing on queer theory is not to be taken as an ENDORSEMENT or propaganda for joining some sort of Gay Borg or ominous Lesbo Deathstar (nor does lecturing on materialism entreat you to become a socialist). Exposing you to this material, or any discussion of non-straight sexual identity, is not designed to “convert” you; it’s not, in other words, sermonic. Rather, it’s functionally informative AND designed to challenge settled, “normal” beliefs about what is and isn’t appropriate in our society (indeed, what is or is not appropriate to discuss in the classroom!). You can think about it this way: the classroom should be the opposite of the church, synagogue, or mosque. In class, we challenge our settled ideas about normalcy and look beyond deity or the physical sciences for alternative explanations for social practices. In the house of God, we reaffirm and reestablish our settled ideas and beliefs. And in some ways, you cannot have the latter without the former.

Finally, I recognize this message is crafted for a “straight” audience, so let me give a shout-out to those among you who are forced to switch codes in the classroom (which, as you well know, is also almost always oriented to the “hetero” world): if you do not identify as “normal,” welcome. I hope the readings and lectures on identity -- sex, gender, and sexual orientation -- are affirming and that the classroom is a safe space in which you see your reality represented.

Now, mindful of the audience of Inside Higher Ed, I needn't detail at any length why this e-mail makes me cringe. It represents my frame of mind, worried about student hostility toward my assistants and (however unrealistically) worried about losing my job. I shared my e-mail message with colleagues, and my friend Ken Rufo detailed the teaching pickle it created better than I can:

Here’s a philosophy-of-pedagogy question, one that I confront quite a bit, and am always unsure about negotiating. The letter . . . indicates the problems with difference from a fairly conventional, liberal perspective. But this conceptualization of difference isn’t exactly simpatico with a lot of [the theory you teach and publish].... you’ve invested a lot of time and effort making a case for [the value of psychoanalytic theory] to rhetorical studies, and so I wonder how you negotiate the complexities of a certain worldview with the necessities of teaching, or if you feel any tension there at all?

What my e-mail does, in other words, is reestablish the same liberal-humanist politics of toleration that a lot of queer theory tries to challenge and dismantle: What if there is no common humanness to us? What if this binary logic of same and different is a causal factor in homophobic violence? Aren't these sorts of questions the kind posed by the thinkers we are reading for class?

After I sent the e-mail to the class and talked to my friends about it, I decided I would simply address the issue directly in class, turning the e-mail into a teaching exemplar. Before I could lecture about the e-mail, however, my teaching assistant lectured on Peaches, and she received a standing ovation when she finished. That reaction told me that perhaps the e-mail had a positive effect. On the following teaching day, I asked the students to bring a copy of my e-mail message to class, and we went through it together and discussed why it was a problematic message, locating binaries and troublesome assumptions. In my mind, this was the best way to "recover" an important teaching of queer theory while, at the same time, having my cake and eating it too.

I cannot say that going over the e-mail helped most of my students understand the problem with liberal humanist approaches to identity. Some of them understood what I meant when I confessed that I "retreated to humanism," while others clung tightly to their notions of a universal equality rooted in phallogocentrism. Nevertheless, I'm coming to the position that I should send variations of this e-mail to my class every time I teach queer theory. I feel slightly dirty doing it because the move represents a bait-and-switch pedagogy, but it may be the best way for me to adapt to my Texan classroom while retaining my tendency to personalize theory. I guess, then, I'm not radical enough. But I want to keep my job.

I'll admit as well that deep down there is a part of me that cannot let go of the notion that liberal humanism keeps some people alive -- a faith I'd like to think has some affinity with Spivak's notion of momentary solidarity in "strategic essentialism" for social and political action. I say I'd like to think it has an affinity, but perhaps I'm more sheepish and cowardly than I'd like to admit? Nevertheless, institutional pressures, the increasing erosion of academic freedom, the decay of tenure protections, the general cultural hostility toward the professoriate, parental and alumni demands and the PTA-ification of the college and university, and the consumerist drive-thru window attitude about teaching that some students harbor -- these trends collectively suggest that the teach-it-and-then-deconstruct-it approach may be the baby bear's porridge pedagogy of our time.


Joshua Gunn is assistant professor of communication studies at the University of Texas at Austin.
