Cultural studies

Ryan Gosling pick-up line meme reaches academe


Satirical blogs explore whether a Hollywood sex symbol can make academic pick-up lines seem smooth.

Not So Foreign Languages


Citing demographic and pedagogic trends, a growing number of colleges rename departments "world" or "modern" languages.

Approach and Avoid

In 1939, the French anthropologist Michel Leiris published a memoir called Manhood in which he undertook an inventory of his own failures, incapacities, physical defects, bad habits, and psychosexual quirks. It is a triumph of abject self-consciousness. And the subtitle, “A Journey from Childhood into the Fierce Order of Virility,” seems to heighten the cruelty of the author’s self-mockery. Leiris portrays himself as a wretched specimen: machismo’s negation.

But in fact the title was not ironic, or at least not merely ironic. It was a claim to victory. “Whoever despises himself, still respects himself as one who despises,” as Nietzsche put it. In an essay Leiris wrote when the book was reissued after World War II, he described it as an effort to turn writing into a sort of bullfight: “To expose certain obsessions of an emotional or sexual nature, to admit publicly to certain deficiencies or dismays was, for the author, the means – crude, no doubt, but which he entrusts to others, hoping to see it improved – of introducing even the shadow of the bull’s horn into a literary work.”

By that standard, Leiris made the most broodingly taciturn character in Hemingway look like a total wuss.

The comment about passing along a technique to others -- “hoping to see it improved” -- now seems cringe-making in its own way. Leiris was addressing a small audience consisting mainly of other writers. The prospect of reality TV, online confessionals, or the industrialized production of memoirs would never have crossed his mind. He hoped his literary method -- a kind of systematic violation of the author's own privacy -- would develop as others experimented with it. Instead, the delivery systems have improved. They form part of the landscape Wayne Koestenbaum surveys in Humiliation, the latest volume in Picador’s Big Ideas/Small Books series.

Koestenbaum, a poet and essayist, is a professor of English at the City University of New York Graduate Center and a visiting professor in the painting department of the Yale School of Art. The book is an assemblage of aphoristic fragments, notes on American popular culture and its cult of celebrity, and reflections on the psychological and social dynamics of humiliation – with a few glances at how writing, or even language itself, can expose the self to disgrace. It’s unsystematic, but in a good way. Just because the author never quotes Erving Goffman or William Ian Miller is no reason to think they aren’t on his mind. “I’m writing this book,” he says early on, “in order to figure out – for my own life’s sake – why humiliation is, for me, an engine, a catalyst, a cautionary tale, a numinous scene, producing sparks and showers…. Any topic, however distressing, can become an intellectual romance. Gradually approach it. Back away. Tentatively return.”

The experience of humiliation is inevitable, short of a life spent in solitary confinement, and I suppose everyone ends up dishing it out as well as taking it, sooner or later. But that does not make the topic universally interesting. The idea of reading (let alone writing) almost two hundred pages on the subject will strike many people as strange or revolting. William James distinguished between “healthy mindedness” (the temperament inclined to “settl[ing] scores with the more evil aspects of the universe by systematically declining to lay them to heart or make much of them…. or even, on occasion, by denying outright that they exist”) and “sick souls” (which “cannot so swiftly throw off the burden of the consciousness of evil, but are congenitally fated to suffer from its presence”). Koestenbaum’s readers are going to come from just one side of that divide.

But then, one of James’s points is that the sick soul tends to see things more clearly than the robust cluelessness of the healthy-minded ever permits. As a gay writer -- and one who, moreover, was taken to be a girl when he was young, and told that he looked like Woody Allen as an adult -- Koestenbaum has a kind of sonar for detecting plumes of humiliation beneath the surface of ordinary life.

He coins an expression to name “the somberness, or deadness, that appears on the human face when it has ceased to entertain the possibility that another person exists.” He calls it the Jim Crow gaze – the look in the eyes of a lynching party in group photos from the early 20th century, for example. But racial hatred is secondary to “the willingness to desubjectify the other person” – or, as Koestenbaum puts it more sharply, “to treat someone else as garbage.” What makes this gaze especially horrific is that the person wearing it can also be smiling. (The soldier giving her thumbs-up gesture while standing next to naked, hooded prisoners at Abu Ghraib.) The smile “attests to deadness ... you are humiliated by the refusal, evident in the aggressor’s eyes, to see you as sympathetic, to see you as a worthy, equal subject.”

Deliberate and violent degradation is the extreme case. But the dead-eyed look and the smirk of contempt are common enough to make humiliation a kind of background radiation of everyday social existence, one intensified through digital communication “by virtue of its impersonality…its stealth attack.” An embarrassing moment in private becomes a humiliating experience forever if it goes viral on YouTube.

“The Internet is the highway of humiliation,” Koestenbaum writes. “Its purpose is to humiliate time, to turn information (and the pursuit of information) into humiliation.” This seems overstated, but true. The thought of Google owning everyone’s search histories is deeply unsettling. The sense of privacy may die off completely one day, but for now the mass media, and reality TV most of all, work to document its final twitches of agony. “Many forms of entertainment harbor this ungenerous wish: to humiliate the audience and to humiliate the performer, all of us lowered into the same (supposedly pleasurable) mosh pit.”

A study of humiliation containing no element of confession would be a nerveless book indeed. Koestenbaum is, like Leiris, a brave writer. The autobiographical portions of the book are unflinching, though flinch-inducing. There are certain pages here that, once read, cannot be unread, including one that involves amputee porn. No disrespect to amputees intended, and the human capacity to eroticize is probably boundless; but Koestenbaum describes a practice that it never would have occurred to me was possible. I was perfectly content not knowing about it, but it's too late to get the image out of my head now.

Humiliation counts on “shame’s power to undo boundaries between individuals,” which is also something creativity does. That phrase comes from Koestenbaum’s tribute to the late Eve Kosofsky Sedgwick towards the end of the book. He evokes the memory of her friendship at least as much as the importance of her foundational work in queer theory – though on reflection, I’m not so sure it makes sense to counterpose them. Sedgwick’s ideas permeate the book; she was, like Koestenbaum, also a poet; and Humiliation may owe something to A Dialogue on Love, the most intimate of her writings.

But it’s more reckless and disturbing, because the author plays off his audience's own recollections of humiliation, and even with the reader's capacity for disgust. There’s a kind of crazy grace to Koestenbaum’s writing. He moves like a matador working the bull into ever greater rage -- then steps out of the path of danger by the shortest possible distance, at the last possible moment, with a flourish.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

It's a Jersey Thing

No one would think of the call for papers as a literary genre. But the CFP can be distinguished from the usual run of academic memoranda by its appeal to the reader’s curiosity, ambition, and capacity to daydream -- and occasionally by its test of one’s power to suspend disbelief.

A few days ago, I came across the Facebook page for the University of Chicago Conference on Jersey Shore Studies. It appealed for abstracts of 500 to 600 words for “the first conference to interrogate the landmark MTV reality television show ‘Jersey Shore,’ ” to be held in October.

The program, which debuted in late 2009, follows one of the standard templates of reality TV, “young people living in a group house.” Video cameras document the usual inebriation, hot-tub sex, personal conflicts, and arias of bleepable language. What sets the show apart, I understand, is its exploration of “the guido lifestyle,” in which hair gel and year-round full-body tanning play an important part. Female guidos call themselves “guidettes.” The National Italian-American Foundation is not amused, not one little bit. Be that as it may, “Jersey Shore” is MTV’s highest rated show. Its fourth season begins in August.

“The fact that this conference is occurring may very well be a sign of the downfall of Western civilization,” said one Facebook commentator. Another just wrote, “oh dear god why.” Then again, 706 users have indicated that they plan to attend. A Facebook commitment is not one of society’s stronger bonds; still, this suggests rather more visibility than most academic conferences receive. And at least three people have chimed in to say that they were already engaged in "Jersey Shore" scholarship and are glad to know about the conference. Clearly the field is making great strides.

The idea of a conference on "Jersey Shore" being held at the very institution where Allan Bloom wrote The Closing of the American Mind seems just a little too good to be true. (See also Jürgen Habermas’s Twitter account.) To find out how serious the whole thing might be, I got in touch with David Showalter, whose email address appeared on the CFP.

We spoke by phone. The short answer is, perfectly serious. Showalter has just finished his junior year as an undergraduate in the tutorial studies program, which is described by the University of Chicago as “an alternative for students who propose a coherent course of studies that clearly will not fit within a regular major.” When he came up with the idea for the conference about a year ago, he says, friends thought he was joking or being eccentric. But he has received $3,000 in funding, and has received about 10 abstracts so far.

Before anyone gets too excited, let me make clear that Showalter’s pursuit of “a coherent course of studies that clearly will not fit within a regular major” does not mean that the University of Chicago is giving him credit for watching MTV.

“I don't study popular culture in my normal academic program,” Showalter told me. “My course is on issues of crime and punishment, particularly criminal law surrounding vice activities and sex offenses. I've come to an awareness of the literature on reality television almost wholly through my fascination with 'Jersey Shore' and the books I've found in the University of Chicago library system and through interlibrary loan. So I can't claim any sort of authoritative knowledge about the state of the discipline of television studies, or any expertise on the existing literature.”

Please note the earnestness. Before saying anything more about the conference, or about "Jersey Shore" itself for that matter, it bears stressing that at no point in our exchanges by phone or e-mail did Showalter seem to manifest any of the so-called “pop culture irony” that has become such a prevalent mode of self-protecting self-constitution in an era of almost unbearably dense mass-media saturation. It comes in many finely graded variants. And after 20 years of it, all of them make me tired. Showalter enjoys the show and wants to think about it -- he doesn’t merely “enjoy” the show and want to “think” about it.

Demurrals notwithstanding, Showalter quickly shows an extensive familiarity with the media-studies and social-science literature on reality television. "Many criticize 'Teen Mom' (another MTV show) for glamorizing teenage motherhood," he notes in an e-mail message, "and thereby encouraging teenagers to become pregnant. But a report by the Public Religion Research Institute claims that people who watch shows like 'Teen Mom' are actually more supportive of abortion rights and believe abortion to be morally acceptable at higher rates than non-viewers. The relationship between reality television and its viewers is much more complicated than simple approbation of the content of the shows, and so viewer response data can be quite useful in adding nuance to that picture."

Now, to be honest, I had never even heard of "Teen Mom," let alone considered its social impact. But somebody needs to do it. The possibility that "Jersey Shore" merits careful thought seems rather counterintuitive, but Showalter is clearly someone to make the case. His conference will be serious, not a festival of agnostic hipness.

But what is there to be serious about? It turns out that a few sprouts of "Jersey Shore" studies had already appeared before Showalter first circulated his CFP. The earliest entry in some future bibliography of the field will probably be “Sailing Away from The Jersey Shore: Ethnic Nullification and Sights of the Italian American Female Body from Connie Francis to Lady Gaga,” a paper by Roseanne Giannini Quinn, a lecturer in English at Santa Clara University. It was delivered at the National Women’s Studies Association conference in Denver in November.

Seriousness in this case meant disapproval. The paper has not been published, nor was I able to obtain a copy from Quinn, but her abstract in the conference program says it “takes as its starting point the degrading representation of Italian American women in the current popular television reality show 'The Jersey Shore,' ” using this as a point of departure to consider various “feminist and gay cultural icons” who both challenged “destructive stereotypes as well as often participated in the mass media reinforcement of them.”

And in May, the University of Oklahoma offered an online intersession course called “Jersey Shore-GRC: Depictions of gender, race and class on the shore,” which will be repeated in August. The instructor is Sarah E. Barry, a graduate teaching assistant for first-year English composition. The catalog description, while useful as a survey of likely topics in “Jersey Shore” studies, is altogether horrifying as a piece of prose.

Here it is in full, and minus any [sic]s: “We will look at European, specifically the Italian diaspora and how American’s response to the nations globalization and subsequent cultural contact constructed the image of the Italian-American, beginning in the 19th century and how that compares to images and personalities of the Jersey Shore cast. Additionally we will explore how aspects of critical theory, specifically gender studies, understanding of the self and the ‘Other’, class conflict and racial issues come together to reflect how popular culture views and interprets socio-economic and socio-historic conditions and how the youth is responding to these conditions. Finally, we will look at the impact this phenomenon is having on society and youth identity formation.”

Oh well, cohesive syntax isn’t everything. While trying repeatedly and unsuccessfully to contact Barry to find out how the course had gone, I did manage to get in touch with one of the featured speakers now confirmed for the University of Chicago conference. Alison Hearn, an associate professor of information and media studies at the University of Western Ontario, is at work on a book called Real Incorporated: Explorations in Reality Television and Contemporary Visual Culture.

“I have not written about ‘Jersey Shore,’ per se,” she told me by e-mail, “but will for this conference.” She described her area of interest as “the relationship of reality television to broader political, cultural and economic concerns -- specifically the changing world of work and its impact on processes of self-making, or, more aptly in a world marked by promotional concerns, self-branding.”

Certainly the denizens of “Jersey Shore” have developed some expertise in the commodification of lifestyle and personal identity. They endorse various products (alcohol, clothing, tanning methods) and have book deals. In papers from the International Journal of Media and Cultural Politics and the Journal of Consumer Culture, Hearn writes about “the spectacularization of self” that is both fostered and manifested by reality TV, among other media forms.

The audience participates in the “spectacularization” just as much as the “stars.” (You, too, can be a guido.) In one of her papers, Hearn describes meeting with a group of teenagers in Boston who show themselves eager to explain just how suitable their personalities make them as potential cast members for a reality TV program. Reflecting on this encounter, she cites a passage from one of Jean Baudrillard’s later essays: “We are no longer alienated and passive spectators, but interactive extras; we are the meek, lyophilized members of this huge ‘reality show.’ ”

Here, a gloss on Baudrillard's more obscure word-choice proves illuminating: “Lyophilized, meaning ‘freeze-dried,’ seems an apt description of the responses I receive that day in Boston,” writes Hearn; “they are pre-set, freeze-dried presentations of self, molded by prior knowledge of the dictates of the reality television genre and deployed strategically to garner attention, and potentially, profit.”

Abstracts for the Chicago conference are welcome through August 1. Showalter tells me he is receiving no academic credit for the undertaking, which has a shoestring budget. He received $2,580 from The Uncommon Fund, a student-run initiative at U of C to “support creative ideas that may otherwise not be implemented at all.” Various academic departments have made verbal commitments to lend modest support this fall, though the paperwork remains to be done.

Has anyone from “Jersey Shore” -- whether in the cast or on the production crew -- expressed any interest in the conference so far?

“I wish!” he answers. “It would be fascinating to get their perspectives on the conception and development of the show. I’d also like to hear their answers to some of the criticisms from Italian-American groups and from officials in New Jersey who complain that the cast members aren’t even from the area.” It turns out most of them are actually New Yorkers.

The show often generates an intense, even visceral, response. (I have never gotten through more than a few minutes of it, but did watch as the residents of South Park formed an alliance with Al Qaeda to drive out the Jerseyites who were invading their town.) Then again, any cultural phenomenon capable of generating both strong negative affect and a tremendous revenue stream may prove “good to think with,” to borrow Claude Lévi-Strauss’s phrase.

“If anything,” Showalter told me, “the vehemence aimed at ‘Jersey Shore’ has only made me more interested in watching the show closely. I've also enjoyed observing the cast members strike out beyond the series into other markets and products. I think Snooki's novel, A Shore Thing, is a great example of this; what appears at first to be a purely empty money-maker actually contains a rather complex and frenetic plot line, not to mention all kinds of subliminal autocriticism from Snooki herself. The universe of endorsements and branded products that has grown up around ‘Jersey Shore’ has made it a much more rich and engaging phenomenon.”

During an interview with the Maroon, U of C’s student paper, Showalter noted “the danger of people just taking Pop-Culture Phenomenon X and Obscure Author Y and trying to combine them together.” So far, abstracts for papers have been submitted by scholars working in English, media studies, sociology, and gender studies.

“There’s been nothing on issues of ethnicity and race,” he told me, “which is really surprising.” It certainly is. If anything, it seems like the topic for a whole panel. In an oft-cited remark, “Jersey Shore” cast member and littérateur Snooki has stated, “I’m not white…[I’m] tan.” Discuss.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

End Large Conferences

I’ll play Marc Antony. I have not come to praise large conferences, but to bury them. It is my opinion that mega humanities conferences are way past their sell-by date. For senior faculty the only reason to go is to schmooze with old friends; for junior faculty they are an onerous duty, and for graduate students they are a rip-off for which professional organizations ought to be collectively ashamed.

First codicil: I speak exclusively of humanities conferences, as they are the only ones I know firsthand. Friends in computing and the sciences tell me that collaborative efforts arise from their conferences. I’m willing to believe them. Maybe it’s a cultural thing. Most humanities people find it so hard to collaborate that their wills stipulate that their notes go with them to the grave.

Second codicil: I have only myself to blame for recent travails. I didn't need to go to my unnamed conference, but I got it into my head that it would be fun. I was wrong. It serves me right for violating my principles.

Five years ago I concluded that humanities conferences were out of touch with the times and vowed to attend only smaller regional meetings with less cachet, but more satisfaction. But I didn’t listen to me. Instead I spent four days and a considerable wad of cash jostling among a throng of over three thousand. I returned home feeling akin to Ponce de León, who sought the Fountain of Youth and found mostly dismal swampland. Sound harsh? See if any of these observations resonate with your own.

Problem One: Outmoded Presentations

We live in the communications age, but the memo apparently never circulated among those studying the liberal arts. For reasons arcane and mysterious, humanities scholars still read papers. That’s tedious enough at a small conference where one might attend six three-paper presentations. At my recent conference, sessions commenced at 8 a.m. and ran past 10 p.m. One could have conceivably attended 30 sessions and heard 90 or more papers, though the only ones with the stamina to attend more than six or seven sessions were either posturing or desperate.

I wanted my four-day sojourn to introduce me to new ideas, concepts, and teaching modules, but the reality of such a grueling schedule is that I was running on fumes by the end of day one. It would have helped if presenters took advantage of new technology, but things seldom got more flash than PowerPoint, a program that, alas, seems to encourage more reading. Let me reiterate something I’ve said for years: the death penalty should apply to those who read anything from a PowerPoint slide other than a direct quote. It's an academic conference, for crying out loud; assume your audience is reasonably proficient at reading! Seriously, does anyone need to fly across the country to listen to a paper? Why not do as science conferences have done for years: post papers online and gather to have a serious discussion of those papers?

The mind-numbing tedium of being read to for four days is exacerbated by the fact that many humanities scholars have little idea about the differences between hearing and reading. If you construct a paper that’s so highly nuanced that understanding it rests upon subtle turns of phrase or complicated linguistic shifts, do not look up from your paper with a wan smile indicating you are enamored of your own cleverness; go back to your room and rewrite the damn thing. Audience, clarity, and coherence are pretty much the Big Three for speech and composition, unless one's audience is the International Mindreaders' Society. By the way, is there something wrong with using a map, providing a chart, or summarizing a work that few in the room are likely to have read? And do bother to tell me why your paper matters.

I actually heard several very exciting papers, but most of the offerings were dreadful. Note to young scholars: stop relying on the Internet and check out journals that predate 1995 before you proclaim a “discovery.” And if you really want to stand out, work on your shtick. Guess which papers I remember? Yep -- those in which the presenter did more than read to me.

Critical note to young scholars: Want to turn off everyone in the room? Be the person who doesn’t think that the 20-minute limit applies to you. Nothing says "non-collegial" more clearly.

Problem Two: Expense

Another reason to rethink conferences is that they cost an arm and a leg to attend. I had partial funding from my university because I was presenting -- and no, I bloody well did not read my paper -- but I was still out of pocket for quite a hunk of cash. If you attend a humanities conference and want to stay anywhere near the actual site of the event, plan on $150 per night for lodging in a soulless franchise hotel with windowless conference rooms and quirky technology, $20 per day for Internet access, another $200 for conference fees, roughly $500 for airfare, at least $50 for taxis to and from the airport -- almost no U.S. city has a convenient shuttle service anymore -- and money for whatever you plan on eating.

Budget plenty for the latter if your conference is in what is glibly called a Destination City. That’s shorthand for a theme area marketing itself as unique, though it’s actually a slice of Generica surrounded by shops and restaurants identical to those found in suburban malls in every way except one: captive audiences equal higher prices. (One small example: the Starbucks inside the pedestrian precinct at my hotel charged a buck more per cup than the one on the street 100 yards away.) Do the math and you can see that you can easily drop a few grand on a megaconference. (That’s what some adjuncts are paid per course!)

An immediate cost-saving adjustment would be to confine conferences to airline hub cities such as New York, Chicago, Los Angeles, Atlanta, and Houston. The moment the conference locates to a (not my term) "second-tier" city, allot another few hundred dollars for "connecting flights," a term used by the airline industry because it sounds nicer than saying you’ll spend six hours waiting in a hub, after you’ve sprinted through the airport like Usain Bolt for your next flight, found the gate closed, and retreated to the rebooking counter.

Problem Three: Victimized Grad Students

I'm a parsimonious Scot who resents spending money on boring hotels and lousy food, but I can afford it when I have to. Grad students can’t. A major way in which megaconferences have changed in the past several decades is that there’s considerably less balance between senior scholars, junior colleagues, and graduate students. (Senior scholars used to accompany the latter two in a mentor capacity.) Now there is just a smattering of senior and junior scholars, and they’re often holed up in hotel suites conducting interviews. Whenever they can, search committee members flee the conference and rendezvous with old friends. They might attend a session or two. Unless they have to be there, there aren’t many junior colleagues in attendance at all because they're busy getting material into publication and they can meet presentation expectations at cheaper regional meetings, or save their dough and go to prestigious (-sounding) international gatherings.

So who’s left? Graduate students. Lots of graduate students. So many that conservationists would recommend culling the herd if they were wild mustangs. Grad students have always gone to conferences in hopes of making their mark, attracting attention, and meeting people who can help them advance. That was the way it was done -- 20 years ago. Now network opportunities are slimmer. Whom do they meet? Mostly other grad students, often those massed outside of interview rooms.

Of all the antiquated things about large conferences, the "cattle call" interview is the most perverse. These were barbaric back in the days in which there were jobs; now they’re simply cruel. At least a third of attendees at my conference were grad students from a single discipline: English. As has been discussed many times on this site, most of them shouldn't be in grad school in the first place. How many of the thousand-plus English grad students can realistically hope to get an academic job of any sort?

The Modern Language Association predicts that only 900 English jobs will come open for all of 2011. That’s 900 in all specialties of English, the bulk of which will be in writing and rhetoric, not Austen and Proust. Will a fifth of those at the conference get a job? The odds are long. It's probably more like half of that, and if we're talking about a good job, slice it in half once more. So why ask strapped grad students to attend expensive conferences for 15-minute preliminary interviews? Do a telephone interview, for heaven’s sake; it’s kinder on both grad students and search committees.

As I did as a grad student, many young hopefuls pooled resources and economized where they could, but the sad truth is that the vast majority of attendees spent a small fortune on a gamble whose odds aren't much greater than buying lottery tickets. Are associations playing the role of enabler to grad student delusions? Yes. Here’s another thought: Instead of holding a big conference, sponsor a teleconference. Charge a fee for uploads, but give speakers one-year access to the URL, which they can make available to potential employers. Use the savings to the association to lobby for more tenure-track faculty.

Problem Four: No-Shows

You spend lots of money, you sit through desultory talks, and then head off to the one or two sessions that made you want to attend the conference in the first place. What do you find? It’s been canceled because only one of the presenters showed up, and that paper was combined with several others from sessions that suffered the same fate. Didn’t you see the 3x5 card tacked to the conference bulletin board?

As noted above, I’m in favor of putting large conferences to rest. But if we insist on having them, let’s at least make sure they’re as advertised. O.K., things do happen, but in most cases missing presenters are simply AWOL. I know it smacks of McCarthyism, but I’ve come to support the idea of a data bank of no-shows that employers, conference planners, and deans can check.

Problem Five: Urban Sprawl

What’s the point of a conference that’s so big it’s inaccessible? I walked between two different hotels to attend sessions and pored over a Britannica-sized program to locate them. Conference attendees were housed in four "official" hotels and untold numbers of others. With round-the-clock sessions and decentralization, the few networking opportunities that did exist were logistically difficult. It took me two entire days to find my old friends, let alone new folks I wanted to engage. I met two interesting people at the airport. I never saw them again.

In Praise of Small Conferences

There are other problems I’ll leave for now, including the gnawing suspicion that some big conferences have become sinecures for "insiders" who have become "players" within associations. Let’s just say that there is a serious disconnect between how the big conferences operate and what makes sense in the changing world of academe.

Teleconferences with real-time discussion groups and online forums would be one good starting point for reform; providing more resources for regional and local conferences would be another. Small gatherings have issues of their own -- no-shows, sparsely attended sessions, overreliance on volunteers -- but they compensate by offering intimacy, good value, face-to-face feedback, and easier opportunities to network. It's time to give these the cachet they deserve. The big conference is like a one-size-fits-all t-shirt; it simply doesn’t flatter most people. I’m done. For real. Unless I get funding for an exotic overseas meeting. (Just kidding!)

Author/s: 
Rob Weir
Author's email: 
info@insidehighered.com

Rob Weir, who writes the "Instant Mentor" column for Inside Higher Ed's Career Advice section, has published six books and numerous articles on social and cultural history, and has been cited for excellence in teaching on numerous occasions during his 20 years in the college classroom.

Why I Am Not Radical Enough

As a teacher of rhetorical studies, I've been trained to think about the differences between audiences and how to adapt one's messages to address those differences. Of course, having earned one's credentials in "the art of persuasion" and (presumably) possessing the intellectual tools of audience adaptation doesn't necessarily mean one can do it well, and last fall I really stepped in it. What have I learned? Sometimes it is permissible to retreat from a more straightforward -- if not radical -- introduction to queer theory to a classic, liberal politics of toleration or humanism when teaching undergraduates because we no longer live in an environment that protects academic freedom. Although Kurt Cobain did once sing, "what else should I say/everyone is gay," sometimes students are not ready to interrogate what that means, and they'll make their parents call deans and chairs attempting to get you fired if you try to teach them.

Here's the set-up: For three years I worked as an assistant professor at Louisiana State University in Baton Rouge. Having moved there from the University of Minnesota (where I did my graduate work), adapting to Louisiana students took some time, and the culture shock I experienced was intense. Gradually I acclimated to the sight of public, drunken nudity and that charming, Southern hostility toward my so-called Midwestern political correctness. My experiences in Louisiana taught me that although the students claimed a conservative, religious politics, they were quite familiar with and accepting of "alternative lifestyles," and I often had to resort to pretty wild examples in the classroom to keep their attention and to get them to engage queer theory beyond the level of "whatever!" and "so what?"

Friends and colleagues were often surprised when I told them that my students took to "controversial" theoretical perspectives, such as the critical work of Judith Butler on gender and sexual identity, quite well. One semester -- as I was teaching the Kinsey scale to supplement Laura Mulvey's theory of cinematic pleasure -- I just asked my students: "Y'all don't seem too bothered by this material; why is that?" One of my repeat-students said in a sardonic tone, "Dude: Mardi Gras?" My Louisiana students had "seen it all," and probably from a very young age many of them learned how to hang up their hetero-hang-ups, at least for a week or two before Ash Wednesday and Lent so that they could properly enjoy all the parades and street parties.

Obviously, I had a lot of adapting to do when I took my second job, at the University of Texas at Austin. I still do not have a good "feel" for the students at my new university, but I think in general it is fair to describe the students here as more right-leaning politically and more conservative in their thinking about lifestyle. Regardless, to my delight and horror, as I began teaching the queer theory unit of my Rhetoric and Popular Music course I heard the same wild examples exiting my mouth in seemingly automatic fits of charismatic teach-o-mania. I still assigned readings like Cynthia Fuchs' fabulous essay on queercore, "If I Had a Dick: Queers, Punks, and Alternative Acts." But I quickly learned that when one combines reading material that attempts to unravel binaries and my own ambiguously (and strategically) queer teaching persona in a "Bush Country" classroom, one should expect a little hostility. I expected it, really I did. I simply did not expect to catch hell from a parent.

The day after I lectured on heterosexist norms in heavy metal music videos, I was summoned to the principal's office to get a talking to. Apparently a student's mother was among the sea of faces in my large lecture class that day, and was expressly appalled at my queer "agenda." In an email that my chair shared with me, the mom said that it was obvious I was attracted to both men and women and therefore "no one is safe." For the class I developed a field trip ethnography project at a well-known Austin 18-and-up punk club. This parent said that I forced my students to go to a "gay bar." Ultimately, I was characterized as unprofessional, as teaching filth, and as trying to recruit students for the "gay cause."

Needless to say, my meeting with the chair was painful and I was fearful, although one couldn't blame him. He gave the best "you have academic freedom, but" talk I've ever heard. Even so, I was told the story about "that professor" who was fired from such and such a department for "creating a hostile classroom environment." I was told to de-personalize my teaching and reminded that I did not have tenure yet and that teaching evaluations were very important to the tenure review process.

Since that meeting I have changed my teaching a bit and am more mindful of the power students and parents have to take out an assistant professor whom they do not like, especially under the aegis of sexual harassment. We juniors should also remember that many of our deans are (necessarily) insulated from the classroom and by force of situation are often more sympathetic to students and parents in our age of the "culture wars" and "zero tolerance."

Immediately after the incident, I was worried about protecting my teaching assistants. One of them was slated to deliver a lecture on the interchangeability of sex organs in the music and art of Peaches, a controversial and polyamorous figure who had an underground dance hit with "Shake Yer Dix (Shake Yer Tits)." Although I knew I was a bit oversensitive after the talking-to with my chair, I decided to send a preemptive e-mail message to the 130-student class in an effort to spare us more grief. Here is the text of that message, edited to protect the innocent and please the legal eagles:

Greetings Class,

Your resident instructor here with some background commentary on your readings for Tuesday, as they directly challenge cultural assumptions of “normalcy.” We will be discussing the field of “queer theory,” which grew out of the heated discussions of feminism in the 1980s and 1990s regarding sexual desire and the relationship between social identity and biology. We’ll spend some time discussing the term “queer” itself -- which is confusing -- but for the moment let us simplify a lot of the concern of queer theory to a series of questions: to what extent do biology and genetics form a materialist basis for gender and sexual identity? In other words, are we born gay, straight, or somewhere between those two poles? Where do the chemicals and biological predispositions end and where does culture begin? Why is sexual identity such an obsession in the United States (e.g., what’s the big deal about the proposed Texas amendment to ban gay marriage)? Finally, why are we so interested as a culture in these questions?

The latter question may resonate somewhat. To put it like my own granny does, “who gives a d*&! what you do in the privacy of your own home?” Or to reduce it to a question I received some years ago from a student, “who cares?” The answer to the last question is this: if you identify as traditionally masculine or feminine or “straight,” for whatever the reason, you have a much easier time in our society than if you do not. Sometimes having someone broadcast their sexual identity in your face gets tiresome. My point, though, is this: If you are deemed socially “abnormal,” it hurts, and it can be empowering to say, unabashedly and unashamedly, “this is me!”

Indeed, not being “normal” in any respect first leads to torment (think back to your own experiences in middle school, hey?), and then ridicule and rejection. The big problem is that being different can get you killed (e.g., Matthew Shepard, Brandon Teena, hundreds of thousands of folks without white skin, Jews ... Jesus, alas, we are not wanting for examples in history). So the answer to the question “who cares?” is “those folks who are more likely to suffer," as well as the people who love them. Although you might think you are pained reading this stuff, feminism and queer theory are really about ending human suffering. That’s really what it comes down to folks: people suffer and die because they are “different.” If there is a tacit ethical teaching to this literature, it is the lesson of tolerance.

Feminism and queer theory concern thinking about ways to keep people from getting hurt because they are not what society deems “normal” in regard to their gender and their sexual desire. Millions of folks live realities that are fraught with pain and hardship, simply because they harbor a preference for someone of the same gender or sex (or of a different race, and so on). As we saw with heavy metal, popular music practices are a central way in which these issues are expressed and negotiated in our culture. For reasons we discussed with Attali and Adorno (the irreducible humanness of music, that “noise” factor), as a powerful form of human expression, music can be used to create a kind of force field for expressing, deconstructing, constructing, and establishing a gamut of identities. Music, in other words, can unsettle our gendered and sexual identities (e.g., glam rock; queercore) as much as it can reestablish or reinforce them (e.g., Enya; Nas).

As we tread into this territory I need to underscore a few things about the ultimate purpose for assigning this material. Although it may appear at times your goodly instructor is endorsing or promoting this or that approach, requiring readings and lecturing on queer theory is not to be taken as an ENDORSEMENT or propaganda for joining some sort of Gay Borg or ominous Lesbo Deathstar (nor does lecturing on materialism entreat you to become a socialist). Exposing you to this material, or any discussion of non-straight sexual identity, is not designed to “convert” you; it’s not, in other words, sermonic. Rather, it’s functionally informative AND designed to challenge settled, “normal” beliefs about what is and isn’t appropriate in our society (indeed, what is or is not appropriate to discuss in the classroom!). You can think about it this way: the classroom should be the opposite of the church, synagogue, or mosque. In class, we challenge our settled ideas about normalcy and look beyond deity or the physical sciences for alternative explanations for social practices. In the house of God, we reaffirm and reestablish our settled ideas and beliefs. And in some ways, you cannot have the latter without the former.

Finally, I recognize this message is crafted for a “straight” audience, so let me give a shout-out to those among you who are forced to switch codes in the classroom (which, as you well know, is also almost always oriented to the “hetero” world): if you do not identify as “normal,” welcome. I hope the readings and lectures on identity -- sex, gender, and sexual orientation -- are affirming and that the classroom is a safe space in which you see your reality represented.

Now, mindful of the audience of Inside Higher Ed, I needn't detail at any length why this e-mail makes me cringe. It represents my frame of mind, worried about student hostility toward my assistants and (however unrealistically) worried about losing my job. I shared my e-mail message with colleagues, and my friend Ken Rufo  detailed the teaching pickle it created better than I can:

Here’s a philosophy-of-pedagogy question, one that I confront quite a bit, and am always unsure about negotiating. The letter . . . indicates the problems with difference from a fairly conventional, liberal perspective. But this conceptualization of difference isn’t exactly simpatico with a lot of [the theory you teach and publish].... you’ve invested a lot of time and effort making a case for [the value of psychoanalytic theory] to rhetorical studies, and so I wonder how you negotiate the complexities of a certain worldview with the necessities of teaching, or if you feel any tension there at all?

What my e-mail does, in other words, is reestablish the same liberal-humanist politics of toleration that a lot of queer theory tries to challenge and dismantle: What if there is no common humanness to us?  What if this binary logic of same and different is a causal factor in homophobic violence? Aren't these sorts of questions the kind posed by the thinkers we are reading for class?

After I posted the email to the class and talked to my friends about it, I decided I would simply address the issue directly in class, turning the e-mail into a teaching exemplar. Before I could lecture about the e-mail, however, my teaching assistant lectured on Peaches, and she received a standing ovation when she finished. That reaction told me that perhaps the e-mail had a positive effect. On the following teaching day, I asked the students to bring a copy of my e-mail message to class, and we went through it together and discussed why it was a problematic message, locating binaries and troublesome assumptions. In my mind, this was the best way to "recover" an important teaching of queer theory while, at the same time, having my cake and eating it too.

I cannot say that going over the e-mail helped most of my students understand the problem with liberal humanist approaches to identity. Some of them understood what I meant when I confessed that I "retreated to humanism," while others clung tightly to their notions of a universal equality rooted in phallogocentrism. Nevertheless, I'm coming to the position that I should send variations of this e-mail to my class every time I teach queer theory. I feel slightly dirty doing it because the move represents a bait-and-switch pedagogy, but it may be the best way for me to adapt to my Texan classroom while retaining my tendency to personalize theory. I guess, then, I'm not radical enough. But I want to keep my job.

I'll admit as well that deep down there is a part of me that cannot let go of the notion that liberal humanism keeps some people alive -- a faith I'd like to think has some affinity to Spivak's notion of momentary solidarity in "strategic essentialism" for social and political action. I say I'd like to think it has an affinity, but perhaps I'm more sheepish and cowardly than I'd like to admit? Nevertheless, institutional pressures, the increasing erosion of academic freedom and the decay of tenure protections, the general cultural hostility toward the professoriate, parental and alumni demands and the PTA-ification of the college and university, and the consumerist drive-thru window attitude about teaching that some students harbor -- these trends collectively suggest that the teach-it-and-then-deconstruct-it approach may be the baby bear's porridge pedagogy of our time.

Author/s: 
Joshua Gunn
Author's email: 
info@insidehighered.com

Joshua Gunn is assistant professor of communication studies at the University of Texas at Austin.

Multiculturalism, Universalism, and the 21st Century Academy

The following essay was adapted from the author's keynote address at the Future of Minority Studies Summer Institute Colloquium, at Stanford University last month. Last week, Scott McLemee explored the colloquium in Intellectual Affairs.

Preamble: What Keeps Chancellors Up at Night?

Two years ago I attended a conference of presidents at which, among the many panel discussions on American Competitiveness (“The World is Flat”), Federal Science Funding, The Future of the Humanities, and the like, was one panel entitled: “What Keeps Presidents and Chancellors Up at Night?” Expecting to hear a great deal about the arms race in intercollegiate athletics -- absolutely a genuine concern -- I was rather surprised to hear instead about multiculturalism and what might be called its associated “culture wars.”

Of course, I shouldn’t have been surprised, as there had been so many high profile examples, from the public’s reaction to the University of North Carolina at Chapel Hill assigning the Qur’an as its first year shared reading to the media coverage of strife in Middle East studies at Columbia University. Moreover, I had just spent six years defending affirmative action at Michigan and three years in the midst of debates at Illinois on the campus mascot, Chief Illiniwek. Anyone in these positions long enough knows well that universities are like sponges for society’s tensions and that one way or another something will erupt on every campus that reflects the fraying of multicultural community and the state of “civil” society.

Whether it is in athletics or the student media, in the classroom or in campus organizations, tensions over religion, race, ethnicity, and sexuality, are powder kegs on our multicultural campuses -- as they are of course in our cities and towns. As one of my colleagues noted, conflicts, such as occurred at Duke recently, can happen on any one of our campuses in one form or another. At Syracuse, for example, we are overcoming the impact on our campus of the production of an entertainment television show, by a student-run station, that used caricatures of various groups as “humor.” As at Duke, when we go beyond finger pointing, these incidents alert us to our communal responsibilities, and to the work still to be done on our campuses and in our connected communities.

For not being surprised doesn’t mean we can stop talking about it. There is a crying need to take these kinds of incidents -- and they are indeed widespread -- seriously as symptoms of a society that is not comfortable with pluralism. I suggest that we address this state of affairs with the same deep thinking that we give to understanding how to respond to our increasingly “flat world,” for it is as much in our national interest. In fact, I suggest that thoughtful analyses of group dynamics and communal responsibility in a diverse society may actually help us better face the “flat world.” Instead of competitively fighting between ourselves for a shrinking piece of the pie -- whether in higher education or in our connected communities -- shouldn’t we learn to live and work together and find innovations that enlarge the pie? Wouldn’t that get us closer to fulfilling the agenda of universal human rights that lies at the foundation of a just and effective society?

Taking Groups Seriously

Many people’s reaction to these “culture wars” is to suggest that we all just turn our backs on groups altogether -- as when people call for a color-blind or culture-blind or gender-blind society. Not only do I see this as naïve (in the face of pervasive group dynamics and tensions), but also as missing the constructive role that groups must play in promoting a social justice agenda and building an effective multicultural community. Taking groups seriously can be constructive both for those who are on the “outside” trying to get in to a particular community and for those who are more securely established as insiders. This is especially true in a world full of insiders and outsiders -- and we all occupy both positions -- in which as outsiders we could benefit from seeing more personal possibilities (on the inside) and as insiders we could contribute by taking more social responsibility (for those outside). And, like it or not, we need to build effective multicultural communities to be competitive and just, so we better start taking groups seriously.

We first need to recognize some “facts” of social life and the pervasive disparities in our pluralistic, insider-outsider world, and find an avenue to constructively confront them. Here is where it helps to know something about the psychology of multiculturalism (and of insiders and outsiders) and to work with it, rather than remain oblivious to its powerful impact. For, in the midst of this fraying of community, and widening of the gap between those who belong and those who don’t, it is easy to miss the fundamental interdependence of individuals and community. Easy to miss the truth in the oft-repeated notion that if we don’t all hang together we will all hang separately.

So, in the hopes of starting this discussion, I turn now, as a social psychologist and educator, but also as a chancellor in charge of a multicultural campus community, to consider why and how we go wrong in our group dynamics, and what we might do differently to face our challenges head on.

The Social Embedding of Individuality

To see how the social embedding of individual human potential -- which I will abbreviate from now on as “individuality” -- works, it is important to start from the premise that self-construals -- who we think we are and what we see as possible for our selves -- matter. But, we do not think about our selves in a social vacuum, either.

Our self-construals are embedded within and shaped by critical cultural practices and social organizations that constitute a matrix of opportunities and constraints in our daily lives. Over the long course of history, for example, numerous different cultures and societies have expressed more concern about the educational and career paths of boys than girls.

These self-construals are also embedded in a matrix of critical interpersonal relations through which we garner diagnostic input from other people about our selves. Other people serve as sources of social comparison, including those whom we take on as models or idols. Importantly, other people play a fundamental role in legitimating our selves -- as we are now and might possibly become -- especially those with some power over us, but also sometimes those peers who provide consensus information about similar experiences.

Social group memberships, particularly those organized around gender, race/ethnicity, religion, sexuality, disability, and nationality, constitute critical influences in most cultures on both the matrix of opportunities and constraints and the input received from others. Of course, individuals personalize their social identities (contrary to an essentialist view of identity politics), by accepting or rejecting group-based constraints and feedback, but nevertheless, their impact is pervasive.

Claude Steele’s elegant demonstrations of stereotype vulnerability document the pervasiveness of these group-based dynamics. For example, as he has shown in laboratory experiments at Stanford, the performance of high achieving women students, including those who consider themselves as analytically smart, can be undermined by simply and subtly invoking gender stereotypes with an off-hand comment about the test measuring analytic ability. There is nothing overt or “in your face” about these experimental manipulations, and certainly nothing that should override a student’s own acknowledged individual performance history. Yet, it is hard to act as an individual, when the “group” lurks in the background.

And beyond the laboratory, our groups often don’t just lurk quietly in the background. This is a media culture in which there is relatively constant attention to and (perhaps inadvertent) promotion of group-based stereotypes of all sorts, in the sports and entertainment arenas, in politics, and, yes, even in the academy. Consider, for example, the flood of media coverage after Larry Summers questioned the capacity of women and girls to be stars in science and mathematics. Even, as in his case, when the marketing of group-based stereotypes comes unintentionally, those who are “marked” by highly visible and/or contested identities find them hard to ignore. Few women scientists had a choice of whether to be scrutinized under those conditions -- their individuality was swept into a tidal pool of issues defined by their “group.”  

“Insiders” and “Outsiders” and the Social Embedding of Individuality

However, the social embedding of individuality varies importantly as a function of the “location” of one’s significant groups -- with respect to status, security, and power -- in a particular community. Those whose groups are less well-entrenched in a community -- “outsiders”  -- will be more marked by and connected to their group(s) than will “insiders.” By contrast “insiders” operate more easily as “individuals” and feel both less connection to and less identified by their groups.

In turn, this different psychology of insiders and outsiders is readily apparent in different attitudes toward communal responsibility in a diverse and multicultural community. That is, as insiders, we take a great deal, cognitively and socially, for granted in daily life. We engage in cognitive egocentrism, using, for example, our own experience and assumptions as a road-map for making judgments about others, rarely taking into account that they may be operating with a different matrix of opportunities and constraints, and with less of a sense of individuality.

Most specifically, we underplay the level of scrutiny and constraint that is felt by an outsider when his or her group is even subtly or minimally invoked, not to mention derided. The degree to which outsiders’ identities are wrapped up in their group(s) seems almost irrational to an insider, prompting them to question the authenticity of outsider reactions. Frequently, for example, an outsider will be described as “over-reacting,” or being too “pc.” It is extremely difficult for an insider to imagine their individuality so intertwined with their group(s). They simply don’t live a life of “guilt by group association,” and so they are skeptical of and not particularly empathetic to those who do. In turn, by failing to recognize these constraints on individuality and on the freedom to dissociate from the group, insiders miss a lot about the social life of outsiders, and this is a critical impediment to interpersonal trust.

By contrast, the psychology of the insider, at least with respect to his or her “visible” groups -- such as race or ethnicity or gender -- is much less explicit or “marked.” For the insider, groups are more about voluntary association, such that they can be held at “arm’s length,” especially if something goes wrong. Since, as insiders, we each view ourselves largely as individual actors, it is relatively easy, in good conscience, to distance from the group’s mistakes or the culture of an organization. There is little or no “guilt by group association.” Others may have made a mistake, but “if I didn’t touch it, I didn’t do anything.” Moreover, the insider remains ever on guard against any ill-informed accusations that would implicate him or her in some unfair guilt by association with the (mistakes of others in the) group.

This psychology is, of course, perfectly rational and fair from an individualistic perspective, but not terribly good for building a community in which only some people feel disproportionately “marked” by their groups, unable to just walk away. Surely, we all want to avoid unfair individual blame, but at the same time we should feel some communal responsibility when an organization or group to which we belong ends up hurting others. This should be the case even when no harm was intended and you can’t imagine why they are hurt. This “arm’s length” relationship to group behavior is another critical impediment to facilitating a broad sense of fairness and interdependence in a diverse community.

“Epistemic Privilege” of the Outsider

While the insider’s gaze is generally away from the group, the outsider instead looks right at it with what Satya Mohanty and others refer to as the “epistemic privilege of the oppressed.” Outsiders typically see how their group marks them, and how therefore social location matters for what they can do and how they can expect to be treated. Largely, this clarity of vision comes from being in a perpetual state of guardedness and uncertainty, examining the social landscape, always prepared for some group-based challenge.

By contrast, the challenges faced as an insider come less routinely, and relate more to individual comparisons or interactions, one on one, with peers, competitors, idols, and the like.  What insiders rarely face head on is some group-based challenge -- direct or subtle -- that they see as constraining who they are or what they (as individuals) can do.

In other words, the outsider lives with the discomfort of epistemic privilege and the insider lives with the comfort of cognitive egocentrism, often oblivious to the effects of social location on others. And, the epistemic privilege of the outsider does not raise the probability of being heard by the insider.

The outsider always has a “theory” about social location in need of some validation. Like any theory, there are multiple avenues for validation. The outsider can spend time with other group members, sharing experiences and insights that provide some validation by consensus. Many of us remember the “consciousness raising” groups of the women’s movement as just such experiences. And we see powerful examples of the importance of consensus information in group affirmation all the time, including, for example, the social support that junior faculty give each other, the importance of professional identity group organizations (such as black journalists or women engineers), and the theme houses on college campuses.

These consensus-building experiences are very important and should never be under-estimated as part of the constructive role that groups can play when we take them seriously. However, precisely because the insiders in the community will likely remain blind to or skeptical of the conclusions of such discussions, other avenues of validation are needed. The outsider needs to be heard beyond the group, and the insider needs to listen to other groups.

How do we create a context for such inter-group dialogue in which the guardedness of the outsider can lessen and the insider can go beyond the egocentrism of individuality? As insiders, we each can listen -- and move toward communal responsibility -- when we get past an individualized framework to see the powerful role of groups in social life. When insiders begin to acknowledge that outsiders have little or no choice but to be seen through their groups then suspicion often evaporates, and the potential for collaboration and community grows. This is when multicultural education is at its best, and when colleges and universities can play a very constructive role in turning the tables of epistemic privilege.

In this regard, it is worth repeating that contrary to an essentialist version of identity-politics, we are all both insiders and outsiders in our lives. That is, the experiences of group-based vulnerability, on one hand, and individuality, on the other, are shared, even if they are distributed differently for different groups or individuals. This is not to say that some dimensions of social organization, such as race/ethnicity or gender in our society, don’t powerfully tip the scale toward constraint over opportunity, group over individual. It is simply to say that the ground is ripe, even for those frequently on the inside, to engage attention to social inequality, in part by turning the tables on whose insights matter and who is listening.

Giving Voice to Outsiders and Asking Insiders to Listen

But, how do we do this in the midst of inter-group competition and suspicion? How do we do it when our campuses and our communities more broadly are quite divided, with many insiders and outsiders, and two strikingly different psychologies about group life?

I would point to two types of multicultural “projects” that can help bridge these two psychologies, while also creating more educational opportunity and more scholarly innovations that matter to the world. One project is internally-focused on constructing opportunities for intra- and inter-group dialogue that capitalize on the relevance of group-based vulnerabilities for virtually everyone. The other project is outwardly focused on connecting the campus -- and its diverse group of scholars and students -- to our broader communities, capitalizing in that case on faculty interest in public scholarship and students’ interests in volunteerism. In each project, however, the central ingredient to success will be to take multicultural groups seriously, unpacking rather than covering up disparities in voice and opportunity and building communal responsibility.

As to the “internal” project of facilitating intra- and inter-group dialogue that addresses social inequalities head on, this work is, of course, at the core of the expertise of those gathered here and central to the agenda of the Future of Minorities Studies. In this work -- and I would point to the curriculum developed at the University of Michigan by Patricia Gurin and her colleagues as a prototype -- there is a commitment to exposing inter-group inequality through group-based experiences that individuals can share. So, for example, women in a dialogue on gender might find consensus support for their experience of not always being listened to by men. At the same time, the men in the group might begin to listen to these observations and take them seriously, even if they believe there was no “intent” to discriminate. Sometimes, the tables turn in a dialogue, so that the experience of being “marked” by one’s group can be felt even by those who more often than not operate with more individuality in their lives. These moments of “epistemic privilege” for the insider -- when our own group-based vulnerability intersects with the consensually expressed views of the outsiders -- can make us more receptive to seeing the situation of outsiders in a new and more empathic light. When the tables turn, common ground, respect and shared responsibility emerge.

At that point, it is also critical to relate these personal experiences to the pervasive social inequalities that attach to some groups -- and therefore to their members -- in particularly powerful ways in our society, and therefore also on our campuses. Through this mixture of the personal and the general, in narratives and in empirical work, it is possible to begin to unpack how for some people, there is often “guilt by group association,” whereas for others, communal responsibility is easy to keep at “arm’s length.”

To make a real difference, however, these dialogues on the power of groups and the effects of social location -- the different psychologies of outsiders and insiders -- must reach far across a campus. While there is little doubt that some group-based vulnerabilities are more pernicious and pervasive than others -- and certainly race and ethnicity, gender, sexuality, and disability fall in this category -- the framework here can be applied broadly and in helpful ways. Many campuses, for example, worry about the kinds of mentoring given to their junior faculty -- in whom they have a substantial investment for the future. I would suggest that this same analysis can be applied constructively to the experiences of untenured versus tenured faculty, especially if one also considers the issues confronting women and junior faculty of color. Taking this approach one step further, I believe that academic leaders -- including chancellors, deans and department chairs -- can profit from a better understanding of the outsider experiences of particular groups of faculty, staff, and students, and of particular disciplines, such as minority studies. It is not at all uncommon on campuses to see the tell-tale signs of insiders and outsiders, each with “good intentions,” talking past each other -- operating with different expectations from different psychologies. We can do something about this if we take on this multicultural campus project.

Connecting to Communities and Turning the Epistemic Table

The complementary project that I see for universities is an external one, in which we forge outward-looking connections to diverse communities, working on the pressing issues of our times -- from failing schools to environmental degradation to inter-religious conflict.

When universities start collaborating with their connected communities (at home and abroad) on the most pressing issues of the day, I have seen the tables turn in ways that benefit both our innovations and the quality of our multicultural community. Why does this happen? I believe the answer lies first in the nature of the problems to be solved now and the connected question of who becomes the expert. It is hard, for example, to make progress on environmental sustainability in an urban ecosystem without addressing questions of environmental justice -- and whose voice do we need to listen to in that case? How do we tackle the urban epidemic of diabetes, even if we develop a better understanding through genomics of the disease itself, without contextualizing its spread within the broader questions of race disparities in health? Wouldn’t we understand the genesis of inter-religious conflict better if we engaged with refugee communities in our own cities and towns? It is virtually impossible to find a problem of major importance to our society in which the insights of a diverse, multicultural community would not be very valuable to the solutions.

Additionally, there is a growing cadre of faculty -- including many women and faculty of color -- extending well beyond the social sciences into the arts, humanities, sciences and professions, who are increasingly doing scholarly work that matters to communities. This engagement can also capitalize on the robust presence of service-learning curricula and volunteerism on campuses. Oddly, interest in service-learning and volunteerism is very high, despite the individualism and detachment, even communal “irresponsibility,” that I described earlier. This engagement of students and faculty in community-based work, and work around the world, can provide a launching pad for sustained attention to questions of social inequality and multicultural community.

It also does something else dramatic. It turns the tables on who has voice, and who can benefit by listening. It reverses roles and the epistemic privilege -- perhaps even its enlightening discomfort -- spreads to a different set of actors. As George Sanchez has suggested, those who often feel relegated to the outside of our campus communities, such as faculty and students of color, emerge with more expertise and authentic voice in this agenda, as they often begin with more “standing” in the surrounding community and on the issues at hand. The social/academic landscape begins to change when the insights of outsiders -- either from the community outside or on the academic margins -- begin to be heard.

This reversal of perspective (or social location) not only prepares everyone for doing the work of the nation, but, just as importantly, it shines some light on inequality. It shows both the strength of diverse groups and cultures and the constraints on them. In turn, this is a lesson with powerful ramifications back on campus. As we engage with our communities, we also recognize the stresses of the broader world as they are “brought to” the campus, and then feel some fundamental responsibility to address them as part of building a productive campus community.

Rewarding Scholarship in Action

And when we take that responsibility seriously, then new scholarly and educational vistas open too. At Syracuse, for example, our academic vision is based on the notion of "Scholarship in Action," where interdisciplinary teams of faculty and students engage with communities of experts on issues that matter, such as disabilities, shrinking cities, failing schools, neighborhood entrepreneurship, religious pluralism, or environmental sustainability and the urban ecosystem.

These collaborations, like our Partnership for Better Education with the Syracuse City Schools, create a shared mission that breaks down barriers between campus and community, embeds the traditional diversity agenda within the academic work of the institution, and in turn embeds that work in the public good.

To make the Scholarship in Action agenda work, however, we must change our reward structure for faculty who do this collaborative work. We must, for example, support faculty members who want to do public scholarship, with results that may be published in academic, peer-reviewed journals, but may also result in network news specials, digital modules for public libraries, or museum exhibitions. We must find the right incentives for a diverse faculty to engage with communities of experts on innovation that matters, and to that end, many institutions, including Syracuse, are re-evaluating their tenure and promotion criteria. A tenure-team initiative, organized by Imagining America: Artists and Scholars in Public Life, a 70-institution consortium, is gathering best practices on how to promote standards of excellence in public scholarship. Momentum is growing to take public scholarship seriously.

In my view, investing in excellence in public scholarship in our multicultural communities is a pathway toward bringing questions of diversity and diverse students and faculty from the margins of our institution to the center. As we work on innovation that matters -- from the science needed to remediate environmental pollution in our cities and waterways to the art that gives voice to refugees resettling in America -- we learn to value diversity and the insights of diverse others. We also learn to listen harder to each other, dropping a bit of the egocentric covering of our own positions. We see the observations of our peers and colleagues within the broader social landscape in which they are shaped, and we take more responsibility for changing that landscape. We come to see that multicultural progress will be shared, but only if we also take groups seriously.

Multiculturalism, Universalism, and the Lessons of Citizenship

At the end of the day, the hope of these two kinds of projects -- internal multicultural dialogue and external multicultural collaboration -- is that we all come to value diverse groups, not just diverse individuals. We will do this by expanding the lesson of citizenship from one purely about individual rights to one about connectivity and responsibility -- and the social embedding of individuality. We’ll learn that we are all in this together, and we can’t just make creating opportunity someone else’s project. If this works, then I believe that, at least in this regard, presidents will sleep at night, and, more importantly, universities will make a difference in promoting social justice and universal human rights. 


Nancy Cantor is chancellor of Syracuse University. Her keynote address in full is available online. A video of the address is available on the institute's Web site.

The Good, the Bad, and the Ugly

The last episode of the HBO series "Deadwood" ran on Sunday evening, bringing to an end one of the most unusual and absorbing experiments in historical storytelling ever attempted on the small screen. The network’s decision not to continue the program is understandable (it was very expensive to film) if by no means easy to forgive.

Set in a mining camp in the Dakota Territory during the late 1870s, "Deadwood" belongs to the sub-genre of the “revisionist Western” -- a skeptical retelling of how the frontier was settled, one grittier and less prone to melodrama than B-movie versions. Among the people finding their way into town are historical figures who have long since become part of the Western mythology: Wild Bill Hickok, Calamity Jane, the brothers Earp. Most of the other major characters can also be found in chronicles of the real-life town of Deadwood.

A few others were imagined into existence by David Milch, the show’s creator -- but not quite ex nihilo. I’m pretty sure that Alma Garrett, the genteel widow who sets up Deadwood’s bank, wandered into the show from one of Henry James’s notebooks. The refined sociopath Francis Wolcott -- the (fictional) geologist employed by the (very real) mining tycoon George Hearst -- might well have felt at home in William S. Burroughs’s transgressive Western novel The Place of Dead Roads.

And while the unctuous hotel proprietor and mayor E. B. Farnum is based on an actual person who lived in the South Dakota town, he also comes by way of Charles Dickens. E.B. is the American cousin of Uriah Heep, if ever there were one.

Such literary allusions might all exist solely in my imagination, of course. But probably not. Milch, the show’s executive producer and head writer, was a student of Robert Penn Warren and Cleanth Brooks at Yale University in the 1960s. Interviews reveal someone whose mind turns easily to questions of literary form and verbal texture.

The scripts Milch has written for television -- in the early years of "NYPD Blue," for example -- exhibit an interest in how a group of people who live and work together create an argot capable of infinite subtleties of inflection, depending on the circumstance. His years around Warren and Brooks (founding fathers of the old-fashioned New Criticism) must have drilled into Milch the idea that literary works are characterized by irony, tension, and paradox. He seems to have taken this insight to the next step -- listening for how those formal principles can shape the rhythms of ordinary conversation.  

With "Deadwood," the Milchian penchant for conveying the stylization of speech broke new ground -- thanks to HBO’s freedom from the conventional restraints of broadcast television. The characters delivered intricate arias of Victorian syntax and repetitive obscenity. It sounded like some hitherto unimaginable blend of Walter Pater and gangster rap. It was often exhilarating, if sometimes farfetched. You felt awe at the power of the actors to memorize their lines, let alone speak them. The combination lent itself to parody, but it is difficult to imagine its like ever being heard on television again.

Milch’s tendency toward stylization bothered some people, who found it mannered and arch. I don’t agree, but will leave the show’s defense in more capable hands. Instead, let me use this chance to discuss another element of the language of "Deadwood" that has passed largely without comment, although it usually proves far more bracing than the familiar obscenities.

I mean the epithets. The women who work in the saloons of Deadwood are called “whores.” Nobody blinks at the word, least of all the women so addressed. The Sioux Indians are more often referred to as “dirt worshippers.” The town’s Chinese laborers live in “Chink Alley.” One of the owners of the hardware store is the entrepreneur Sol Star, better known simply as “the Jew.” (He teaches his girlfriend Trixie, a former whore, how to do bookkeeping. In moments of frustration she calls it a “Jew skill.”) A black drifter arrives in town wearing an old Civil War uniform. If he has a given name, it isn’t mentioned twice. Everyone refers to him as the Nigger General -- in part, because that is what he calls himself.

Often enough the words are used as weapons. But sometimes the insults flow so casually that the offense barely has time to register. And there are moments when they carry no more charge than a “damn” would. It is all a matter of context.

But it is a context in which racism, for example, is naked and unashamed. "Deadwood" takes this for granted as a fact about the world it is presenting -- a reality scarcely more worthy of comment than the mud in the streets.

One citizen of Deadwood in particular is prone to loud and resentment-fueled tirades about the honor that is due him as a white man. You see that most other characters find him disgusting. But that isn’t a matter of his attitudes, so much as his demeanor. After all, he is universally known as Steve the Drunk.

The language proves jarring -- for the television audience, anyway -- precisely because it is treated as ordinary. The charge of symbolic violence can be taken for granted, just like the fistfight taking place out in the thoroughfare at two in the morning. Its cumulative effect is powerful and eye-opening. (Or maybe “ear-opening,” rather.)

While reading Eric Rauchway’s new book Blessed Among Nations -- the subject of last week’s column -- I found that the ambience of "Deadwood" was almost always at the back of my mind. But only after interviewing Rauchway did it occur to me to ask if he watched the program. Not surprisingly, he did. I asked if he had any thoughts on the show, now that it was winding down.

“There's an overall story arc of the transition from wilderness to civilization,” he responded, “and the major plot lines have to do with the circumstances under which civic institutions evolve. But it's not Frederick Jackson Turner's frontier -- or if it is, it's a decidedly modified Turner.”

It might be worth mentioning here that, of all the historians of the Progressive era, Turner has probably had the most contradictory posthumous career. It’s been a while since any scholar wholeheartedly endorsed his thesis about the closing of the American frontier. But it remains a landmark -- if only the kind used by later generations for target practice -- and I doubt a non-historian can watch "Deadwood" for very long without reinventing some approximation of Turner’s notion that the national character was shaped down to its cells by the Western edge of expansion.

Anyway, as Rauchway was saying, before I so digressively interrupted....     

“There's some evidence that [the show’s characters] are safety-valve types. They're people who say, as Ellsworth does, that they might have ‘fucked up their lives flatter than hammered shit, but they're beholden to no human cocksucker’.... But they're not, Turner-style, out there to get an opportunity to civilize themselves. Which is to say, they don't go West because only there can they get a patch of land and settle, Jeffersonian-like, into civilization.”

Rather, people finding their way to the mining town are looking for a new start -- often because the economy has destroyed their other options.

“In several conversations on 'Deadwood',” notes Rauchway, “we've been told that these people have bumped into each other in other boom towns, before those booms went bust, and now their predilections have brought them here. And we can infer that soon they'll move on again. If they're the advance agents of civilization, they're doing that work unwillingly.”

And the civilization they create reflects that restlessness. The first two of "Deadwood"’s three seasons told a story about people slowly -- almost unwittingly -- establishing a social contract. A swarm of disconnected and sometimes violent individuals created a rough semblance of order (with the emphasis on “rough”). It was not so much a matter of coming to trust one another as of learning the limited utility of constant suspicion and fear.

This past season led up to the town’s first election -- an initial step toward the eventual incorporation of the territory into the United States, proper. But that bit of progress only comes at the cost of sacrifice: the destruction of that order we have watched grow over time. A new regime emerges, now under the control of a consolidated mining operation.

The final image of the series really did sum it up perfectly. It shows a man on his knees, scrubbing a pool of blood off the wooden floor.

Another character, Johnny, has just asked for some reassuring words about the event that led to the giant stain. Johnny leaves, and the man with the brush gets back to work. "Wants me to tell him something pretty," he says.

It's not a rebuke, exactly -- just a reminder that, as someone once put it, every document of civilization is also a document of barbarism.

Scott McLemee

YouTube and the Cultural Studies Classroom

"I saw a small iridescent sphere of almost unbearable brightness. At first I thought it was spinning; then I realized that the movement was an illusion produced by the dizzying spectacles inside it."
                  --Jorge Luis Borges, "The Aleph"

On December 17, 2005, “Saturday Night Live” ran a skit by Chris Parnell and Andy Samberg called "Lazy Sunday," a rap video about going out on a "lazy Sunday" to see The Chronicles of Narnia and procuring some cupcakes with "bomb frostings" from the Magnolia Bakery in New York City. The rap touches on the logistics of getting to the theater on the Upper West Side: "Let's hit up Yahoo Maps to find the dopest route./ I prefer Mapquest!/ That's a good one too./ Google Maps is the best!/ True that! Double true!/ 68th and Broadway./ Step on it, sucka!"

Parnell and Samberg make it to the Magnolia for their cupcakes, go to a deli for more treats, and hide their junk food in a backpack for smuggling past movie security. They complain about the high movie prices at the box office ("You can call us Aaron Burr from the way we're dropping Hamiltons") and brag about participating in the pre-movie trivia quiz. Doesn't seem like much if you've never seen it, but for pure joie de vivre and white suburban dorkiness, "Lazy Sunday" just can't be beat. What makes "Lazy Sunday" special, however, is how its original airing coincided with the birth of Internet video-sharing, enabling the two-minute clip to be viewed millions of times on YouTube, a free service that hosts videos posted by users. In fact, the popularity of the clip on YouTube was so great that NBC forced the site to remove it several months later, citing copyright infringement. The prospect of its programming being net-jacked by Internet geeks and magnified through YouTube's powerful interface was just too much for NBC.

I bring up "Lazy Sunday" to foreground my discussion of the pedagogical uses of YouTube because it sums up its spirit and helps us define the genre of video with which YouTube is most associated. Although YouTube is awash in clips from television and film, the sui generis YouTube video is the product of collaborative "lazy Sunday" moments when pals film each other or perform for the camera doing inane things like dancing, lip synching or making bottles of Diet Coke become volcanic after dropping Mentos candies in them.

Parnell and Samberg's references to Internet tools and movie trivia, as well as their parody of rap, perfectly capture a zeitgeist in which all pleasures can be recreated, reinvented and repeated ad nauseam through the magic of the Web. As Sam Anderson describes it in Slate, YouTube is "an incoherent, totally chaotic accretion of amateurism -- pure webcam footage of the collective unconscious." Whatever you're looking for (except porn) can be found in this Borgesian hall of mirrors: videos of puppies, UFO footage, ghosts on film, musical memento mori about recently deceased celebrities, movie and documentary clips, real and faux video diaries, virtuoso guitar picking performances and all kinds of amateur films. In my case, the video that sold me on YouTube was "Where the Hell is Matt Harding Dancing Now?" -- a strangely uplifting video of a guy called Matt Harding who traveled around the world and danced in front of landmarks such as Machu Picchu in Peru, Area 51 in the U.S., the head-shaped monoliths of Easter Island, and the Great Wall of China, among many others.

OK, that's all nice, but what can YouTube do for professors, apart from giving them something to look at during their lunch breaks? Inside Higher Ed has reported on the ways in which YouTube is causing consternation among academics because it is being used by students to stage moments of guerilla theater in the classroom, record lectures without permission and ridicule their professors. Indeed, a search on YouTube for videos of professors can bring up disquieting clips of faculty behaving strangely in front of their students, like the professor who coolly walks over to a student who answers a ringing cell phone in class, politely asks for the device, and then violently smashes it on the floor before continuing on with his lecture as if nothing had happened. It could be staged (authenticity is more often than not a fiction on YouTube) but it is still disturbing.

But I would like to argue for an altogether different take on YouTube, one centered on the ways in which this medium can enrich the learning experience of college students by providing video realia to accompany their textbooks, in-class documentaries and course lectures. Although I can't speak to the applicability of YouTube to every discipline, in what follows I make a case for how the service can be harnessed by professors in the humanities and social sciences.

As a professor of Latin American literature and culture, I often teach an introductory, third-year course called Latin American Culture and Civilization in which students study history, literature and any other media that the instructor wishes to include in the course, such as music, film, comics and the visual arts. My version of the course emphasizes student engagement with foundational documents and writings that span all periods of Latin American history and that I have annotated for student use. One of the figures we study is President Hugo Chávez of Venezuela, whose outsized political persona has made him a YouTube star. Apart from having my students watch an excerpt of his "Bush as sulfurous devil" speech at the United Nations, I assigned a series of animated cartoons prepared by the Venezuelan state to educate children about the Bolivarian constitution championed by Chávez. These cartoons allow students to see the ways in which the legacy of the 19th-century Venezuelan Liberator, Simón Bolívar, remains alive today.

The textual richness of these cartoons invites students to visually experience Bolivarian nationalism in a way that cannot be otherwise recreated in the classroom. It invites them to think critically about the ways in which icons such as Bolívar are creatively utilized to instill patriotism in children. In a similar vein, a Cuban cartoon about Cuba's founding father, José Martí, depicts how a child is transformed into the future champion of independence and social justice when he witnesses the horrors of slavery (this video has now been removed from YouTube). With regard to the Mexican Revolution, one of the most important units of the class, YouTube offers some fascinating period film of the revolutionary icons Emiliano Zapata and Pancho Villa, and especially their deaths. Although I cannot say that these are visual texts that lend themselves to the kind of rich dialogue provoked by the aforementioned cartoons, they are nonetheless an engaging visual complement to readings, discussions and lectures.

Another course in which YouTube has played a part is my senior-level literature course on the Chilean Nobel Laureate Pablo Neruda. It may seem farfetched to use Internet video in a poetry class, but in this case, YouTube offers several useful media clips. I have utilized film clips in which Neruda's poetry appears (such as Patch Adams and Truly, Madly, Deeply), as well as music videos of Latin American singers who use lyrics by Neruda. More than anything that I could say in class, these videos illustrate the reach and enduring quality of Neruda's poetry in Latin American and North American culture. This said, there are a surprising number of student-produced videos about Neruda on YouTube that are cringe-worthy, the "Lazy Sunday" versions of the poet and his poetry. These are quite fascinating in and of themselves as instances in which young people use video to interpret and stage Neruda, in ways that might be set into dialogue with more literary and canonical constructions of his legacy, but I confess that I am not yet convinced of their pedagogical value.

In this regard, the case of Neruda is not so different from that of other literary figures, such as Emily Dickinson, Nathaniel Hawthorne and Robert Frost, who are also the subject of interesting home-made YouTube videos. What do we do, for example, with a Claymation film that recreates Frost's "The Road Not Taken"? I would argue that this film is interesting because it captures the banality of a certain canonical image or version of Robert Frost that is associated with self-congratulatory, folksy Hallmark Card moments.

There are all kinds of video with classroom potential on YouTube. Consider, for example, one of YouTube's greatest stars, Geriatric1927, a 79-year-old Englishman whose video diaries document his memories of World War II, as well as of other periods of English history. Then there are the Michel Foucault-Noam Chomsky debates, in which Foucault sketches out, in animated, subtitled conversation, the key arguments of seminal works such as Discipline and Punish. There's an excellent short slide show of period caricatures of Leon Trotsky, news reels and lectures about the Spanish Civil War, rare footage of Woody Guthrie performing, Malcolm X at the University of Oxford, clips of Chicana activist Dolores Huerta discussing immigration reform and a peculiar musical montage, in reverse, about Che Guevara, beginning with images and reels of his death and ending with footage of him as a child.

Don't let me tell you what you can find; seek and ye shall receive.

YouTube is not necessary for good teaching, in the same way that wheeling a VCR into the classroom is not necessary, or bringing in PowerPoint slide shows with images, or audio recordings. YouTube simply makes more resources available to teachers than ever before, and allows for better classroom management. Rather than use up valuable time in class watching a film or video clips, such media can be assigned to students as homework in the same way that reading is assigned. However, to make it work, faculty should keep in mind that the best way to deliver this content is through a course blog. YouTube provides some simple code that bloggers can use to stream the videos on a blog, rather than having to watch them within the YouTube interface. This can be important because we may not want students to have to deal with advertisements or the obnoxious comments that many YouTube users leave on the more controversial video pages. On my free wordpress.com course blog, I can frame YouTube videos in a way that makes them look more professional and attractive (sample page here). At this point, course blogging is so easy that even the least technologically minded can learn how to use services like Blogger or WordPress to post syllabi, course notes and Internet media.

There are problems, however, the most glaring of which is the legality of streaming a clip that may infringe on copyright. If I am not responsible for illegally uploading a video of Malcolm X onto the Web, and yet I stream it from my course blog, am I complicit in infringing on someone's copyright? Now that Google has bought YouTube, and a more aggressive purging of copyright-protected works on the service has begun, will content useful for education dwindle over time? I don't have the answers to these urgent questions yet, but even in the worst of cases, we can assume that good, educational material will be made available, legally, on YouTube and other such services in the future, either for free or for a modest fee.

For example, I am confident that soon I will be able to tell my students that, in addition to buying One Hundred Years of Solitude for a class, they will have to purchase a $5 video interview with García Márquez off the World Wide Web and watch it at home. And, even as I write this, podcasting technologies are already in place that will allow faculty members to tell their students that most of their lectures will be available for free downloading on iTunes so that class time can be used more productively for interactive learning activities, such as group work and presentations. Unlike more static and limited media, like PowerPoint and the decorative course Web page, video and audio-sharing help professors be more creative and ambitious in the classroom.

In sum, my friends, YouTube is not just for memorializing lazy Sundays when you want to "mack on some cupcakes." It can help your students "mack" on knowledge.


Christopher Conway is associate professor of modern languages  and coordinator of the Spanish program at the University of Texas at Arlington, where he teaches Latin American literature and culture.

Beyond the Context of No Context

Well, so much for the instantaneous availability of information: I've only just learned about the death of George Trow, whose passing, almost two weeks ago, was noted among some of the blog entries (this one, for example) regularly channeled through my RSS feed. There is a bitter irony in this situation, and most of it is at my own expense.

Nobody was smarter than George Trow about the bad faith that comes with being "plugged in" to streams of randomized data. He once defined a TV program as "a little span of time made friendly by repetition." (Friendly, the way a con man is friendly.) That was long before most of us started spending ever more of our lives in front of another kind of screen.

Perhaps the name does not ring a bell.... George W.S. Trow, who was 63 when he died, can best be described as a minor American author (no insult intended, it's a better title than most of us will ever merit) who wrote fiction, essays, and the occasional screenplay. Two years ago, the University of Iowa published The Harvard Black Rock Forest, which first appeared in The New Yorker in 1984.

It was the kind of piece that people once had in mind (maybe with admiration and maybe not) when they thought of "a New Yorker article" -- stately in pacing, full of deep-background references, heedless of breaking-news type topicality. Iowa included the book in a series on literary nonfiction. That makes sense, but it's also been hailed by the journal Environmental History as something "every student of the history of conservation should read. Twice."

But it was another essay by Trow that really defined him as a writer to reckon with. "Within the Context of No Context" ran in The New Yorker in 1980 and was brought out the next year by Little, Brown as a book. It was reprinted by Atlantic Monthly Press in 1997 with a new introduction by Trow. He also published a kind of supplement to it, My Pilgrim's Progress: Media Studies 1950-1998 (Vintage, 1999). I say "supplement" and not "sequel" because the two books cross-connect in all sorts of odd, nonlinear ways.

Odd and nonlinear "Within the Context of No Context" itself certainly is. It is short, consisting of a number of brief sections. They range from a single sentence to several paragraphs, and each section has a title. While brief, the text actually takes a while to read. The relationships among the parts are oblique, and some of the prose has the strange feel that you would probably get from a translation of Schopenhauer done by Gertrude Stein.

"Within the Context of No Context" is about television, among other things -- about the history of the mass media, with television as its culminating moment, but also about what TV does to the very possibility of understanding the world as having a history. It is an essay in cultural criticism. But it can just as well be called a work of prose poetry. Trow's thoughts unfold, then draw back into themselves. This is very strange to watch.

After a quarter of a century, it may be difficult to appreciate the originality and insight of Trow's essay. He seems to be making points about the media that are now familiar to almost everyone. In 1980, though, they were not so obvious. It's not that he was venturing into futurology. Nor was Trow a sociologist or historian, except in the most ad hoc way. He did not offer theories or arguments, exactly, but took notes on the texture of American life following three decades of television.

He was describing long-emerging qualities of everyday experience that had been quietly taking over the entire culture. He assumed existing tendencies would continue and deepen. It was a smart bet, but a depressing one to win.

Trow's central intuition was that TV played a decisive role in shaping "the new scale of national life" in the United States following the Second World War. As the scion of a New York publishing family, Trow has various points to make about the shift of power from established WASP elites to the new professional-managerial class. (That social subtext is fleshed out with abundant and eccentric detail in My Pilgrim's Progress.) But those structural changes were occurring behind the scenes. Meanwhile, the national consciousness was changing.

Having won the war, the country was starting to come to terms with its own place in the world as an incredibly affluent society holding hitherto unimaginable military power. At the same time, we were starting to watch TV. We were starting to see the world through its eye. These two developments (a new level of power, a new kind of passivity) coincided in ways it was easy to overlook, just because the process was so ubiquitous and inescapable.

More than print or even radio ever had, television could address an audience of millions simultaneously. "It has other properties," he wrote, "but what television has to a dominant degree is a certain scale and the power to enforce it." And the medium's sense of scale was defined by two grids: "The grid of two hundred million," as Trow put it, "and the grid of intimacy."

Trow does not spell out in any detail what he means by "the middle distance" -- the regions of the culture left out of the TV "grids." But by implication, it seems to include most of what's usually called civil society: the institutions, meeting places, and forums for discussion through which people voluntarily associate.

His point isn't that the media completely avoid representing them, of course. But TV does not really encourage participation in them, either. Watching it is an atomized experience of being exposed to programs crafted to appeal to tens of millions of other people having the same experience.  

"Everything else fell into disuse," wrote Trow. "There was national life -- a shimmer of national life -- and intimate life. The distance between these two grids was very great. The distance was very frightening.... It followed that people were comfortable only with the language of intimacy."

And it has a cumulative effect. Not so much in the sense that TV destroys the mediating institutions of civil society -- you know, how there used to be bowling leagues, but now everybody is bowling alone. Rather, it's that the yearning for "mirages of pseudo-intimacy" (as Trow puts it) becomes a routine part of public life.

Off the top of my head, I do not remember 1980 well enough to recall what Trow might have had in mind, at the time. Perhaps it was the interviewing style of Barbara Walters, or Jimmy Carter admitting that he had lusted in his heart. Today we are far downstream. The "grid of two hundred million" has become the grid of three hundred million. And finesse at handling the routines of "pseudo-intimacy" now seems like a prerequisite for holding public office.

What you also find tucked away in Trow's gnomic sentences is the anticipation of countless thousands of broadcast hours in which people discuss personal problems before a vast audience of strangers. The media would create, he wrote, "space for mirages of pseudo-intimacy. It is in this space that celebrities dance. And since the dancing celebrities occupy no real space, there is room for other novel forms to take hold. Some of these are really very strange." No one had thought of "reality TV" when Trow wrote this. The idea of becoming famous by leaking videotapes of oneself having sex had not yet occurred to anybody.

What makes the essay powerful, still, is that the word "television" now tends to fade from view as you read. It serves as a synecdoche. It is a name for the whole culture.

"Television is dangerous," wrote Trow in one haunting passage, "because it operates according to an attention span that is childish but is cold. It simulates the warmth of a childish response but is cold. If it were completely successful in simulating the warmth of childish enthusiasm -- that is, if it were warm -- would that be better? It would be better only in a society that had agreed that childish warmth and spontaneity were equivalent to public virtue; that is, a society of children. What is a cold child? A sadist."
 
Over time, the media-nurtured attention span ceases to comprehend anything outside its own history. As Trow put it in the line giving his essay its title: "The work of television is to establish false contexts and to chronicle the unraveling of existing contexts; finally, to establish the context of no context and to chronicle it."

That seems much less like a Zen koan today than it did when first published. It now often feels as if the people making decisions in the media world were deliberately using Trow's work as a guidebook for what to do next: A program like "I Love the '90s" is a literal effort "to establish the context of no context and to chronicle it." (In another line already quoted, Trow anticipated a certain now-familiar tone of nostalgic hipster posturing: "It simulates the warmth of a childish response but is cold.")

My précis here is selective. It traces one or two strands woven into a very complex pattern. The most it can do is to encourage a few more readers to read Trow himself.

"George W. S. Trow is a sort of tragic hero," as the novelist Curtis White wrote in the best commentary on him I've seen. "His essays offer us clues to how we might correct our national life. But his wisdom is likely to be lost on us, even on those who would agree with him. Like Cassandra, he can tell us things that are true and that would save us if we could understand them, but his working premise seems to be: You will not understand what I am going to say. In fact, why we won't understand is a large part of the truth Trow has to tell us."

Yes, but that's why you find yourself reading him over and over.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com
