Anthropology

Travels in Weblogestan

Reading the scholarship devoted to the phenomenon of blogging sometimes calls to mind a comment that Jackie Gleason is said to have made about people who review TV programs -- that it's "like writing about a car wreck for an audience made up entirely of eyewitnesses."

Not that researchers shouldn't gather data from LiveJournal, or make connections between the blogosphere and Jürgen Habermas's The Structural Transformation of the Public Sphere, or whatever. But even the most impressive work tends to tell you something you already know, more or less.

For example: A recent number-crunching analysis of political blogging during the 2004 election demonstrated, among other things, that conservatives have created a dense online social network -- one with strong links among sites, that is, making them an effective medium for focusing on a particular topic or message.

Bloggers to the left, by contrast, have created a much less compact and efficient network. The tables and charts that the researchers prepared to demonstrate this are impressive enough. Even so, it all adds up to something slightly less incisive than an observation made, sooner or later, by anyone watching American political life: that there is an almost instinctive tendency on the part of self-identified "progressives" to cooperate just long enough to form a circular firing squad.

To be fair, the ideas and methods used in blog scholarship are sometimes more thought-provoking than the immediate results. That's especially true, it seems to me, with research using models of how networks emerge and function. (Then again, there is always something a little awesome about finding that "the unreasonable effectiveness of mathematics" that Eugene Wigner pointed out in the natural sciences also applies to human behavior en masse.)

But with an awful lot of work on the content and context of blogging, you have the Jackie Gleason effect in purest form. It isn't anybody's fault. The problem, arguably, is endemic to just about any kind of qualitative (that is, non-statistical) research on a new social phenomenon. In short: how do you get from offering a description to forming concepts? The conundrum may be even tougher with an emergent cultural form such as blogging -- one prone, that is, to incessant labors at self-definition, self-promotion, and self-mockery.

So what would a really interesting and exciting piece of qualitative research on blogging look like? And how would it get around the problems of overfamiliarity with the phenomenon (on the one hand) and blogospheric navel-gazing (on the other)?

To get an answer, it isn't necessary to speculate. Just read "The Vulgar Spirit of Blogging: On Language, Culture, and Power in Persian Weblogestan," by Alireza Doostdar, which appears in the current issue of American Anthropologist. A scanned copy is available here. The author is now at the Center for Middle Eastern Studies at Harvard University, where he will begin work on his Ph.D. in social anthropology and Middle Eastern studies.

"Weblogestan" is an Iranian online slang term for the realm of Persian-language blogs. (The time has definitely come for it to be adapted, and adopted, into Anglophone usage.) Over the last two years, Western journalists  have looked at blogging as part of the political and cultural ferment in Iran -- treating it, predictably enough, as a simple manifestation of the yearning for a more open society. Doostdar complicates this picture by looking at what we might call the borders of Veblogestan (to employ a closer transliteration of the term, as used specifically to name Iranian blogging).

In an unpublished manuscript he sent me last week, Doostdar provides a quick overview of the region's population: "There are roughly 65,000 active blogs in Veblogestan," he writes, "making Persian the fourth language for blogs after English, Portuguese, and French. The topics for blog entries include everything from personal diaries, expressions of spirituality, and works of experimental poetry and fiction to film criticism, sports commentary, social critique, and of course political analysis. Some bloggers focus on only one of these topics throughout the life of their blogs, while others write about a different topic in every new entry, or even deal with multiple topics within a single entry."

He notes that "a major factor in the widespread adoption of blogging" in Iran "has been the Unicode standard, which has made it possible for people to write and publish easily in the Persian script." Nor does it hurt that it is easy to set up a blog -- or to use a pseudonym. The result has been the creation of a medium that cuts across social and geographic boundaries. In his manuscript, Doostdar says that his work brought him into contact with "high school and university students, journalists, literary critics, Web designers, women's rights activists, and statesmen, living in Tehran, Toronto, Berlin, New York, London, Prague, and Paris, along with numerous other anonymous and half-anonymous bloggers scattered around the world."

Except for the part about writing Persian script using Unicode, this is a familiar picture of the blogging world. It is, in effect, a neighborhood within what Manuel Castells identified, some years back, as "the network society" -- a global "space without a place."  And Doostdar's account of the routine practices among Iranian bloggers will also ring a bell with their American cousins. There are group blogs, "trackback pings," comment fields, blogrolls, and even emoticons (the horror, the horror ;-).

At one level, then, it sounds like a new chapter in the worldwide spread of homogenizing mass media. The more globalization-friendly spin on this would be that blogging is a tool with which Iranians are creating a culture that challenges the fundamentalist social order.

Fortunately, Doostdar's work does not stick to either of these scripts. His paper in American Anthropologist looks at a controversy that raged during the final months of 2003 -- the bahs-e ebtezaal or "vulgarity debate," a heated discussion of the place of blogging in Iranian culture. On one side were members of the roshanfekr class -- meaning those writers and intellectuals possessing an "enlightened mind," but also a certain degree of education, sophistication, and social prestige. The term, writes Doostdar, "has historically come to represent one who is conversant with modernist or postmodernist discourses, is a humanist, feels a certain commitment toward the well-being of his or her own society, and continually and publicly [criticizes] the values, norms, and behaviors of that society."

There are members of the roshanfekr class who write for blogs, but they have other outlets as well, including newspapers and magazines. On the other side of the debate were Iranian bloggers who were "not intellectuals by social function or profession." The practice of blogrolling and cross-referencing allowed some of them to gain "popularity and a reputation within the community of bloggers."

But it was precisely the "focus on a contextual constitution of self" (with its attendant rituals of backscratching and mini-celebrity) that made blogging a venue for "a radically different set of priorities" from those of "the more 'noble' genres of traditional journalism and literary composition" practiced by the roshanfekr class. "In blogging," writes Doostdar, "speed often takes precedence over thoroughness, outlandishness over rigor, and emotive self-expression over dispassionate analysis."

In October 2003, Seyyed Reza Shokrollahi, a prominent journalist and literary critic, referred to "the stink of vulgarity in Weblogestan" -- complaining about the spelling errors, sloppy language, and low argumentative standards prevailing among bloggers. And as a nice touch, he did this on his own blog. The effect, as Doostdar put it, was to unleash "a cacophony of blog entries, online magazine articles, comments, responses, and counterresponses that continued for several weeks."

Some of the non-roshanfekr who denounced "intellectualist pretense" appear to have taken extra care to make errors in spelling and grammar when they replied. (As Doostdar puts it, they tried to "metapragmatically index themselves as linguistic and cultural rebels by being deliberately careless.")

And you can feel the seething bitterness of one blogger who denounced a prominent journalist and short story writer: "Keep mistaking this place as a literary conference when others consider it to be an informal and safe place for chatting. Come sit down wearing a suit and tie and mock those who are wearing jeans."

The populist tone is familiar. Change the accent, and it wouldn't sound out of place on Rush Limbaugh's radio show. And yet the lines in the Iranian vulgarity debate were not drawn for the convenience of American pundits.

For one thing, it isn't the familiar story of democratic reformers versus fundamentalist mullahs. It's more complicated than that. The liberalizing influence of the roshanfekr intelligentsia, "although significant, is still small relative to the dominant traditionalist clergy," writes Doostdar. "Their strongest cultural and political leverage is most likely among academics and in the domain of print media..." Going online gives them "a much less restricted environment for publication and cultural-political action" -- but in a space where "just about anything can (and does) get published and there is no authority to enforce linguistic and cultural standards."

The result? Well, consider the case of Seyyed Reza Shokrollahi, who launched the initial salvo against "the stink of vulgarity in Weblogestan." Shortly afterward, he created a Web page with links to online editions of fiction that is censored in Iran. But according to Doostdar, some "charged that he wanted to stifle free speech" with his criticism of vulgarity, "and compared him to government censors."

On Thursday: Using a Soviet dissident theorist's work to think about the blogosphere. Also: Is there a "pious spirit of blogging" in Iran?

Scott McLemee (scott.mclemee@insidehighered.com)

Scott McLemee writes Intellectual Affairs on Tuesdays and Thursdays.

Lazy Me

The café, located one block west of the University of Texas at Austin, was called Les Amis. Which, when pronounced with a certain drawl, or after a few Shiner Bocks, sounded like “Lazy Me.” It was just around the corner from the house where, as legend has it, Janis Joplin had lived during her student days. It would be nice to imagine that she might have visited Les Amis, but that is a stretch: It only opened its doors in 1970, about the time she died. In principle, though, yes, it was her kind of place. The café was the definitive landmark of the area known as West Campus.

Like any real neighborhood, West Campus was not just a place but a state of mind. It was a pole of attraction for people who endured living in Texas only by going into a kind of internal emigration. And Les Amis was a vital part of the cultural infrastructure. For one thing, the food was cheap. Generations of semi-successful musicians, struggling artists, and would-be writers budgeted their lives around the bowl of beans and rice (with cheese, if you paid extra). And because the entire wait staff seemed to be on perpetual cigarette break, it never felt like you were being rushed out the door. You could read a book without being hassled. Or write one, for that matter.

It was not, strictly speaking, a part of UT. But in no sense did it exist apart from the university. Les Amis was, so to speak, a non-academic outcropping of what Pierre Bourdieu called skhole -- that is, the open-ended space-time of scholastic life, in which questions can be raised and explored in a free discussion that evades any outside demands.

It's not that Les Amis was a unique outpost of this. On the contrary, any decent campus must have such pockets of creative impracticality -- places where people mingle, where loitering is permitted, even encouraged. They are the laboratories where conversation becomes a kind of experiment, and where you can opt out of normality for a while. (Maybe forever.)

Many a dissertation got drafted at Les Amis, or at least studiously procrastinated over. In the corner sat Rock Savage (at any given time, the drummer for five different bands) having breakfast at two in the afternoon, inscrutable behind his shades. At a table facing the sidewalk -- a few feet from the skinheads with skateboards -- you might find a Habermasian and a Derridean making an elaborate show of tolerating one another’s pathetically inadequate arguments. Meanwhile, inside, the ambience was forever that of an unwashed ashtray. In the booth near the door was a young couple that had recently broken up, doing a post-postmortem on their relationship, in lieu of burying it and moving on.

Such a life has its own tempo, its own logic. It can be liberating, but it can also be stultifying. You might leave it with a sense of relief -- only to find, years later, that a moment of nostalgia will blindside you.

I left Austin in 1988. Ten years later, meeting a fellow Les Amis alum at a party in Washington, D.C., I learned that the café had gone out of business in 1997. It was a shock to hear this: In some vague way, I always expected to return one day for a visit. There had been something sustaining about the fantasy of once again ordering the rice and beans with cheese, and the pot of coffee they brought to your table -- then spending the afternoon trying to get the bill from the waiter.

Now that was impossible. I felt grief, but also disgust and anger. There was no imaginary escape route from a life of ambition, responsibility, and deadlines.

And things got worse. The spot where Les Amis had once stood, my friend reported, was now occupied by a Starbucks. Fate was really laying it on thick.

But thanks to the efforts of Nancy Higgins, a young filmmaker in Austin, some of the memories have been preserved in a documentary, "Viva Les Amis." For now, the film doesn’t have a distributor, though it is available for purchase on DVD through a Web site. Higgins also indicates, in an e-mail note, that she is selling it “out of the trunk of my car.”

I can well believe she means that literally: "Viva Les Amis" has the feel of a labor of love – something made without much thought for whether it could be marketed. Higgins spent four years and something close to $40,000 making it. “No one ever got paid for their time,” she says, “including me and the people who helped me shoot it.”

Breaking even on the project would be nice, but it may take a while. “I have debt from the movie,” Higgins told me. “But I’ll pay it off someday.”

For now, though, she has earned the glory that goes with retrieving something valuable from the wreckage of progress, so called. Drawing on interviews with staff and customers, video footage shot during the 1980s, and a series of beautiful black-and-white photographs taken by Alan Pogue across the café’s three decades, Higgins evokes the feeling of community that, for many people around the university, made Les Amis a home away from home.

Almost literally so, in some cases. One of its denizens indicates that his record for hanging out there was 12 hours. Another interview subject recalls sleeping on the floor after it had closed for the night. The inner mysteries of Les Amis are revealed to outsiders. There was, for example, a walled-in area behind the café where the staff enjoyed beer, various smokable substances, and the occasional moment of fornication. (That would certainly tend to explain some things about the service.)

Higgins clearly has a feel for the place, so I asked her how she came to make the film. As a philosophy major at UT, she says, she spent a lot of time at the café, whether studying or otherwise. After graduating in 1994, she faced the perennial question of what you do with a liberal arts degree. You can probably see where this is going: For the next three years, she waited tables at Les Amis.

“My parents were so pleased,” she recalls (sarcasm mode on). “I took history and theory film classes at U.T., and read and did yoga and lived the relaxed Austin life. I stopped thinking about the future and just lived for awhile.”

She then headed to Emory University to do graduate work on avant garde and documentary film -- returning to Austin in 1999 with a master’s degree. After two years of watching other people’s films, she wanted to make one of her own.

“So,” she says, "I began working on a documentary about Les Amis -- a place that I missed terribly upon returning to Austin. Sometimes it's just the perfect night to go there, and you can't. That makes me really sad. I just wanted to preserve some of Les Amis before it disappeared from everyone's collective memory. I knew I wasn't the only one who grieved the loss.”

While various still photographs and home videos helped document the history of the café, nothing really captures the mood of the place like Richard Linklater's "Slacker," which was a breakthrough independent film when it was released in 1991. That film had the misfortune to get swept up in the whole "Generation X" phenomenon, which had the effect of making the quiet and idiosyncratic enclave of West Campus seem like some kind of prefabricated lifestyle.

A few of the most memorable scenes in "Slacker" were shot at Les Amis -- and Linklater gave Higgins permission to use those clips in her documentary. “He knows how hard it is to secure rights,” she says. “At the time I asked him, he owned the rights to "Slacker," so we signed the paperwork and he let me use the scenes.” Linklater also included an extensive promotional spot for "Viva Les Amis" when the DVD edition of "Slacker" was released.

Higgins has done more than put together a video scrapbook. "Viva Les Amis" is also an essay on development -- on how the texture of life changes when a small business disappears, replaced by a corporate chain.

She interviews employees who work at the Starbucks that now occupies the block. They’ve heard rumors that another coffee shop once existed in the area, but don’t know anything about it. Sic transit gloria mundi, of course -- no surprise there. But it is certainly striking to listen to the young baristas as they describe what it is like to work there: the exacting dress code, the precisely formulated rules for interacting with customers, the system of corporate spying that makes sure each drink is served at the same temperature.

You can’t imagine a poetry reading taking place in such an environment. No doubt it is more efficient and profitable than Les Amis ever was. But the drive to uniformity and perfect top-down control seems joyless, no matter how much Bob Marley they play over the loudspeaker. I kept thinking of a scene from "Slacker" in which a local eminence named Doug the Slug stared into a video camera, announcing: “Every commodity you produce is a piece of your own death!”

I haven’t been back to Austin, and wondered about the changes reflected in "Viva Les Amis." It seemed like a good time to reconnect with Michael King, who was an assistant professor of English at UT when I met him in the early 1980s. Today King is the news editor for The Austin Chronicle, the local alternative weekly.

He remembers long lunches and late nights at the café, “drinking and talking with students and friends or other faculty, talking in the way that only a college community can do. I miss it.” But the city has grown, and the university helped drive the transformation.

“Austin and UT were simply much smaller then,” he says. “Although 30,000 students were plenty, they did not overwhelm the UT area in the way that 50,000 do.... It meant that sidewalk life around the university was a little sleepier, a little friendlier.... West Campus in particular has just been overwhelmed by numbers. The high-rise private dorms pour out students, night and day, and the street crowds can seem like Manhattan, without any of the amenities.”

He points out that there are new venues with something of the old Les Amis feel, such as Ruta Maya or Café Mundi -- the latter, for example, being the scene of a recent literary reading/oil-wrestling contest. But such places have, he says, “been physically pushed away from the campus, which, close in, is very much a crowded, rushed, gritty place.” In "Viva Les Amis," Nancy Higgins interviews the proprietor of Café Mundi, who says she worries that Starbucks will decide to open a shop down the street.

So far, the documentary has not been screened at film festivals -- and Higgins says she can’t afford to apply to any more, because doing so is expensive. It seems like a film that will find its audience, over time. “I like the idea of taking it on a tour of campuses,” she told me, “mainly to college towns like Austin that may be experiencing similar growing pains. But I haven't had a chance to try that yet.” 

Scott McLemee (scott.mclemee@insidehighered.com)

Jewish in Polynesia

As a Jewish professor, I know that it is my lot in life to deal with stereotypes of Jewish academics. As a Jewish professor from California, dealing with these stereotypes is even more difficult because I lack recourse to the solution favored by many colleagues: acting as if the complex negotiation of my identity can be accomplished simply by assuming that "Jewish" means "from New York" and leaving it at that. As a Jewish professor from California who teaches in Hawaii, navigating my identity as a practicing Reform Jew, both in the classroom and out, has taken many surprising twists and turns.

Oxford University Press's Judaism: A Very Short Introduction notes astutely that Jews, like tomatoes, are not "particularly complicated or obscure when left to themselves, but they don't neatly fit into the handy categories such as fruit or vegetable or nation and religion which are so useful for pigeonholing other foods and people." Growing up in northern California, I went to a high school where the blanket term "Asian" was scrupulously decomposed into a wide variety of ethnicities, which included not just Chinese, Japanese, and Korean, but Hmong, Miao, Mien, Lao, Hongkie, Taiwanese, and so forth. When I got lunch at Vang's convenience store, my Thai friend grumbled about "those hill people." But for him, as for me, there was only one kind of white person: the white kind.

It was not until I moved to graduate school in Chicago that I realized that there were different kinds of white people. Growing up in Reagan's America, "Marxism" to me meant a critique of the soullessness of suburban life. Exploitation was not about class -- it was about Mexican-Anglo relations. While I understood that my religion made me different from most people, it didn't seem to make me any more distinctive than the guy in my class whose family had a time-share in Tahoe: I missed a few days of class for high holy days, he missed them for the time share. But living on the south side of Chicago, class became an inescapable fact of life, and "color" meant "black" and "white."

That I could understand. But I was particularly puzzled by religion as a source of social differentiation in America. I traveled to Minnesota and visited small towns, which featured intersections with churches on every corner. Why did the Missouri Synod Lutherans need one church and the Wisconsin Synod Lutherans need another? And how was all this related to the graduate student parties where bizarre passes would occasionally be made at me by women whose complex psychological relationship with novels like Portnoy's Complaint and Ravelstein had driven them out of their dairy-rich farming communities and into the arms of a cosmopolitan intellectualism which they expected me to embody?

My dissertation committee consisted of three Jewish structuralists and a Protestant interested in performativity. The Protestant member of my committee claimed that reading Kierkegaard's analysis of the sacrifice of Isaac through a Derridean lens could help explain nationalism in Indonesia, but this was the closest I actually got to Judaism as a religious phenomenon. Actually that is not true. At one point as I was driving in the car with one member of my committee, she pointed out a kosher butcher shop and told me that that was where another member of my committee went "for really good meat." But that was it -- my committee was alarmed when I suggested that Judaism was not actually synonymous with being an atheist intellectual, or even with knowing where to get a pound of lean pastrami.

I originally felt my move to Hawaii would be a sort of homecoming -- a return to the multicultural environment of my childhood and an end to the terrible, terrible cold I had suffered through in the Midwest. In fact I was in for a bit of a shock. Hawaii has a unique local culture derived from the state's legacy of plantation colonialism and its overthrow at the hands of a strong labor movement. As a result Hawaii owes much to the Japanese, Chinese, and Portuguese workers who moved here to cut cane. And of course there is the rich tradition of native Hawaiian culture, which has experienced a renaissance here in the past 30 years. Since the United States has long been the inheritor of Spanish colonialism in the Pacific, our islands are called home by increasing numbers of Chamorros and Filipinos. The growing number of migrants from Samoa and Tonga allows Hawaii to challenge Auckland as the unofficial capital of Polynesia. And there is no doubt that Honolulu -- the forward point for the projection of U.S. political and military power into the Pacific -- has long been a center for Micronesian migration.

Further, Hawaii has one of the highest rates of intermarriage in the country, and the place is remarkably cosmopolitan given its small size and distance from major centers. The result of all this is that my students are more likely to visit Saipan than Schenectady, and know more about Pago Pago than Paris. It soon became apparent that the welcome return to my natal relation to my Jewish identity was not to be had -- and for reasons more enduring than the fact that the Web site for my new shul in Honolulu was "shaloha.com."

A great deal of my Introduction to Cultural Anthropology course involves getting students to rethink ideas of race and ethnicity in light of the anthropological concept of culture. However, I feel very uncomfortable asking my students to objectify themselves in class by asking them "as an Asian, how do you feel about this?" or lecturing my African-American students about supposedly innate black athletic ability. On the mainland I solved this problem by objectifying myself and examining, for instance, stereotypes about Jews. In fact I typically use the tomato imagery from Oxford's Very Short Introduction to Judaism. "So," I said in my first class in the islands, dry erase marker in hand and ready to make a discussion-spurring list, "what are some stereotypes you have of Jewish people?"

Silence.

"Please," I say generously, "This classroom is a safe place where people can discuss controversial topics civilly, so don't feel you need to spare my feelings. So: what are some stereotypes of Jews?"

Silence. As a new professor I had read countless books and articles exhorting me not to get freaked out if it took a while for someone to say something. But this time I got nothing. Nada.

"How about the idea that Jews are good with money?" I finally offer.

"You mean like they're pake?" asked one confused woman, using a
Hawaiian term originally meant to describe Chinese people, but which in
local slang simply means tight-fisted.

In this and other classes I quickly came to realize that when it comes to Jews, the Hawaiian response to the question "is a tomato a vegetable or a fruit?" is to ask "what's a tomato?" In California, my identity as a Jew wasn't particularly relevant. In Honolulu, I am pretty much off the table insofar as the ethnic imagination of my students goes. All white people are haole -- a Hawaiian word with slightly derogatory connotations. (One of my students wears a T-shirt to class that reads "Haole you flew here I grew here.")

The problem was not just that my students didn't know I was a tomato; they were often a little unclear on the idea that people must be sorted into fruits and vegetables at all. To put it another way: it is difficult to expose the culturally contingent nature of your students' essentialist folk theories of identity when they have names like Motoko Kapualani da Silva or Brian Ka'imikaua Li. The latter student claimed to be "Japanese, Filipino, and Hawaiian." I pointed out to him that his last name was Chinese. He paused, thought about it for a second, and then remembered that yes, his family was also Chinese, but he had never really thought of "Li" as a Chinese name. By speaking frankly about my own identity with my students, I learned that they did not operate with the same concepts of race and ethnicity that my students on the mainland did, and this insight allowed me to teach anthropology in a way that was accessible to them.

Now when I teach my intro class I engage my students' expectations about ethnic difference by approaching some of the aporias of identity in Hawaii. How does the selective retention of taboos and purity laws by orthodox Jews provide a model of how (or how not!) to creatively innovate one's tradition? Why do we speak of Hawaii as a multicultural paradise when there is so much racial tension simmering under the surface? Is the distinction of "local" and "haole" one of race? Of class? What does it mean to talk about "ancient Hawaiian tradition" with a professor whose people lived in diaspora for a millennium before the first Hawaiians arrived in the islands? Why are haole tourists noisy, rude, and overbearing compared to locals? How can we use the concept of culture to render this comportment intelligible?

In sum, living in Hawaii has forced me to rethink not just my own Jewish heritage, but  the issue of heritage in general. As an anthropologist I find the challenge of working through these multiple layers of identity and (as we say in the business) “group affiliation” to be both professionally and personally rewarding. And most important of all, it provides my students a chance to grow intellectually by thinking about identity and belonging in ways they may not have before. In the next few semesters I plan to put together a class entitled "Kohen and Kahuna," based on an unpublished manuscript by a Rabbi and sociologist who studied at the University of Hawaii in the 1940s. We will not discuss Woody Allen films or where to get a pound of lean pastrami. We will discuss the comparative study of taboo, social complexity in Polynesia and the ancient Middle East, and anxiety about not being able to chant properly in your heritage language. Shaloha, everyone!

Alex Golub (info@insidehighered.com)

Alex Golub finished his dissertation in anthropology at the University of Chicago in 2005 and is now an adjunct professor at the University of Hawaii at Manoa. He blogs at Savage Minds, a group blog about cultural anthropology.

Christianity: You're Soaking in It

Inside Higher Ed's recent article on the prevalence of religion on campus came as no surprise to me. Although it has been about a month since the article came out, I still think that the time is ripe for me to make a confession: I actively incorporate the gospel of Christ into my teaching -- although not for the reason you might think.

I myself am not Christian -- as some readers may remember, I'm Jewish. I am, however, a passionate choral singer with an interest in music of the Baroque and Renaissance, and it is hard to find secular ensembles that perform this repertoire. As a result I spend a lot of time in church.

After I get done writing this, for instance, I'm off to my second gig of the week -- a compline service sung in candlelight featuring a procession (often with handbells), the complete chanted Anglican rite for the office of compline, two anthems, a nunc dimittis, the office hymn (we have over 25 settings of te lucis ante terminum), an orison, and a psalm. The group itself is quite small -- 10 men (countertenors sing the treble parts) who rehearse without accompaniment for 90 minutes before our performance.

I'm very proud that I have what it takes to sing in a group that operates at such a professional level. But singing in church means more to me than just pride in my musicianship -- it is part of the more general rapprochement I've had with Christianity. I went to a college where people were more likely to take acid than communion, and my last visit to the Bay Area involved both stopping at anarchist communes and picking up equipment to broadcast an illegal radio signal from our car while road-tripping up to Burning Man. So you see, attending church seems infinitely more transgressive of my social norms than does, say, running around naked in the middle of the desert while dosed to the gills on synthetic mescaline.

This generally lefty background, combined with my religious background, means that Christianity was something that I only ever heard about from my friends, whose lifestyles were an elaborate form of rebellion against it. As a result it's been a bit of a surprise to me to discover that the religion had any redeeming features at all. But in fact my time as a chorister has given me the opportunity to meet Christians whose faith has led them to lives of remarkable compassion, caring, and integrity and to appreciate the power and value that the Christian faith has for its followers.

The other thing that singing in church makes you realize about Christians is that they're everywhere. When it first hit me, this revelation filled me with the same shock that fills those ladies in the old Palmolive commercials -- I'm soaking in it! These "Palmolive" moments continued into the classroom, and it soon became apparent to me that many of my students -- who looked perfectly normal -- actually considered Jesus Christ to be their personal savior.

I quickly noticed that being a Jewish church musician gave me something in common with my Christian students -- indeed, in some strange way I knew more about their religion than they did. This is because I am, like all church musicians, a liturgy junkie. My students think of the celebration we just observed as "Halloween," while for me it was the 20th Sunday after Pentecost. Many of my students know the Lord's Prayer by heart, but few of them can spontaneously spout the text of the Magnificat in both English and Latin. And while I occasionally get a student who knows that Easter is somehow tied to Passover, I've never encountered one who knew that Pentecost was actually Shavuot.

My decision to begin incorporating the anthropology of Christianity into my classes was premised on the belief that, academically speaking, Christianity could be used to soften hands while I did the dishes. That is to say, I realized that I didn't just have to let the fact that my classes were saturated with Christianity go unremarked. Rather than simply soak in it, I could use it to further the goals of the class. Even the fact that my students came from diverse faith backgrounds within and without Christianity could be foregrounded as a way of asking students to think through and share with each other exactly what their beliefs were.

The key, of course, is that the stance we take on Christianity in class be distanced and yet respectful. While I may feel that I'm soaking in it, Christian students see themselves to be an embattled minority in an increasingly secular society full of professors who belittle their beliefs in lectures on evolution and secular humanism. Beating up on my Christian students for their faith in the name of cultural relativism is simply not effective anthropology.

So while I have a gimlet eye for some of Christianity's more incongruous beliefs, I am also someone who actively participates in the life of a Christian faith community -- I'm the guy who sings motets while everyone else takes communion -- engaged in participant-observation in the classically anthropological sense. This sense of being both insider and outsider helps, I believe, to reassure students that my analysis of Christianity is not meant to be a partisan exercise either for or against, but a demonstration of the power of social science to make taken-for-granted topics amenable to analysis.

The textbook in my "intro to anthro" class, for instance, has a chapter on the way symbolic action reinforces worldview through the use of compelling and culturally specific metaphors. It then takes examples of rituals from "other" cultures and demonstrates how these seemingly bizarre activities function, once you understand the metaphors at play within them. I, however, have given up teaching the Kwakiutl Cannibal Dance. Now I just teach communion -- my favorite Christian ritual after the procession on Palm Sunday. I start from the belief of Christians that they can be made pure only through the cannibalistic consumption of their deity. How can we account for this belief?

I begin by having students explain what communion is to members of the class who are not familiar with it, and we pause to consider the special fact that practices within Christianity vary greatly from one church to another. This is, literally, anthropology 101: Cultural traditions are not internally homogeneous, but demonstrate a wide variation in practice ranging from Roman Catholics (who do believe themselves literally to be cannibals) to Lutherans (who hedge their bets with consubstantiation) to Mennonites (who may go their whole lives without taking communion).

Next, I begin slowly peeling away at the communion service, pointing out the metaphorical associations of consumption and identification which -- as in most Christian rituals -- derive their power by recreating in the here-and-now of the church an event from the there-and-then life of Jesus (I don't burden the students with technical terms like "metricalization of space" and "distal chronotope"). Students pick this up relatively quickly: In communion the priest is to Jesus at the Last Supper as the congregation is to the apostles, just as in the Palm Sunday processional, the priest becomes Jesus entering Jerusalem just as the congregation becomes the crowd welcoming his entrance into Jerusalem.

The original grounding event of the Last Supper thus becomes the source of a metaphorical identification. This is not the end of the matter, however, since the Last Supper itself is Jesus's own elaborate riff on the festival he was celebrating -- Passover. Passover itself is a here-and-now remembrance of the then-and-there event of the angel of death (creepily represented in The Ten Commandments by colored dry ice) passing over Hebrew houses marked with the blood of a lamb.

Having taken the students back to Exodus, we then begin working forward with the image of the blood of lambs, passing successively through pastoral imagery in the Hebrew Bible, prohibitions on the consumption of blood in Leviticus, and all the way forward again to the Roman Catholic liturgy in which God becomes not the shepherd who makes us to lie in green fields, but the lamb who takes away the sins of the world.

I typically wrap up by noting that these metaphors and identifications continue to circulate in our own culture and keep us "soaking" in Christianity. Recently, for instance, I've ended with the new Superman movie, which features a man sent by his omnipotent father to Earth who protects humanity using his supernatural powers, only to be defeated by The Adversary -- Kevin Spacey cum Lex Luthor on a gigantic island made of kryptonite -- but who rises again to triumph in glory and help Kate Bosworth quit smoking.

I think it is one of my better lectures of the semester. I can't take much credit for this fact, however, since the communion service is such a spectacularly well-designed ritual. And of course it's not like I figured all of this symbolism out -- it's a series of connections that Christian theologians have been quite articulate about.

Best of all, this discussion of communion leaves my students turned on to some of the central concerns of my discipline: What is the distinction between "the real" and "the cultural" in a situation where most Christians do not believe in transubstantiation and yet rely on tropes of incorporation (cannibalistically, when Jesus's body enters you, and communally, when this act of alimentation brings you into the "body" of the church) to give their rituals power? How do we characterize the awareness of a participant at a ritual event who "gets it" but may not be able to articulate the play of tropes they experience without their professor's help? What is the status of interpretive social science as a science if it consists (merely?) of re-presenting the knowledge of our informants in a new form?

And speaking of productive tropes -- what sort of metaphorical associations are invoked in my own lecture (a genre that in America at least has an unabashedly homiletic past) when I mobilize the then-and-now of communion in my role as a maverick Jew proselytizing for my own brand of human knowing to a classroom full of potential intellectual converts?

Years ago Gerald Graff argued that the best way to deal with the culture wars was to teach them. My own experience with Christianity, as atypical as it is, has led me to see the value of bringing it into the classroom and making it an issue with my students. Like a lot of worthwhile tasks, it's a tricky one. But I believe it's an important one and -- when done correctly -- fun as well. Above all, I think it is best to realize that when it comes to religion in our schools, the issue is not necessarily what is being taught, but how it is being approached. After all, America is a country where we are "soaking in it" -- even if we are not as fully immersed as some would like.

Alex Golub (info@insidehighered.com)

Alex Golub is an assistant professor of anthropology at the University of Hawaii at Manoa who blogs at Savage Minds. His last column was about joining the tenure track.

Don't Tell Me What I Said. I Know What I Meant

Talking, and barriers to communication, are not new topics for deans. We receive plenty of offers of help with this aspect of our administrative role, all of which recognize, implicitly or explicitly, the central role of communication between deans and others. I receive notices for workshops and offers from consultants on "dealing with difficult employees" and on "how to deal with angry faculty members" (perhaps offer them coffee and/or jawbreakers to calm/slow them down…). There is a cottage industry of providers of higher education management advice regarding faculty-administrative relations.

I am in my 12th year as a dean at an independent residential liberal arts college, with approximately 1,300 students and over 100 full-time faculty members. My college places a high value on "community" as an institutional ideal and on significant faculty investment in the college. While, in principle, faculty know they can do their work anywhere, institutional commitment (or institutional loyalty) competes vigorously with their disciplinary orientation. Faculty very quickly become embedded in courses, programs and friendships that link them with other faculty, often outside their department, and integrate them more strongly into the institution.

The small scale of my type of institution cannot be overemphasized. We are face-to-face every day, whether in classrooms, in offices, in the snack bar or the gym. I work in a multiuse building, with some (not all) administrative offices on the main floor, but faculty and classrooms on the floors above. There are always students, faculty, and staff in the corridors. This small scale means that I am but a phone call or e-mail or shout across the lawn from most faculty, and that I am very much "their" dean. That is to say, responsible for anything on faculty minds. Beyond the formal responsibilities of personnel issues, reviews, curriculum, and so on, my perceived responsibilities and authority can include virtually anything relevant to faculty (objections to the ban on dogs on campus, disputes over the use of the whirlpool in the fieldhouse, furor over certain trees being trimmed, the sense of insult created when the art department is not consulted on selecting the color of woodwork for a new building, etc.).

What are the day-to-day experiences like in a setting with the above characteristics?

Two fundamental principles of sociocultural anthropology play a role in the day-to-day experiences I have as dean.

The first can be found in the introductory chapters to most classic textbooks in sociocultural anthropology. Once referred to as the psychic unity of humankind, the more contemporary argument is, perhaps, along the following lines: Human beings are everywhere rational; yet particular rationalities are defined through and embedded in particular cultures. One of our principal goals as anthropologists is to discover the sense of some particular rationality, often one other than our own -- the sense that people's lives make to them. That is to say, I make the assumption that faculty behavior makes sense, although at times I have to work to discover the sense that it makes -- and, of course, vice versa!

The second principle is not quite as old, but well embedded in our field nonetheless. For me, as a sociolinguist, the version I'm most comfortable with comes from the work of the early ethnomethodologists in their studies of conversational structure and conversational interaction.

Harold Garfinkel, for example, in his now classic 1964 article, "Studies of the Routine Grounds of Everyday Activities," brings our attention to the taken-for-granted scenes of "everyday" social life. He focuses on the silent, unstated common background understandings that give everyday life its "sense" and comprehensibility. He writes: "…common understanding … consists … in the enforceable character of actions in compliance with the expectancies of everyday life as a morality. Common sense knowledge of the facts of social life for the members of the society is institutionalized knowledge of the real world."

For example:

    Bill: Are you going to John’s party tonight?
    Al: I can’t face Karen.  (former girlfriend, now with John)
    Bill: You can come late.  (Karen will leave early to go to work, avoiding awkward contact)
 
Unstated common background information is what links these statements logically to each other and gives the event, the conversation, its coherence. And, importantly, the participants assume a coherence in the interaction; they expect each other to behave in sensible ways. It is this latter assumption of coherence that sends interlocutors "searching" for the sense that talk makes.

Both principles -- the psychic unity of humankind and the role of shared background information as a source of the coherence and efficacy of social interaction -- can help us understand when interactions don't work -- when there are breaches -- as well as when they do. Simply put, failing to take the other's perspective can distort one's understanding of their behavior and create misperceptions of the motivations behind that behavior. Similarly, conversations and interactions can have an illusory, and even misleading, coherence. The assumption of common background information can mask actual differences in those assumptions, including different objectives for the interaction itself, leading to several different realities experienced simultaneously and without recognition. Subsequent recognition of these differences is often exacerbated by the perception of the earlier "good" meeting; if it was a good meeting, and now something has happened inconsistent with that meeting, then someone can't be trusted!

So can faculty members and administrators communicate? Can these (sub)cultures be bridged? Are we forever limited by our respective positions within the institution to experience a different, unique “reality”? 

I am guided in my response to this question by a lecture I recently attended by Zali Gurevitch, a visiting anthropologist, discussing the Israeli-Palestinian conflict. In his analysis, he referred to the difference between thick borders, where multiple cultures can be dealt with only through opposition, and thin borders, which can sustain a multiculture: neither X nor Y alone, but each a part of both, or of something larger. I was inspired to think about differences within the college and the potential to integrate them into a larger coherent whole -- if only people would understand one another's points of view.

Here are two examples:

The tyranny of the techies. In an institutional planning committee meeting, participants were discussing the priority for a technology initiative. This led to expressions of frustration by faculty regarding available technology and frustration from the computer center staff about the lack of interest of many faculty in further technological innovations.

The technician complained that faculty don’t use the full potential of existing software. He argued that digital technology “is most powerful” in that it “changes (positively) the way we do things,” and complained that too many faculty simply wanted technology to do things in existing ways. For him, difficulties arise when a person is incompetent in the use of technology.

The faculty member argued that software and hardware often break down, keeping them from doing what they intend. "It's not reliable. We want it to work."

In these two comments, we can see very different approaches to computer technology. For the computer technician, faculty are forever disappointing the technology by not engaging with its potential. For the faculty member, technology is something we use and problems arise when the technology “breaks down.”

It wasn’t hard to get them on a different track by helping them understand each other’s point of view. And it was now possible to continue the conversation, enlightened by each other’s point of view and assumptions.

Why are we here? Staff are often ignored in those higher education discussions of (faculty vs. administrative) "cultures," particularly middle-management, mid-level staff. It's assumed that for most, it's just a job, 8:00 a.m. – 4:00 p.m., with stipulated holidays, sick days, etc. In contrast, many faculty may continue to look at their work as a "calling" and a "way of life," although this might be dismissed as nonsense by some. However, the small scale of the institution leads to nearly everyone's direct and intensive contact with the primary constituency on our campus -- the students -- whose conversations, aspirations, and relations with staff often help those staff at least understand, if not embrace, the sense of community and common purpose toward which we strive. The result is that a good proportion of the staff develops as much of a sense of "calling" as do some faculty -- although this may surprise those faculty.

A particularly dramatic piece of testimony came to me from a staff person who was reflecting on work at the college:

"Being here has led to a great deal of personal growth, and it is truly the first place I've worked where I feel that I've been able to be a whole person with a well balanced life. [The college] has improved me and really given me the opportunity to change people's lives... including my own. Everyone, at some point in their life, should spend a decade or more at a place like this. It's hard to pinpoint what "it" is that makes this place unique (and at times uniquely frustrating). Perhaps it's because we're trying to live up to an ideal ... rather than just run a business. We're measuring ourselves beyond and aspiring to things that can't be reflected totally on a balance sheet. That's unique."

All this suggests that although some of the subcultural differences may be real, they are not necessarily unbridgeable. With some understanding that people inside the institution have perspectives, aspirations and perhaps even values that are shaped to an extent by the positions they occupy within the structure, it at least becomes possible to integrate those different perspectives into a larger whole and for each constituency to become informed by the perspectives of the others.

Don't Tell Me What I Said. I Know What I Meant

Misunderstandings that follow from seemingly coherent meetings, memos, and other events are bountiful in my world. One recent researcher on conversational misunderstandings has written:

“Whatever the trigger of a misunderstanding or the extent of our misunderstanding, misunderstandings constitute an ordinary feature of human communication. This amounts to relying on a model of communication in which understanding is not granted … communication is not a matter of replication or duplication of thoughts but rather, it entails a model of transformation and interpretation…”

I have never allowed myself to be surprised by the level of creativity, transformation and interpretation of events in which I have played a part and which I thought I understood. And what I rediscover time and time again is the need to find the shared assumptions that underlie conversation in my office.

Here are some examples:

1. Professors are not private investigators. The science division faculty provided the campus safety office with a list of students allowed after-hours entry to the building to work on laboratory projects. One day a student whose name was on that list complained to their biology advisor that a campus safety officer had forced them to leave the building after hours. The next day, the biology professor sent an e-mail to the campus safety director complaining about this -- a reasonable complaint. The campus safety director responded with an e-mail asking the professor to provide him with information on the time of the incident, where it occurred, and exactly what was said. In turn, the professor responded with a very angry e-mail, stating that he was insulted by this request, that it was not his job to do investigations, that this was not "a productive use of [his] time," and that the campus safety officer should simply do what he needed to ensure this didn't happen again.

[This example is, of course, exacerbated by the use of e-mail -- although the fact that it was all copied to me, piece by piece, raises a topic for some other paper on academic culture which one day I will write: that is, the need for an “audience” in person-to-person e-mails.]

I intervened, concerned at the vitriolic tone of the professor’s e-mail. I spoke with the director of security. There was no question that he wanted to correct his officer’s mistake but he needed more information. Based on his experience that students were sometimes reluctant to respond to his direct inquiries, and based on his understanding of the close faculty/student relationships, he thought the easiest and fastest way to get the information -- and get the problem solved -- would be for the faculty member to get the information from the student.

The director was not wrong-headed in his assumptions. He did not consider, however, that his request could be interpreted in other ways, namely, as asking someone else to do what should be his own work. The misunderstanding could have been avoided by the director asking the student directly for the information OR suggesting that the professor put the student in contact with the campus safety officer.

2. Advising? Oh, that's different. For several years, the college has been trying to improve the quality of advising. Here, as at most small schools, all advising is done by continuing faculty members. Upon enrollment, students are assigned an initial advisor based on area of academic interest; upon declaring a major (usually toward the end of the sophomore year), students switch to an advisor requested from their major department.

The director of academic advising is an associate dean of the college who hears regularly from students who are dissatisfied with their pre-major advising experience. "Too bureaucratic," "not really interested in me," and/or "not easy to talk with" represent the kinds of expressions of student dissatisfaction often heard. In the many workshops, public admonishments, and pleas, the associate dean (and the dean) have asked the faculty to take this type of advising more seriously and to care about the full scope of a student's experience, in and out of the classroom, in residential life, etc. And in all these conversations, the deans (if not the faculty as well) have assumed that the principal term of the conversation, i.e., "advising," was defined the same way by all parties. However, a recent conversation between the dean and a faculty member may suggest otherwise:

Professor: I don’t feel comfortable going into all that with advisees. I’m just not equipped to deal with the kinds of problems that’ll come forward.

Dean: But this undermines exactly one of the reasons that the small school experience is attractive to folks. No one’s asking you to be a professional counselor, but you’ve got to know how to refer students to a counselor if they’re needed. No one’s asking you to adopt anyone as a ward!

Professor: It just doesn’t make sense to me. I’m not equipped to do this. It makes me uncomfortable. I’ll help with course registration but I can’t do much more than that.

Dean: But we’re not simply in the business of signing off on course registrations and calling that good mentoring.

Professor: Oh, with mentoring it’s something different. I’m really comfortable talking with the students I mentor about just about anything. I know about their lives, more than I should, and I don’t mind giving advice. But my advisees … well, I just help them register for courses.

The remarks of this professor suggested something new -- that she viewed mentoring as something other than advising.  This caused the dean to make similar inquiries of other faculty and to discover that this was a widely held view.

I don’t mean to suggest that cognitive anthropology will be significantly enriched by the discovery of the structure of this segment of someone’s semantic domain; but it certainly holds the potential for re-orienting the discussion of advising in terms that all parties will understand.

3. But the Dean said…. On a more serious note, there is nothing more difficult than finding yourself in a situation where you are depicted as having reneged on or withdrawn support -- or appear to have done so -- after faculty have gone to great effort to work on a project. In these cases, what I have learned is that the key conversations in my office are often informed by background assumptions and definitions of the situation that are not shared; further, I have learned that not making those different assumptions and definitions explicit leads -- guaranteed -- to misunderstanding and disaster.

Here again, what is involved is understanding the role that common implicit understandings play in the interaction itself. On a small, face-to-face, and traditionally governed campus such as ours, faculty culture includes a strong notion of the dean’s “approval” of a project. The academy is a world of ideas, many of them wonderful. Faculty come to me with ideas -- for special projects, initiatives, department programs, new emphases. In many cases, they want my response before proceeding. I have learned, through examples too painful to describe here, that I must take great care to distinguish my view that a project is “interesting” from what can be taken as formal approval -- to make clear what is worth talking about more and what has a “green light” to proceed.

My focus has not been the great achievements and goals on which decanal leadership, if not reputation, is often based. It’s not that I don’t have such goals (or haven’t actually realized some of them). But at least at a small college, or my little village, to be precise, much of what I do involves talking, listening, facilitating, intervening, coordinating -- all processes that require constant attention to both the multiple perspectives of different constituencies and the different, often tacit, assumptions they bring to their work and, especially, their interactions with each other. The ethnographic life is clearly not limited to field sites.

To be honest, it is easier to reflect upon all this than it is to live it.

On some days, I’ll confess, it’s not clear whether I am part of the problem or part of the solution. On those days, I try to spend as much time as possible stapling and collating -- and stay out of harm’s way.


Author/s: 
Lawrence B. Breitborde
Author's email: 
info@insidehighered.com

Lawrence B. Breitborde is dean of the college and professor of anthropology at Knox College. This essay is adapted from a talk he gave at the 2006 annual meeting of the American Anthropological Association. The examples in this essay are composites, reflecting a range of experiences at his campus and elsewhere, and are designed to shield the identities of those involved.

Be Aware (Beware)

Not all Islamophobes are fanatics. Most, on the contrary, are decent people who just want to live in peace. Islamophobia forms only part of their identity. They grew up fearing Islam, and they still worry about it from time to time, especially during holidays and on certain anniversaries; but many would confess to doubt about just how Islamophobic they feel deep down inside. They may find themselves wondering, for example, if the Koran is really that much more bloodthirsty than the Jewish scriptures (Joshua 6 is plenty murderous) or the Christian (Matthew 10:34 is not exactly comforting).

Unfortunately a handful of troublemakers thrive among them, parasitically. They spew out hatred through Web sites. They seek to silence their critics, and to recruit impressionable young people. Perhaps it is unfair to confuse matters through calling the moderates and the militants by the same name. It would be more fitting to say that the latter are really Islamophobofascists.

Some might find the expression offensive. That is too bad. If we don’t resist Islamophobofascism now, its intolerance can only spread. And we all know who benefits from that. One name in particular comes to mind. It belongs to a fellow who is now presumably living in a cave, drawing up long-term plans for a clash of civilizations.....

Maybe I had better trim the satirical sails before going totally out to sea. As neologisms go, “Islamophobofascism” probably sounds even more stupid than the term it mocks. But there is a point to it.

“Islamofascism” is a noxious and counterproductive term -- a bludgeon disguised as an idea. Its use comes at a cost, even beyond the obvious one that goes with making people dumber. “Islamofascism” is the preferred term of those who don’t see any distinction between Al Qaeda, the Iranian mullahs, and the Baathists. Guess what? They are different, which might just have been worth understanding a few years ago. (Better late than never, maybe; but not a whole lot better.)

The more serious consequence, over the long term, is that of offering deliberate insult to those Muslims who would be put to the sword under the reign of Jihadi fundamentalists. Disgust for cheap stunts done in the name of “Islamofascism awareness” is not a matter of doubting that the jihadis mean what they say. On the contrary, it goes with taking them seriously as enemies.

It should not be necessary to qualify that last point. Somebody who wants to kill you is your enemy, whether you care to think in such terms or not; and the followers of Bin Laden, while subtle on some matters, have at least not been shy about letting us know what methods they consider permissible in pursuit of their ends. The jihadis mean it. Recognizing this is not a matter of Islamophobia; it is a matter of paying attention.

And paying attention means, in this case, recognizing that most Muslims are not our enemies. It is disgraceful to have to spell that out. But let’s be clear about something: The jihadis are not our only problem. As anyone from abroad who likes and respects Americans will probably tell you, we tend to be our own worst enemy.

There is a strain of nativism, xenophobia, and small-mindedness in American life that is always there -- often subdued, but never too far out of earshot. To call this our fascist streak would be absurdly melodramatic. Fascism proper was, above all, purposeful and orderly, while fear and loathing towards the “un-American” is often enough the woolliest form of baffled resentment: the effect of comfortable ignorance turning sour at any demand on its meager resources of attention and sympathy.

This quality can subsist for long periods in a dormant or distracted state -- expressing itself in muttering or small-scale acts of hostility, but nothing large-scale. Perhaps it is restrained by the better angels of our nature.

But it means that the unscrupulous and the obtuse have a ready supply of raw material to mold into something vile when the occasion becomes available, or if there is some profit in it. H.L. Mencken explained that a demagogue is “one who will preach doctrines he knows to be untrue to men he knows to be idiots." The problem with this definition, of course, is that it is the product of a simpler era and so not nearly cynical enough. For a demagogue now, truth and knowledge have nothing to do with it.

For the really suave expression of Islamophobofascism, however, no local sideshow can compete with an interview that the British novelist Martin Amis gave last year. At the highest stages of cosmopolitan literary influence, it seems, one may express ideas worthy of a manic loon phoning a radio talk-show and get them published in the London Times.

“There’s a definite urge -- don’t you have it? -- to say, ‘The Muslim community will have to suffer until it gets its house in order,’ ” Amis said. “What sort of suffering? Not letting them travel. Deportation -- further down the road. Curtailing of freedoms. Strip-searching people who look like they’re from the Middle East or from Pakistan.… Discriminatory stuff, until it hurts the whole community and they start getting tough with their children.”

The cultural theorist Terry Eagleton issued a response to Amis in the preface to a new edition of his book “Ideology: An Introduction” -- first published in 1991 by Verso, which reissued it a few weeks ago. It stirred up a tiny tempest in the British press, which reduced the argument to the dimensions of a clash between two “bad boys” (albeit ones grown quite long in the tooth).

Quickly mounting to impressive heights of inanity, the coverage and commentary managed somehow to ignore the actual substance of the dispute: what Amis said (his explicit call to persecute all Muslims until they acted right) and how Eagleton responded.

“Joseph Stalin seems not to be Amis’s favorite historical character,” wrote Eagleton, alluding to the novelist’s Koba the Dread, a venture into Soviet political history published a while back. “Yet there is a good dose of Stalinism in the current right-wing notion that a spot of rough stuff may be justified by the end in view. Not just roughing up actual or intending criminals, mind, but the calculated harassment of a whole population. Amis is not recommending such tactics for criminals or suspects only; he is recommending them as a way of humiliating and insulting certain kinds of men and women at random, so they will return home and teach their children to be nice to the White Man. There seems to be something mildly defective about this logic.”

Eagleton’s introduction doesn’t underestimate the virulence of the jihadists. But his remarks do at least have the good sense to acknowledge that humiliation is a weapon that will not work in the long run. (As an aside, let me note that some of us don't have the luxury of either ignoring terrorism or regarding it as something that will be abated by a more aggressive posture in the world. Life in Washington, D.C., for the past several years has meant rarely getting on the subway without wondering if this might be the day. The "surge" did not reduce the faint background radiation of dread one little bit. Funny how these things work out, or don't.)

Anybody with an ounce of brains and responsibility can tell that fostering an environment of hysteria is useful only to one side of this conflict. “The best way to preserve one’s values,” writes Eagleton, “is to practice them.” Well said; and worth keeping in mind whenever the Islamophobofascists start to rush about, trying to drum up some business.

We shouldn't regard them as just nuisances. They are something much more dangerous. Determined to turn the whole world against us, they act as sleeper cells of malice and stupidity. There are sober ways to respond to danger, and insane ways. It is the demagogue’s stock in trade to blur the distinction.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

It's All Geek to Me

We have, by contemporary standards, a mixed marriage, for I am a nerd, while my wife is a geek. A good thing no kids are involved; we’d argue about how to raise them.

As a nerd, my bias is towards paper-and-ink books, and while I do indeed use information technology, asking a coherent question about how any of it works is evidently beyond me. A geek, by contrast, knows source code....has strong opinions about source code....can talk to other geeks about source code, and at some length. (One imagines them doing so via high-pitched clicking noises.) My wife understands network protocols. I think that Network Protocols would be a pretty good name for a retro-‘90s dance band.

This is more than a matter of temperament. It is a cultural difference that makes a difference. The nerd/geek divide manifested itself at the recent meeting of the Association of American University Presses, for example. Most people in scholarly publishing are nerds. But they feel like people now want them to become geeks, and this is not an expectation likely to yield happiness.

Christopher M. Kelty’s Two Bits: The Cultural Significance of Free Software, just published in dead-tree format by Duke University Press, might help foster understanding between the tribes. The book itself is available for free online. (The author also contributes to the popular academic group-blog Savage Minds.)

Kelty, an assistant professor of anthropology at Rice University, has done years of fieldwork among geeks, but Two Bits is not really a work of ethnography. Instead of describing geek life at the level of everyday experience or identity-shaping rituals, Kelty digs into the history and broader implications of one core element of geek identity and activity: the question of “open source” or “free” software. Those terms are loaded, and not quite equivalent, even if the nuance tends to be lost on outsiders. At issue, in either case, is not just the availability to users of particular programs, but full access to their inner workings – so that geeks can tinker, experiment, and invent new uses.

The expression “Free Software,” as Kelty capitalizes it, has overtones of a social movement, for which openness and transparency are values that can be embedded in technology itself, and then spread throughout institutions that use it. By contrast, the newer usage “open source,” a term coined in 1998, tends to be used when the element of openness is seen as a “development methodology” that is pragmatically useful without necessarily having major consequences. The two terms have coexisted since then. The fact that they are identical in reference yet point to a substantial difference of perspective is important. “It was in 1998-99,” writes Kelty, “that geeks came to recognize that they were all doing the same thing and, almost immediately, to argue about it.”

Much of Two Bits is devoted to accounts of how such arguments unfolded amidst the development of particular digital projects, with the author as a participant observer in one of them, Connexions (an online resource for the collaborative production of textbooks and curricular materials, previously discussed here). A merely nerdish reader may find some of this tough going. But the upshot of Two Bits is that geekery has constituted itself – through freeware, or whatever you want to call it – as what Kelty calls a “recursive” public sphere, with important implications for cultural life outside its borders.

Any strong notion of a public sphere is going to see the public as, in Kelty’s words, “a collective that asserts itself as a check on other constituted forms of power – like states, the church, and corporations – but which remains independent of those domains of power.”

The hard question, most of the time, is whether or not such a public actually exists. The journalist and social thinker Walter Lippmann considered the health of the public in three books he wrote during the 1920s, each volume gloomier than the last. And when Jurgen Habermas revisited the concept in the early 1960s, he concluded that the public sphere as a space of debate and rational deliberation had been at its most robust in the 18th century. More recently, Americans have made a hit out of a game show called “Are You Smarter than a Fifth Grader?” in which adult contestants routinely prove that they are not, in fact, smarter than a fifth grader. All things considered, the idea of the public as a force that “asserts itself as a check on other constituted forms of power .... but which remains independent of those domains of power” does not seem to have much traction.

But geekdom (in Kelty’s analysis anyway) fosters a much more engaged ethos than that associated with earlier forms of mass media. This is not simply a matter of the well-known penchant for libertarianism in the tech world, about which there is probably not much new worth saying. (If consenting adults want to talk about Ayn Rand, that’s OK as long as I don’t have to listen.) Rather, the whole process of creating and distributing free software is itself, to borrow a programming term, recursive.

Per the OED, recursivity involves “a repeated procedure such that the required result at each step except the last is given in terms of the result(s) of the next step, until ... a terminus is reached with an outright evaluation of the result.”
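(A parenthetical illustration for fellow nerds, and one that is mine rather than Kelty's: a minimal sketch in Python, chosen only because it reads almost like English, of the kind of procedure the OED is describing. Each step hands its work off to the next step, until a terminus is reached and evaluated outright.)

    def factorial(n):
        # The terminus: the last step is an outright evaluation.
        if n == 0:
            return 1
        # Every earlier step states its result in terms of the next step's result.
        return n * factorial(n - 1)

    print(factorial(5))  # 120: 5 * 4 * 3 * 2 * 1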

Something like that dynamic -- the combination of forward motion, regressive processing, and cumulative feedback -- is found in geekdom’s approach to collaboration and evaluation. The discussions are not purely technical; they involve arguments over questions of transparency and the ethical implications of software.

“A recursive public,” writes Kelty, “is a public that is vitally concerned with the material and practical maintenance and modification of the technical, legal, practical, and conceptual means of its own existence as a public; it is a collective independent of other forms of constituted power and is capable of speaking to existing forms of power through the production of actually existing alternatives.” (Those alternatives take the form of technology that the rest of us use, whether we understand it or not.)

Two Bits is an effort to analyze the source code, so to speak, of geekdom itself. How the larger culture interacts with it, and is shaped by it, is a subject for another study. Or for quite a few of them, rather, in due course. For now, I think Kelty’s book deserves a wide readership -- especially among nerds trying to make sense of the past decade, let alone to prepare for the next one.

Author/s: 
Scott McLemee
Author's email: 
scott.mclemee@insidehighered.com

Fear and Humiliation as Legitimate Teaching Methods

A psychoanalytically inclined friend of mine once told me that you can tell the important dreams not because you know what they mean, but because you can't get them out of your head. As an anthropologist I've noticed something similar about ethnographic fieldwork: You live through moments that immediately seem important to you, but it is only after chewing them over that you realize why. I had one such moment recently that taught me, deep down, that I firmly believe in the power of fear and humiliation as teaching methods. This insight came to me late last month in the course of having my ass kicked repeatedly by Kael'thas Sunstrider, son of Anasterian, prince of Quel'Thalas, and servant of Kil'jaeden the Deceiver.

This high valuation of fear and humiliation is not the sort of thing that you hear at the pep talks organized at your campus teaching and learning center. Perhaps this is not surprising given the non-traditional subject which provoked it. I study people who play World of Warcraft. Warcraft is one of the world's most popular videogames, home to over 10 million people who enter its high-fantasy world to become murloc-slaying gnomes and felsteel-smelting blacksmiths.

As players slay monsters and explore dungeons, their characters progress, become more powerful, and develop an inventory of ever more powerful gear. There are lots of things you can do in-game, from player-versus-player battlefields reminiscent of arcade-game shoot ‘em ups to obsessive hoarding of gold earned by, for instance, picking rare herbs and selling them to other players.

People play Warcraft for many reasons, but the guild that I am studying plays it to raid. Four times a week we get a posse of 25 people together to spend four hours exploring the most inaccessible, difficult dungeons in the game, finding computer-controlled “bosses” of ever-increasing difficulty, and slaying them. Of all of the things to do in World of Warcraft, raiding is the hardest and most intense. It requires powerful characters and careful planning. Of the 10 million people who play Warcraft, 9 million have never even set foot inside the places we have been, much less kicked the ass of the bad guys that we found there. We have a Web site, we have headsets, and we are serious. I don't study “the video game as genre.” I study the way American cultures of teamwork and achievement shape online interaction. As an observer my mind boggles at the 20-80 hours my guildies spend in-game every week. As a participant I'm super proud of our accomplishments.

Enough exposition. In late September our target was Kael'thas Sunstrider, the blood elf prince who broods in the floating Naaru citadel of Tempest Keep. The fight against Kael is legendary for its intricacy: First the raid must defeat each of his four advisors in turn. Then his arsenal of magic weapons must be overcome and turned against the advisors, whom Kael resurrects. Finally the raid has the opportunity to fight Kael and his pet phoenix. In the final stage of the fight, the raid must struggle to down Kael as he removes the gravity from the room and leaves the raid hanging, literally, in mid-air. Whole guilds have broken up in rancorous self-hatred after struggling unsuccessfully to down him.

Recently we tried to get some help by inviting to our raid members of another guild, which had already downed Kael. Almost immediately I could see why its members were successful -- their raid leader did not pull his punches. In the middle of a fight I would hear him saying things like "Xibby, don't think I don't see you healing melee -- please do your job and focus on the tank." At times -- like when our Paladin failed repeatedly to engage Thaladred the Darkener, who responded by repeatedly blowing up our warlocks -- voices were raised.

I was impressed by their professionalism, their commitment to high standards, and their leader's willingness to call people out when they made mistakes, but most of my guildmates didn't feel that way when we chatted after the raid in our online guild chat.

"i’m sorry but my husband dosen’t curse at me and no guy on wow will either" said Darkembrace, a shadowpriest who was also a stay-at-home mom in Virginia with a 3 year old daughter and a 75 pound rottweiler in the IM discussion.

"yeah," said our 18 year old tree druid Algernon, summing up the mood succinctly. "fuk them please never invite them back lol"

That raid passed into the guild's collective memory without further ado but, like an important dream, it kept running through my head. I had always known that raiding is a form of learning. It takes weeks of time and dozens of deaths before a guild-first boss kill, and even more time until a boss is so routinely killable that he is, as we say, “on farm.” But it wasn't until those Kael attempts that I realized just how similar raiding and teaching are.

A 25-person raid is the same size as a class, and like a class its leader can only take it to places that it is willing to go. Teaching, like learning to down a boss, is about helping people grow their comfort zone by getting them to spend time outside of it. The question is how to push people so that they will be ready to learn, instead of ready to tear their hair out.

Raiding has taught me that being a good teacher requires laying down strict guidelines while simultaneously demonstrating real care for your students. The stronger the ties of trust and respect between teacher and student, the more weight they will bear. In the past I've cringed when my raid leaders cheerfully announced that we would spend the next four hours dying over, and over, and over again to a boss who seemed impossible to defeat. But I've trusted them, done my job, and ultimately we have triumphed because they insisted on perseverance. The visiting raid leader who took us through the Kael raid lacked that history with us -- he was too much of a stranger to ask us to dig deep and give big.

A willingness to take risks can also be shored up by commitment and drive. Our guest leader drove my guildies nuts, but impressed me with his professionalism. Does this mean that after graduate school even generous doses of sadism seem unremarkable? Perhaps. But it also indicates that I was willing to work hard to see Kael dead, even if it meant catching some flak. For them, it was a game, and when it stopped being fun they lost interest.

What I learned that night was that I believe in the power of fear and humiliation as teaching methods. Obviously, I don't think they are teaching methods that should be used often, or be at the heart of our pedagogy. But I do think that there are occasions when it is appropriate to let people know that there is no safety net. There are times -- not all the time, or most of the time, but occasionally and inevitably -- when you have to tell people to shut up and do their job. I’m not happy to discover that I believe this, and in some ways I wish I didn’t. But Warcraft has taught me that there is a place for "sink or swim" methods in teaching.

We never did get Kael down. Shortly after our shared guild run, the powers that rule the World of Warcraft decided that the Kael fight was too hard and "nerfed" it -- made him lighter, fluffier, and easier to kill. We’re headed back in on Thursday, but our victory now seems as hollow as it will be inevitable. My guildies will take the nerf and love it, because burning down a boss that used to wipe them out will make them feel like gods. To me it will be a disappointment, because their pleasure in victory will be proof that we were never willing to do what we had to in order to become the kind of people who didn’t need the nerf.

Teaching is about empowering students, and Warcraft has taught me that there is a difference between being powerful and feeling powerful. We had a chance to grow as a guild, but in the end we just couldn't hack it. In the course of all this I learned that I am a person who believes that there are some things in life too important for us to give up just because achieving them might make us uncomfortable.

Anthropologists love to tell stories of their emotional communion with the people they study. This story ends on a darker note, because what I learned from my attempts to kill Kael'thas Sunstrider was that I was not the same kind of person as my guildies -- a fact made even more disconcerting by the fact that we are supposed to be members of the "same" culture. My fieldwork has not taught me to find commonality across cultures, but to see diversity within my own. Playing Warcraft has taught me that I have a dark side when it comes to pedagogy which I wish I didn't have -- I’ve realized that a seam of commitment that surfaced in one place in my biography lies hidden in another. Does this mean my guildies need to care more, or that I need to learn to care less? It’s a question that I try not to ask, because I’m afraid I might not like the answer.

Author/s: 
Alex Golub
Author's email: 
info@insidehighered.com

Alex Golub is an assistant professor of anthropology at the University of Hawaii at Manoa who blogs at Savage Minds.

The Relevance of the Humanities

The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.

Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”

In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that the humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”

There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against such complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.

This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.

Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?

A report published by the British Academy in September contains some valuable guidance. It argues that collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: the Humanities and Social Sciences in Public Policy Making,” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.

The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.

The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.

The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis to work in the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem that have been waning in recent decades.

The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.

To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.

The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly among researchers who fear that scholarship will be placed in the service of war and counter-insurgency in Iraq and Afghanistan, and that the result will be ideologically distorted work.

Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies,” according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative or block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, much reformed, could serve as a model upon which future university-government interaction is built.

Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of a series of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?

What concrete forms could such university-government collaboration take? There are several immediate steps that could be taken. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding the budgets as well as scope of these offices could produce immediate benefits.

Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to the production of more effective policies. Special care must be taken to ensure that scholarly standards are not compromised.

Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy relevant, though still rigorous, scholarship. Fourth, university presidents of all institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.

The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or the exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences play a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.

Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.

Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.

But for those scholars whose work can shed light on, and contribute to the solution of, the massive public conundrums the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is a luxury universities cannot afford at the moment; the present conjuncture requires enhanced public engagement, and the stakes are too high to stand aside.

Author/s: 
Gabriel Paquette
Author's email: 
doug.lederman@insidehighered.com

Gabriel Paquette is a lecturer in the history department at Harvard University.

The Hope of Audacity

I am sick of reading about Malcolm Gladwell’s hair.

Sure, The New Yorker writer has funny hair. It has been big. Very big. It is audacious hair, hair that dares you not to notice it; hair that has been mentioned in far too many reviews. Malcolm Gladwell’s hair is its own thing.

Which is only appropriate, since in his writing, Gladwell has always gone his own way. But he’s been doing it long enough, and so well, and has made so much money, that some folks feel it’s time to trim him down to size. That hair is now seen as uppity.

Gladwell is a mere journalist. He’s not shy, and like many children of academics, he is not intimidated by eggheads. He does none of his own primary research, and instead scours academic journals to find interesting ideas -- he collects experiments and experimenters. He is a translator and a synthesizer, and comes up with catchy, sprightly titled theories to explain what he has seen. Some have called him a parasite. He has called himself a parasite.

It seems to me there’s always been a bit of snarkiness attached to discussions of Gladwell’s work. This is often the case for books that have become commercially successful, which is something that seems particularly to stick in the collective academic craw. There is a weird hostility in the reviews of Gladwell’s books that is directed not at the big-haired guy himself who, like a puppy, nips at the heels of academics and then relishes the opportunity to render their work into fluid, transparent prose, but toward those many people who have made Gladwell famous: his readers. No one matches the caustic condescension of Richard Posner, who said, in a review of Gladwell’s Blink, that “it’s a book for people who don’t read books.”

The reviews of Outliers, Gladwell’s latest book, show that even a New Yorker writer can go too far. People are now attacking Malcolm Gladwell as a kind of brand. The critiques boil down to a few things, one of which is that he doesn’t take into account evidence that refutes his theories. In other words, he’s not doing careful scholarship. But we all know that even careful scholarship is a game of picking and choosing -- it just includes more footnotes acknowledging this. And Gladwell never pretends to be doing scholarship.

Gladwell is also accused of being too entertaining. He takes creaky academic work and breathes Frankensteinian life into it. He weaves anecdotes together, creating a tapestry that builds to an argument that seems convincing. This, some reviewers have claimed, is like perpetrating a fraud on the (non-academic) reading public: because Gladwell makes it so much fun to follow him on his intellectual journey, he’s going to convince people of things that aren’t provably, academically true. He will lull the hoi polloi into thinking they’re reading something serious.

Which is, of course, the most common complaint about Gladwell: He’s not serious enough. He’s having too much fun playing with his ideas. And, really, you can’t be Serious when you’re raking in so much coin. Anyone who gets paid four million bucks for a book that mines academic work -- and not necessarily the stuff that is agreed to be Important -- is going to become a target. His speaking fees are beyond the budgets of most colleges. In this way, his career is now similar to that of David Sedaris, who can command an impressive audience and still be dissed by the literary folks. Everyone who’s anyone knows that you can’t sell a lot of books and be a serious writer. Just ask Jonathan Franzen. Or Toni Morrison.

I don’t see Gladwell as a social scientist-manqué, or a philosopher wannabe. Instead, I read him more like an essayist. I think of his books as well-written, research-packed, extended essays. Let me show you the evils of imperialism by telling you a story about the time in Burma when I was forced to shoot an elephant. Let’s look at this (bad) academic prose and think about the relationship between politics and the English language. But instead of using his own experiences, he builds on work done by others. He uses a wry, quirky approach and blithely ignores the received wisdom and pieties of academe. He doesn’t seek out the researcher who’s highly regarded within her field; he looks for people who are doing things he finds interesting.

Gladwell reminds me of the kind of student I knew in college, the nerd who takes weird and arcane courses and then rushes from the lecture hall excited about some idea the professor has mentioned in passing and goes straight to the library to pursue it himself. He stays up all night talking about it, and convincing you that even though you were in the same class, and heard the same reference, you have somehow missed something. Maybe not something big, but at least something really, really cool.

Perhaps I have more trust in readers than to believe that they can be so easily bought off by a good story. And I wish that academics, instead of pillorying Gladwell for being good at translating complicated ideas, would study the way he does it and apply some portion of his method to their own work: He makes mini trade books of monographs. Surely this is a lesson worth learning. He uses the narrative art of the magazine writer to animate ideas. He profiles theories the way Gay Talese or Joan Didion did celebrities.

The audacity Gladwell shows in his writing, connecting seemingly disparate things and working hard, yet with apparent effortlessness, to make the ideas engaging, gives me hope for the future of books. It makes me feel better to see folks buying Gladwell rather than the swimmer Michael Phelps’s memoir or vampire novels -- not that there’s anything wrong with that. Yet this same audacity is what gets Gladwell into hot water with academics. He’s not supposed to do this.

Unless you are an aged physicist, you don’t really get to write books that “purport to explain the world.” You can, of course, try to explicate tiny portions of it. Science writers like James Gleick and Jonathan Weiner can go a lot further than most scientists in terms of making arcane principles understandable to the Joe the Plumbers of the reading world, and no one gets bent out of shape. Perhaps it’s because of the assumption that scientists, with a few notable (often British) exceptions, are not supposed to be able to write books that normal people can read. Social scientists and historians are, however, expected to know what is interesting and important about their work and present it to the public. Brand-name thinkers like Susan Sontag and Martha Nussbaum can take on big ideas. But these people are experts; journalists shouldn’t try this at home.

What I love about Gladwell is that his writing is like his hair. You can see it as arrogant or scary (he writes about being stopped more frequently by cops when he had a big afro), or you can see it as playful and audacious. This is why, of course, so many reviews mention it; he has the right hair for his work.

One final, dour complaint about Gladwell has to do with his relentless cheeriness. He thinks that people are basically good, though he understands that sometimes circumstances aren’t. I can’t abide high-brow literary novelists who trash fiction that “cops out” with a happy ending. Maybe I’m hopelessly low-brow: I still love Jane Austen and Shakespeare’s comedies. The academic response to most things is generally: it’s more complicated than that. And sure, much of the time it is. But if something’s artfully crafted, I’m willing to cut the author some slack. I don’t ever expect to be thoroughly persuaded of anything; I’m characterologically skeptical and like to do the thinking on my own. Gladwell’s books invite me into a conversation. I think that’s part of the job of a good book.

For me, reading Malcolm Gladwell’s books is like watching Frank Capra movies. Just because they make you feel good and keep you entertained doesn’t mean that they’re not doing valuable work or tackling hard and real issues and ideas. Sure, someone else could have handled it differently. George Bailey might have finally committed suicide; the bank in Bedford Falls could have asked for a government bailout. But right now, maybe it’s not such a bad thing to read books that are a little more hopeful. And yes, audacious.

Author/s: 
Rachel Toor
Author's email: 
newsroom@insidehighered.com

Rachel Toor teaches in the MFA program at Eastern Washington University. She writes a monthly column for The Chronicle of Higher Education, and her most recent book is Personal Record: A Love Affair With Running. Her Web site is www.racheltoor.com.
