Anthropology

Jewish in Polynesia

As a Jewish professor, I know that it is my lot in life to deal with stereotypes of Jewish academics. As a Jewish professor from California, dealing with these stereotypes is even more difficult because I lack recourse to the solution favored by many colleagues: acting as if the complex negotiation of my identity can be accomplished simply by assuming that "Jewish" means "from New York" and leaving it at that. As a Jewish professor from California who teaches in Hawaii, I have found that navigating my identity as a practicing Reform Jew, both in the classroom and out, has taken many surprising twists and turns.

Oxford University Press's Judaism: A Very Short Introduction notes astutely that Jews, like tomatoes, are not "particularly complicated or obscure when left to themselves," but they "don't neatly fit into the handy categories such as fruit or vegetable or nation and religion which are so useful for pigeonholing other foods and people." Growing up in northern California, I went to a high school where the blanket term "Asian" was scrupulously decomposed into a wide variety of ethnicities, which included not just Chinese, Japanese, and Korean, but Hmong, Miao, Mien, Lao, Hongkie, Taiwanese, and so forth. When I got lunch at Vang's convenience store, my Thai friend grumbled about "those hill people." But for him, as for me, there was only one kind of white person: the white kind.

It was not until I moved to Chicago for graduate school that I realized that there were different kinds of white people. Growing up in Reagan's America, I took "Marxism" to mean a critique of the soullessness of suburban life. Exploitation was not about class -- it was about Mexican-Anglo relations. While I understood that my religion made me different from most people, it didn't seem to make me any more distinctive than the guy in my class whose family had a time-share in Tahoe: I missed a few days of class for high holy days, he missed them for the time-share. But on the south side of Chicago, class became an inescapable fact of life, and "color" meant "black" and "white."

That I could understand. But I was particularly puzzled by religion as a source of social differentiation in America. I traveled to Minnesota and visited small towns, which featured intersections with churches on every corner. Why did the Missouri Synod Lutherans need one church and the Wisconsin Synod Lutherans need another? And how was all this related to the graduate student parties where bizarre passes would occasionally be made at me by women whose complex psychological relationship with novels like Portnoy's Complaint and Ravelstein had driven them out of their dairy-rich farming communities and into the arms of a cosmopolitan intellectualism which they expected me to embody?

My dissertation committee consisted of three Jewish structuralists and a Protestant interested in performativity. The Protestant member of my committee claimed that reading Kierkegaard's analysis of the sacrifice of Isaac through a Derridean lens could help explain nationalism in Indonesia, but this was the closest I actually got to Judaism as a religious phenomenon. Actually that is not true. At one point as I was driving in the car with one member of my committee, she pointed out a kosher butcher shop and told me that was where another member of my committee went "for really good meat." But that was it -- my committee was alarmed when I suggested that Judaism was not actually synonymous with being an atheist intellectual, or even with knowing where to get a pound of lean pastrami.

I originally felt my move to Hawaii would be a sort of homecoming -- a return to the multicultural environment of my childhood and an end to the terrible, terrible cold I had suffered through in the Midwest. In fact I was in for a bit of a shock. Hawaii has a unique local culture derived from the state's legacy of plantation colonialism and its overthrow at the hands of a strong labor movement. As a result Hawaii owes much to the Japanese, Chinese, and Portuguese workers who moved here to cut cane. And of course there is the rich tradition of native Hawaiian culture, which has experienced a renaissance here in the past 30 years. Since the United States has long been the inheritor of Spanish colonialism in the Pacific, our islands are called home by increasing numbers of Chamorros and Filipinos. The growing number of migrants from Samoa and Tonga allows Hawaii to challenge Auckland as the unofficial capital of Polynesia. And there is no doubt that Honolulu -- the forward point for the projection of U.S. political and military power into the Pacific -- has long been a center for Micronesian migration.

Further, Hawaii has one of the highest rates of intermarriage in the country, and the place is remarkably cosmopolitan given its small size and distance from major centers. The result of all this is that my students are more likely to visit Saipan than Schenectady, and know more about Pago Pago than Paris. It soon became apparent that the welcome return to my childhood relationship with my Jewish identity was not to be had -- and for reasons more enduring than the fact that the Web site for my new shul in Honolulu was "shaloha.com."

A great deal of my Introduction to Cultural Anthropology course involves getting students to rethink ideas of race and ethnicity in light of the anthropological concept of culture. However, I feel very uncomfortable objectifying my students in class by asking them "as an Asian, how do you feel about this?" or by lecturing my African-American students about supposedly innate black athletic ability. On the mainland I solved this problem by objectifying myself and examining, for instance, stereotypes about Jews. In fact I typically use the tomato imagery from Oxford's Very Short Introduction to Judaism. "So," I said in my first class in the islands, dry erase marker in hand and ready to make a discussion-spurring list, "what are some stereotypes you have of Jewish people?"

Silence.

"Please," I say generously, "This classroom is a safe place where people can discuss controversial topics civilly, so don't feel you need to spare my feelings. So: what are some stereotypes of Jews?"

Silence. As a new professor I had read countless books and articles exhorting me not to get freaked out if it took a while for someone to say something. But this time I got nothing. Nada.

"How about the idea that Jews are good with money?" I finally offer.

"You mean like they're pake?" asked one confused woman, using a
Hawaiian term originally meant to describe Chinese people, but which in
local slang simply means tight-fisted.

In this and other classes I quickly came to realize that when it comes to Jews, the Hawaiian response to the question "is a tomato a vegetable or a fruit?" is to ask "what's a tomato?" In California, my identity as a Jew wasn't particularly relevant. In Honolulu, I am pretty much off the table insofar as the ethnic imagination of my students goes. All white people are haole -- a Hawaiian word with slightly derogatory connotations. (One of my students wears a T-shirt to class that reads "Haole you flew here I grew here.")

The problem was not just that my students didn't know that I was a tomato; they were often a little unclear on the idea that people must be sorted into fruits and vegetables. To put it another way: it is difficult to expose the culturally contingent nature of your students' essentialist folk theories of identity when they have names like Motoko Kapualani da Silva or Brian Ka'imikaua Li. This latter student claimed to be "Japanese, Filipino, and Hawaiian." I pointed out to him that his last name was Chinese. He paused and thought about it for a second and then remembered that yes, his family was also Chinese, but he had never really thought of "Li" as a Chinese name. By speaking frankly about my own identity with my students I learned that they did not operate with the same concepts of race and ethnicity that my students on the mainland did, and this insight allowed me to teach anthropology in a way that was accessible to them.

Now when I teach my intro class I engage my students' expectations about ethnic difference by approaching some of the aporias of identity in Hawaii. How does the selective retention of taboos and purity laws by Orthodox Jews provide a model of how (or how not!) to creatively innovate on one's tradition? Why do we speak of Hawaii as a multicultural paradise when there is so much racial tension simmering under the surface? Is the distinction between "local" and "haole" one of race? Of class? What does it mean to talk about "ancient Hawaiian tradition" with a professor whose people lived in diaspora for a millennium before the first Hawaiians arrived in the islands? Why are haole tourists noisy, rude, and overbearing compared to locals? How can we use the concept of culture to render this comportment intelligible?

In sum, living in Hawaii has forced me to rethink not just my own Jewish heritage, but the issue of heritage in general. As an anthropologist I find the challenge of working through these multiple layers of identity and (as we say in the business) "group affiliation" to be both professionally and personally rewarding. And most important of all, it provides my students a chance to grow intellectually by thinking about identity and belonging in ways they may not have before. In the next few semesters I plan to put together a class entitled "Kohen and Kahuna," based on an unpublished manuscript by a rabbi and sociologist who studied at the University of Hawaii in the 1940s. We will not discuss Woody Allen films or where to get a pound of lean pastrami. We will discuss the comparative study of taboo, social complexity in Polynesia and the ancient Middle East, and anxiety about not being able to chant properly in your heritage language. Shaloha, everyone!


Alex Golub finished his dissertation in anthropology at the University of Chicago in 2005 and is now an adjunct professor at the University of Hawaii at Manoa. He blogs at Savage Minds, a group blog about cultural anthropology.

Christianity: You're Soaking in It

Inside Higher Ed's recent article on the prevalence of religion on campus came as no surprise to me. Although it has been about a month since the article came out, I still think that the time is ripe for me to make a confession: I actively incorporate the gospel of Christ into my teaching -- although not for the reason you might think.

I myself am not Christian -- as some readers may remember, I'm Jewish. I am, however, a passionate choral singer with an interest in music of the Baroque and Renaissance, and it is hard to find secular ensembles that perform this repertoire. As a result I spend a lot of time in church.

After I get done writing this, for instance, I'm off to my second gig of the week -- a compline service sung in candlelight featuring a procession (often with handbells), the complete chanted Anglican rite for the office of compline, two anthems, a nunc dimittis, the office hymn (we have over 25 settings of te lucis ante terminum), an orison, and a psalm. The group itself is quite small -- 10 men (countertenors sing the treble parts) who rehearse without accompaniment for 90 minutes before our performance.

I'm very proud that I have what it takes to sing in a group that operates at such a professional level. But singing in church means more to me than just pride in my musicianship -- it is part of the more general rapprochement I've had with Christianity. I went to a college where people were more likely to take acid than communion, and my last visit to the Bay Area involved both stopping at anarchist communes and picking up equipment to broadcast an illegal radio channel from our car while road-tripping up to Burning Man. So you see, attending church seems infinitely more transgressive of my social norms than does, say, running around naked in the middle of the desert while dosed to the gills on synthetic mescaline.

This generally lefty background, combined with my religious background, means that Christianity was something that I only ever heard about from my friends, whose lifestyles were an elaborate form of rebellion against it. As a result it's been a bit of a surprise to me to discover that the religion had any redeeming features at all. But in fact my time as a chorister has given me the opportunity to meet Christians whose faith has led them to lives of remarkable compassion, caring, and integrity and to appreciate the power and value that the Christian faith has for its followers.

The other thing that singing in church makes you realize about Christians is that they're everywhere. When it first hit me, this revelation filled me with the same shock that fills those ladies in the old Palmolive commercials -- I'm soaking in it! These "Palmolive" moments continued into the classroom, and it soon became apparent to me that many of my students -- who looked perfectly normal -- actually considered Jesus Christ to be their personal savior.

I quickly noticed that being a Jewish church musician gave me something in common with my Christian students -- indeed, in some strange way I knew more about their religion than they did. This is because I am, like all church musicians, a liturgy junkie. My students think of the celebration we just observed as "Halloween," while for me it was the 20th Sunday after Pentecost. Many of my students know the Lord's Prayer by heart, but few of them can spontaneously spout the text of the Magnificat in both English and Latin. And while I occasionally get a student who knows that Easter is somehow tied to Passover, I've never encountered one who knew that Pentecost was actually Shavuot.

My decision to begin incorporating the anthropology of Christianity into my classes was premised on the belief that, academically speaking, Christianity could be used to soften hands while I did the dishes. That is to say, I realized that I didn't just have to let the fact that my classes were saturated with Christianity go unremarked. Rather than simply soak in it, I could use it to further the goals of the class. Even the fact that my students came from diverse faith backgrounds within and without Christianity could be foregrounded as a way of asking students to think through and share with each other exactly what their beliefs were.

The key, of course, is that the stance we take on Christianity in class be distanced and yet respectful. While I may feel that I'm soaking in it, Christian students see themselves as an embattled minority in an increasingly secular society full of professors who belittle their beliefs in lectures on evolution and secular humanism. Beating up on my Christian students for their faith in the name of cultural relativism is simply not effective anthropology.

So while I have a gimlet eye for some of Christianity's more incongruous beliefs, I am also someone who actively participates in the life of a Christian faith community. I'm the guy who sings motets while everyone else takes communion -- engaged in participant-observation in the classically anthropological sense. This sense of being both insider and outsider helps, I believe, to reassure students that my analysis of Christianity is not meant to be a partisan exercise either for or against, but a demonstration of the power of social science to make taken-for-granted topics amenable to analysis.

The textbook in my "intro to anthro" class, for instance, has a chapter on the way symbolic action reinforces worldview through the use of compelling and culturally specific metaphors. It then takes examples of rituals from "other" cultures and demonstrates how these seemingly bizarre activities function, once you understand the metaphors at play within them. I, however, have given up teaching the Kwakiutl Cannibal Dance. Now I just teach communion -- my favorite Christian ritual after the procession on Palm Sunday. I begin by taking the belief of Christians that they can be made pure only through the cannibalistic consumption of their deity. How can we account for this belief?

I begin by having students explain what communion is to members of the class who are not familiar with it, and we pause to consider the fact that practices within Christianity vary greatly from one church to another. This is, literally, anthropology 101: Cultural traditions are not internally homogeneous, but demonstrate a wide variation in practice, ranging from Roman Catholics (who do believe themselves literally to be cannibals) to Lutherans (who hedge their bets with consubstantiation) to Mennonites (who may go their whole lives without taking communion).

Next, I begin slowly peeling away at the communion service, pointing out the metaphorical associations of consumption and identification which -- as in most Christian rituals -- derive their power by recreating in the here-and-now of the church an event from the there-and-then life of Jesus (I don't burden the students with technical terms like "metricalization of space" and "distal chronotope"). Students pick this up relatively quickly: In communion the priest is to Jesus at the Last Supper as the congregation is to the apostles, just as in the Palm Sunday processional, the priest becomes Jesus entering Jerusalem just as the congregation becomes the crowd welcoming his entrance into Jerusalem.

The original grounding event of the Last Supper thus becomes the source of a metaphorical identification. This is not the end of the matter, however, since the Last Supper itself is Jesus's own elaborate riff on the festival he was celebrating -- Passover. Passover itself is a here-and-now remembrance of the then-and-there event of the angel of death (creepily represented in The Ten Commandments by colored dry ice) passing over Hebrew houses marked with the blood of a lamb.

Having taken the students back to Exodus, we then begin working forward with the image of the blood of lambs, passing successively through pastoral imagery in the Hebrew Bible, prohibitions on the consumption of blood in Leviticus, and all the way forward again to the Roman Catholic liturgy in which God becomes not the shepherd who makes us to lie in green fields, but the lamb who takes away the sins of the world.

I typically wrap up by noting that these metaphors and identifications continue to circulate in our own culture and keep us "soaking" in Christianity. Recently, for instance, I've ended with the new Superman movie, which features a man sent by his omnipotent father to Earth who protects humanity using his supernatural powers, only to be defeated by The Adversary -- Kevin Spacey cum Lex Luthor on a gigantic island made of kryptonite -- but who rises again to triumph in glory and help Kate Bosworth quit smoking.

I think it is one of my better lectures of the semester. I can't take much credit for this fact, however, since the communion service is such a spectacularly well-designed ritual. And of course it's not like I figured all of this symbolism out -- it's a series of connections that Christian theologians have been quite articulate about.

Best of all, this discussion of communion leaves my students turned on to some of the central concerns of my discipline: What is the distinction between "the real" and "the cultural" in a situation where most Christians do not believe in transubstantiation and yet rely on tropes of incorporation (cannibalistically, when Jesus's body enters you, and communally, when this act of alimentation brings you into the "body" of the church) to give their rituals power? How do we characterize the awareness of a participant at a ritual event who "gets it" but may not be able to articulate the play of tropes they experience without their professor's help? What is the status of interpretive social science as a science if it consists (merely?) of re-presenting the knowledge of our informants in a new form?

And speaking of productive tropes -- what sort of metaphorical associations are invoked in my own lecture (a genre that in America at least has an unabashedly homiletic past) when I mobilize the then-and-now of communion in my role as a maverick Jew proselytizing for my own brand of human knowing to a classroom full of potential intellectual converts?

Years ago Gerald Graff argued that the best way to deal with the culture wars was to teach them. My own experience with Christianity, as atypical as it is, has led me to see the value of bringing it into the classroom and making it an issue with my students. Like a lot of worthwhile tasks, it's a tricky one. But I believe it's an important one and -- when done correctly -- fun as well. Above all, I think it is best to realize that when it comes to religion in our schools, the issue is not necessarily what is being taught, but how it is being approached. After all, America is a country where we are "soaking in it" -- even if we are not as fully immersed as some would like.


Alex Golub is an assistant professor of anthropology at the University of Hawaii at Manoa who blogs at Savage Minds. His last column was about joining the tenure track.

Don't Tell Me What I Said. I Know What I Meant

Talking and barriers to communication are not new topics for deans. We receive a steady stream of offers of help with this aspect of our administrative role, all of which recognize, implicitly or explicitly, the central role of communication between deans and others. I receive notices for workshops and offers from consultants on "dealing with difficult employees" and on "how to deal with angry faculty members" (perhaps offer them coffee and/or jawbreakers to calm/slow them down...). There is a cottage industry of providers of higher education management advice regarding faculty-administrative relations.

I am in my 12th year as a dean at an independent residential liberal arts college, with approximately 1,300 students and over 100 full-time faculty members. My college places a high value on "community" and on significant faculty investment in the institution. While faculty know that, in principle, they can do their work anywhere, institutional commitment (or institutional loyalty) vigorously competes with their disciplinary orientation. Faculty very quickly become embedded in courses, programs, and friendships that link them with other faculty, often outside their department, and integrate them more strongly into the institution.

The small scale of my type of institution cannot be overemphasized. We are face-to-face every day, whether in classrooms, in offices, in the snack bar, or the gym. I work in a multiuse building, with some (not all) administrative offices on the main floor, but faculty and classrooms on the floors above. There are always students, faculty, and staff in the corridors. This small scale means that I am but a phone call or e-mail or shout across the lawn from most faculty, and that I am very much "their" dean. That is to say, responsible for anything on faculty minds. Beyond the formal responsibilities of personnel issues, reviews, curriculum, and so on, my perceived responsibilities and authority can include virtually anything relevant to faculty (objections to the ban on dogs on campus, disputes over the use of the whirlpool in the fieldhouse, furor over certain trees being trimmed, the sense of insult created when the art department is not consulted on selecting the color of woodwork for a new building, etc.).

What are the day-to-day experiences like in a setting with the above characteristics?

Two fundamental principles of sociocultural anthropology play a role in the day-to-day experiences I have as dean.

The first can be found in the introductory chapters to most classic textbooks in sociocultural anthropology. Once referred to as the psychic unity of humankind, the more contemporary argument is, perhaps, along the following lines: Human beings are everywhere rational; yet particular rationalities are defined through and embedded in particular cultures. One of our principal goals as anthropologists is to discover the sense of some particular rationality, often one other than our own -- the sense that people's lives make to them. That is to say, I make the assumption that faculty behavior makes sense, although at times I have to work to discover the sense that it makes -- and, of course, vice versa!

The second principle is not quite as old, but it is well-embedded in our field nonetheless. For me, as a sociolinguist, the version I'm most comfortable with comes from the work of the early ethnomethodologists in their studies of conversational structure and conversational interaction.

Harold Garfinkel, for example, in his now-classic 1964 article, "Studies of the Routine Grounds of Everyday Activities," brings our attention to the taken-for-granted scenes of "everyday" social life. He focuses on the silent, unstated common background understandings that give everyday life its "sense" and comprehensibility. He writes: "…common understanding … consists … in the enforceable character of actions in compliance with the expectancies of everyday life as a morality. Common sense knowledge of the facts of social life for the members of the society is institutionalized knowledge of the real world."

For example:

    Bill: Are you going to John’s party tonight?
    Al: I can’t face Karen.  (former girlfriend, now with John)
    Bill: You can come late.  (Karen will leave early to go to work, avoiding awkward contact)
 
Unstated common background information is the source of the coherence that links these statements logically to each other and gives the event, the conversation, its coherence. And, importantly, the participants assume a coherence in the interaction; they expect each other to behave in sensible ways. It is this latter assumption of coherence that sends interlocutors "searching" for the sense that talk makes.

Both principles -- the psychic unity of humankind and the role of shared background information as a source of the coherence and efficacy of social interaction -- can help us understand when interactions don't work (when there are breaches) as well as when they do. Simply put, failing to take the other's perspective can distort one's understanding of their behavior and create misperceptions of the motivations behind that behavior. Similarly, conversations and interactions can have an illusory, and even misleading, coherence. The assumption of common background information can mask actual differences in those assumptions, including different objectives for the interaction itself, leading to several different realities experienced simultaneously and without recognition. Subsequent recognition of these differences is often exacerbated by the perception of the earlier "good" meeting: if it was a good meeting, and now something has happened inconsistent with that meeting, then someone can't be trusted!

So can faculty members and administrators communicate? Can these (sub)cultures be bridged? Are we forever limited by our respective positions within the institution to experience a different, unique “reality”? 

I am guided in my response to this question by a lecture I recently attended by Zali Gurevitch, a visiting anthropologist, discussing the Israeli-Palestinian conflict. In his analysis, he distinguished between thick borders, where multiple cultures can deal with each other only through opposition, and thin borders, which can construct a multiculture: neither X nor Y alone, but each part of both, or of something larger. I was inspired to think about differences within the college and the potential to integrate them into a larger coherent whole -- if persons would understand the points of view of others.

Here are two examples:

The tyranny of the techies. In an institutional planning committee meeting, participants were discussing the priority for a technology initiative. This led to expressions of frustration by faculty regarding available technology and frustration from the computer center staff about the lack of interest of many faculty in further technological innovations.

The technician complained that faculty don’t use the full potential of existing software. He argued that digital technology “is most powerful” in that it “changes (positively) the way we do things,” and complained that too many faculty simply wanted technology to do things in existing ways. For him, difficulties arise when a person is incompetent in the use of technology.

The faculty member argued that software and hardware often break down, keeping them from doing what they intend. "It's not reliable. We want it to work."

In these two comments, we can see very different approaches to computer technology. For the computer technician, faculty are forever disappointing the technology by not engaging with its potential. For the faculty member, technology is something we use and problems arise when the technology “breaks down.”

It wasn’t hard to get them on a different track by helping them understand each other’s point of view. And it was now possible to continue the conversation, enlightened by each other’s point of view and assumptions.

Why are we here? Staff are often ignored in higher education discussions of (faculty vs. administrative) "cultures," particularly middle-management, mid-level staff. It's assumed that for most, it's just a job, 8:00 a.m. to 4:00 p.m., with stipulated holidays, sick days, etc. In contrast, many faculty may continue to look at their work as a "calling" and a "way of life," although this might be dismissed as nonsense by some. However, the small scale of the institution brings nearly everyone into direct and intensive contact with the primary constituency on our campus -- the students -- whose conversations, aspirations, and relations with staff often help those staff at least understand, if not embrace, the sense of community and common purpose toward which we strive. The result is that a good proportion of the staff develops as much of a sense of "calling" as do some faculty -- although this may surprise those faculty.

A particularly dramatic piece of testimony came to me from a staff person who was reflecting on work at the college:

"Being here has led to a great deal of personal growth, and it is truly the first place I've worked where I feel that I've been able to be a whole person with a well-balanced life. [The college] has improved me and really given me the opportunity to change people's lives... including my own. Everyone, at some point in their life, should spend a decade or more at a place like this. It's hard to pinpoint what 'it' is that makes this place unique (and at times uniquely frustrating). Perhaps it's because we're trying to live up to an ideal ... rather than just run a business. We're measuring ourselves beyond and aspiring to things that can't be reflected totally on a balance sheet. That's unique."

All this suggests that although some of the subcultural differences may be real, they are not necessarily unbridgeable.  With some understanding that people inside the institution have perspectives, aspirations and perhaps even values that are shaped to an extent by the positions they occupy within the structure, it at least becomes possible to integrate those different perspectives into a larger whole and for each constituency to become informed by the perspectives of the others.


Misunderstandings that follow from seemingly coherent meetings, memos, and other events are bountiful in my world. One recent researcher on conversational misunderstandings has written:

“Whatever the trigger of a misunderstanding or the extent of our misunderstanding, misunderstandings constitute an ordinary feature of human communication. This amounts to relying on a model of communication in which understanding is not granted … communication is not a matter of replication or duplication of thoughts but rather, it entails a model of transformation and interpretation…”

I have learned never to be surprised by the level of creativity, transformation, and interpretation of events in which I have played a part and which I thought I understood. And what I rediscover time and time again is the need to find the shared assumptions that underlie conversation in my office.

Here are some examples:

1. Professors are not private investigators. The science division faculty provided the campus safety office with a list of students allowed after-hours entry to the building to work on laboratory projects. One day a student whose name was on that list complained to their biology advisor that a campus safety officer had forced them to leave the building after hours. The next day, the biology professor sent an e-mail to the campus safety director complaining about this -- a reasonable complaint. The campus safety director responded with an e-mail asking the professor to provide him with information on the time of the incident, where it occurred, and exactly what was said. In turn, the professor responded with a very angry e-mail, stating that he was insulted by this request, that it was not his job to do investigations, that this was not "a productive use of [his] time," and that the campus safety officer should simply do what he needed to ensure this didn't happen again.

[This example is, of course, exacerbated by the use of e-mail -- although the fact that it was all copied to me, piece by piece, raises a topic for some other paper on academic culture which one day I will write: that is, the need for an “audience” in person-to-person e-mails.]

I intervened, concerned at the vitriolic tone of the professor’s e-mail. I spoke with the director of security. There was no question that he wanted to correct his officer’s mistake but he needed more information. Based on his experience that students were sometimes reluctant to respond to his direct inquiries, and based on his understanding of the close faculty/student relationships, he thought the easiest and fastest way to get the information -- and get the problem solved -- would be for the faculty member to get the information from the student.

The director was not wrong-headed in his assumptions. He did not consider, however, that his request could be interpreted in other ways -- namely, as asking someone else to do what should be his own work. The misunderstanding could have been avoided by the director asking for the information himself, or by suggesting that the professor put the student in contact with the campus safety officer.

2. Advising? Oh, that's different. For several years, the college has been trying to improve the quality of advising. Here, as at most small schools, all advising is done by continuing faculty members. Upon enrollment, students are assigned an initial advisor based on area of academic interest; upon declaring a major (usually toward the end of the sophomore year), students switch to an advisor from their major department.

The director of academic advising is an associate dean of the college who hears regularly from students who are dissatisfied with their pre-major advising experience. "Too bureaucratic," "not really interested in me," and "not easy to talk with" represent the kinds of expressions of student dissatisfaction often heard. In many workshops, public admonishments, and pleas, the associate dean (and the dean) have asked the faculty to take this type of advising more seriously and to care about the full scope of a student's experience, in and out of the classroom, in residential life, etc. And in all these conversations, the deans (if not the faculty as well) have assumed that the principal term of the conversation, i.e., "advising," was defined the same way by all parties. However, a recent conversation between the dean and a faculty member may suggest otherwise:

Professor: I don’t feel comfortable going into all that with advisees. I’m just not equipped to deal with the kinds of problems that’ll come forward.

Dean: But this undermines exactly one of the reasons that the small school experience is attractive to folks. No one’s asking you to be a professional counselor, but you’ve got to know how to refer students to a counselor if they’re needed. No one’s asking you to adopt anyone as a ward!

Professor: It just doesn’t make sense to me. I’m not equipped to do this. It makes me uncomfortable. I’ll help with course registration but I can’t do much more than that.

Dean: But we’re not simply in the business of signing off on course registrations and calling that good mentoring.

Professor: Oh, with mentoring it's something different. I'm really comfortable talking about just about anything with the students I mentor. I know about their lives, more than I should, and I don't mind giving advice. But my advisees … well, I just help them register for courses.

The remarks of this professor suggested something new -- that she viewed mentoring as something other than advising.  This caused the dean to make similar inquiries of other faculty and to discover that this was a widely held view.

I don’t mean to suggest that cognitive anthropology will be significantly enriched by the discovery of the structure of this segment of someone’s semantic domain; but it certainly holds the potential for re-orienting the discussion of advising in terms that all parties will understand.

3. But the Dean said…. On a more serious note, there is nothing more difficult than finding yourself in a situation where you are depicted as having reneged on or withdrawn support, or appear to have done so, after faculty have gone to great effort to work on a project. In these cases, what I have learned is that the key conversations in my office are often informed by background assumptions and definitions of the situation that are not shared; further, I have learned that not making those different assumptions and definitions explicit leads -- guaranteed -- to misunderstanding and disaster.

Here again, what is involved is understanding the role that common implicit understandings have on the interaction itself. In a small, face-to-face, and traditionally governed campus such as ours, faculty culture includes a strong notion about the dean’s “approval” of a project. The academy is a world of ideas, many of them wonderful. Faculty come to me with ideas -- for special projects, initiatives, department programs, new emphases. In many cases, they want my response before proceeding. I have learned through examples too painful to describe here that I must take great care to distinguish my view that a project is “interesting” from what can be taken as a formal approval, to make clear what is worth talking about more and what has a “green light” to proceed.

My focus here has not been the great achievements and goals on which decanal leadership, if not reputation, is often based. It's not that I don't have these goals (or haven't actually realized some of them). But at least at a small college -- or my little village, to be precise -- much of what I do involves talking, listening, facilitating, intervening, coordinating: all processes that require constant attention to both the multiple perspectives of different constituencies and the different, often tacit, assumptions they bring to their work and, especially, their interactions with each other. The ethnographic life is clearly not limited to field sites.

To be honest, it is easier to reflect upon all this than it is to live it.

On some days, I'll confess, it's not clear whether I am part of the problem or part of the solution. On those days, I try to spend as much time as possible stapling and collating -- and stay out of harm's way.



Lawrence B. Breitborde is dean of the college and professor of anthropology at Knox College. This essay is adapted from a talk he gave at the 2006 annual meeting of the American Anthropological Association. The examples in this essay are composites, reflecting a range of experiences at his campus and elsewhere, and are designed to shield the identities of those involved.

Be Aware (Beware)

Not all Islamophobes are fanatics. Most, on the contrary, are decent people who just want to live in peace. Islamophobia forms only part of their identity. They grew up fearing Islam, and they still worry about it from time to time, especially during holidays and on certain anniversaries; but many would confess to doubt about just how Islamophobic they feel deep down inside. They may find themselves wondering, for example, if the Koran is really that much more bloodthirsty than the Jewish scriptures (Joshua 6 is plenty murderous) or the Christian (Matthew 10:34 is not exactly comforting).

Unfortunately a handful of troublemakers thrive among them, parasitically. They spew out hatred through Web sites. They seek to silence their critics, and to recruit impressionable young people. Perhaps it is unfair to confuse matters by calling the moderates and the militants by the same name. It would be more fitting to say that the latter are really Islamophobofascists.

Some might find the expression offensive. That is too bad. If we don’t resist Islamophobofascism now, its intolerance can only spread. And we all know who benefits from that. One name in particular comes to mind. It belongs to a fellow who is now presumably living in a cave, drawing up long-term plans for a clash of civilizations.....

Maybe I had better trim the satirical sails before going totally out to sea. As neologisms go, “Islamophobofascism” probably sounds even more stupid than the term it mocks. But there is a point to it.

“Islamofascism” is a noxious and counterproductive term -- a bludgeon disguised as an idea. Its use comes at a cost, even beyond the obvious one that goes with making people dumber. “Islamofascism” is the preferred term of those who don’t see any distinction between Al Qaeda, the Iranian mullahs, and the Baathists. Guess what? They are different, which might just have been worth understanding a few years ago. (Better late than never, maybe; but not a whole lot better.)

The more serious consequence, over the long term, is that of offering deliberate insult to those Muslims who would be put to the sword under the reign of Jihadi fundamentalists. Disgust for cheap stunts done in the name of “Islamofascism awareness” is not a matter of doubting that the jihadis mean what they say. On the contrary, it goes with taking them seriously as enemies.

It should not be necessary to qualify that last point. Somebody who wants to kill you is your enemy, whether you care to think in such terms or not; and the followers of Bin Laden, while subtle on some matters, have at least not been shy about letting us know what methods they consider permissible in pursuit of their ends. The jihadis mean it. Recognizing this is not a matter of Islamophobia; it is a matter of paying attention.

And paying attention means, in this case, recognizing that most Muslims are not our enemies. It is disgraceful to have to spell that out. But let’s be clear about something: The jihadis are not our only problem. As anyone from abroad who likes and respects Americans will probably tell you, we tend to be our own worst enemy.

There is a strain of nativism, xenophobia, and small-mindedness in American life that is always there -- often subdued, but never too far out of earshot. To call this our fascist streak would be absurdly melodramatic. Fascism proper was, above all, purposeful and orderly, while fear and loathing towards the “un-American” is often enough the woolliest form of baffled resentment: the effect of comfortable ignorance turning sour at any demand on its meager resources of attention and sympathy.

This quality can subsist for long periods in a dormant or distracted state -- expressing itself in muttering or small-scale acts of hostility, but nothing large-scale. Perhaps it is restrained by the better angels of our nature.

But it means that the unscrupulous and the obtuse have a ready supply of raw material to mold into something vile when the occasion becomes available, or if there is some profit in it. H.L. Mencken explained that a demagogue is “one who will preach doctrines he knows to be untrue to men he knows to be idiots." The problem with this definition, of course, is that it is the product of a simpler era and so not nearly cynical enough. For a demagogue now, truth and knowledge have nothing to do with it.

For the really suave expression of Islamophobofascism, however, no local sideshow can compete with an interview that the British novelist Martin Amis gave last year. At the highest stages of cosmopolitan literary influence, it seems, one may express ideas worthy of a manic loon phoning a radio talk-show and get them published in the London Times.

“There’s a definite urge -- don’t you have it? -- to say, ‘The Muslim community will have to suffer until it gets its house in order,’ ” Amis said. “What sort of suffering? Not letting them travel. Deportation -- further down the road. Curtailing of freedoms. Strip-searching people who look like they’re from the Middle East or from Pakistan.… Discriminatory stuff, until it hurts the whole community and they start getting tough with their children.”

The cultural theorist Terry Eagleton issued a response to Amis in the preface to a new edition of his book “Ideology: An Introduction” -- first published in 1991 by Verso, which reissued it a few weeks ago. It stirred up a tiny tempest in the British press, which reduced the argument to the dimensions of a clash between two “bad boys” (albeit ones grown quite long in the tooth).

Quickly mounting to impressive heights of inanity, the coverage and commentary managed somehow to ignore the actual substance of the dispute: what Amis said (his explicit call to persecute all Muslims until they acted right) and how Eagleton responded.

“Joseph Stalin seems not to be Amis’s favorite historical character,” wrote Eagleton, alluding to the novelist’s Koba the Dread, a venture into Soviet political history published a while back. “Yet there is a good dose of Stalinism in the current right-wing notion that a spot of rough stuff may be justified by the end in view. Not just roughing up actual or intending criminals, mind, but the calculated harassment of a whole population. Amis is not recommending such tactics for criminals or suspects only; he is recommending them as a way of humiliating and insulting certain kinds of men and women at random, so they will return home and teach their children to be nice to the White Man. There seems to be something mildly defective about this logic.”

Eagleton's introduction doesn't underestimate the virulence of the jihadists. But his remarks do at least have the good sense to acknowledge that humiliation is a weapon that will not work in the long run. (As an aside, let me note that some of us don't have the luxury of either ignoring terrorism or regarding it as something that will be abated by a more aggressive posture in the world. Life in Washington, D.C., for the past several years has meant rarely getting on the subway without wondering if this might be the day. The "surge" did not reduce the faint background radiation of dread one little bit. Funny how these things work out, or don't.)

Anybody with an ounce of brains and responsibility can tell that fostering an environment of hysteria is useful only to one side of this conflict. "The best way to preserve one's values," writes Eagleton, "is to practice them." Well said; and worth keeping in mind whenever the Islamophobofascists start to rush about, trying to drum up some business.

We shouldn't regard them as just nuisances. They are something much more dangerous. Determined to turn the whole world against us, they act as sleeper cells of malice and stupidity. There are sober ways to respond to danger, and insane ways. It is the demagogue’s stock in trade to blur the distinction.

Scott McLemee

It's All Geek to Me

We have, by contemporary standards, a mixed marriage, for I am a nerd, while my wife is a geek. A good thing no kids are involved; we’d argue about how to raise them.

As a nerd, my bias is towards paper-and-ink books, and while I do indeed use information technology, asking a coherent question about how any of it works is evidently beyond me. A geek, by contrast, knows source code....has strong opinions about source code....can talk to other geeks about source code, and at some length. (One imagines them doing so via high-pitched clicking noises.) My wife understands network protocols. I think that Network Protocols would be a pretty good name for a retro-‘90s dance band.

This is more than a matter of temperament. It is a cultural difference that makes a difference. The nerd/geek divide manifested itself at the recent meeting of the Association of American University Presses, for example. Most people in scholarly publishing are nerds. But they feel like people now want them to become geeks, and this is not an expectation likely to yield happiness.

Christopher M. Kelty’s Two Bits: The Cultural Significance of Free Software, just published in dead-tree format by Duke University Press, might help foster understanding between the tribes. The book itself is available for free online. (The author also contributes to the popular academic group-blog Savage Minds.)

Kelty, an assistant professor of anthropology at Rice University, has done years of fieldwork among geeks, but Two Bits is not really a work of ethnography. Instead of describing geek life at the level of everyday experience or identity-shaping rituals, Kelty digs into the history and broader implications of one core element of geek identity and activity: the question of “open source” or “free” software. Those terms are loaded, and not quite equivalent, even if the nuance tends to be lost on outsiders. At issue, in either case, is not just the availability to users of particular programs, but full access to their inner workings – so that geeks can tinker, experiment, and invent new uses.

The expression “Free Software,” as Kelty capitalizes it, has overtones of a social movement, for which openness and transparency are values that can be embedded in technology itself, and then spread throughout institutions that use it. By contrast, the slightly older usage “open source” tends to be used when the element of openness is seen as a “development methodology” that is pragmatically useful without necessarily having major consequences. Both terms have been around since 1998. The fact that they are identical in reference yet point to a substantial difference of perspective is important. “It was in 1998-99,” writes Kelty, “that geeks came to recognize that they were all doing the same thing and, almost immediately, to argue about it.”

Much of Two Bits is devoted to accounts of how such arguments unfolded amidst the development of particular digital projects, with the author as a participant observer in one of them, Connexions (an online resource for the collaborative production of textbooks and curricular materials, previously discussed here). A merely nerdish reader may find some of this tough going. But the upshot of Two Bits is that geekery has constituted itself – through freeware, or whatever you want to call it – as what Kelty calls a “recursive” public sphere, with important implications for cultural life outside its borders.

Any strong notion of a public sphere is going to see the public as, in Kelty’s words, “a collective that asserts itself as a check on other constituted forms of power – like states, the church, and corporations – but which remains independent of those domains of power.”

The hard question, most of the time, is whether or not such a public actually exists. The journalist and social thinker Walter Lippmann considered the health of the public in three books he wrote during the 1920s, each volume gloomier than the last. And when Jurgen Habermas revisited the concept in the early 1960s, he concluded that the public sphere as a space of debate and rational deliberation had been at its most robust in the 18th century. More recently, Americans have made a hit out of a game show called "Are You Smarter Than a Fifth Grader?" in which adult contestants routinely prove that they are not, in fact, smarter than a fifth grader. All things considered, the idea of the public as a force that "asserts itself as a check on other constituted forms of power .... but which remains independent of those domains of power" does not seem to have much traction.

But geekdom (in Kelty’s analysis anyway) fosters a much more engaged ethos than that associated with earlier forms of mass media. This is not simply a matter of the well-known penchant for libertarianism in the tech world, about which there is probably not much new worth saying. (If consenting adults want to talk about Ayn Rand, that’s OK as long as I don’t have to listen.) Rather, the whole process of creating and distributing free software is itself, to borrow a programming term, recursive.

Per the OED, recursivity involves “a repeated procedure such that the required result at each step except the last is given in terms of the result(s) of the next step, until ... a terminus is reached with an outright evaluation of the result.”
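To make that definition concrete for my fellow nerds, here is a toy example of my own (in Python; it is not from Kelty's book): the factorial function, defined so that each step's result is given in terms of the result of the next step down, until a terminus is reached.

    def factorial(n: int) -> int:
        # This step's result is given in terms of the result of the next
        # step: factorial(5) needs factorial(4), which needs factorial(3)...
        if n == 0:
            # ...until a terminus is reached with an outright evaluation.
            return 1
        return n * factorial(n - 1)

    print(factorial(5))  # prints 120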

Something like that dynamic -- the combination of forward motion, regressive processing, and cumulative feedback -- is found in geekdom's approach to collaboration and evaluation. The discussions involved are not purely technical, but involve arguments over questions of transparency and the ethical implications of software.

“A recursive public,” writes Kelty, “is a public that is vitally concerned with the material and practical maintenance and modification of the technical, legal, practical, and conceptual means of its own existence as a public; it is a collective independent of other forms of constituted power and is capable of speaking to existing forms of power through the production of actually existing alternatives.” (Those alternatives take the form of technology that the rest of us use, whether we understand it or not.)

Two Bits is an effort to analyze the source code, so to speak, of geekdom itself. How the larger culture interacts with it, and is shaped by it, is a subject for another study. Or for quite a few of them, rather, in due course. For now, I think Kelty’s book deserves a wide readership -- especially among nerds trying to make sense of the past decade, let alone to prepare for the next one.

Scott McLemee

Fear and Humiliation as Legitimate Teaching Methods

A psychoanalytically inclined friend of mine once told me that you can tell the important dreams not because you know what they mean, but because you can't get them out of your head. As an anthropologist I've noticed something similar about ethnographic fieldwork: You live through moments that immediately seem important to you, but it is only after chewing them over that you realize why. I had one such moment recently that taught me, deep down, that I firmly believe in the power of fear and humiliation as teaching methods. This insight came to me late last month in the course of having my ass kicked repeatedly by Kael'thas Sunstrider, son of Anasterian, prince of Quel'Thalas, and servant of Kil'jaeden the Deceiver.

This high valuation of fear and humiliation is not the sort of thing that you hear at the pep talks organized at your campus teaching and learning center. Perhaps this is not surprising given the non-traditional subject which provoked it. I study people who play World of Warcraft. Warcraft is one of the world's most popular videogames, home to over 10 million people who enter its high-fantasy world to become murloc-slaying gnomes and felsteel-smelting blacksmiths.

As players slay monsters and explore dungeons, their characters progress, become more powerful, and accumulate an inventory of ever more powerful gear. There are lots of things you can do in-game, from player-versus-player battlefields reminiscent of arcade shoot 'em ups to obsessive hoarding of gold earned by, for instance, picking rare herbs and selling them to other players.

People play Warcraft for many reasons, but the guild that I am studying plays it to raid. Four times a week we get a posse of 25 people together to spend four hours exploring the most inaccessible, difficult dungeons in the game, finding computer-controlled "bosses" of ever-increasing difficulty, and slaying them. Of all of the things to do in World of Warcraft, raiding is the hardest and most intense. It requires powerful characters and careful planning. Of the 10 million people who play Warcraft, 9 million have never even set foot inside the places we have been, much less kicked the ass of the bad guys that we found there. We have a Web site, we have headsets, and we are serious. I don't study "the video game as genre." I study the way American cultures of teamwork and achievement shape online interaction. As an observer my mind boggles at the 20-80 hours my guildies spend in-game every week. As a participant I'm super proud of our accomplishments.

Enough exposition. In late September our target was Kael'thas Sunstrider, the blood elf prince who broods in the floating Naaru citadel of Tempest Keep. The fight against Kael is legendary for its intricacy: First the raid must defeat each of his four advisors in turn. Then his arsenal of magic weapons must be overcome and turned against the advisors, whom Kael resurrects. Finally the raid has the opportunity to fight Kael and his pet phoenix. In the final stage of the fight, the raid must struggle to down Kael as he removes the gravity from the room and leaves the raid hanging, literally, in mid-air. Whole guilds have broken up in rancorous self-hatred after struggling unsuccessfully to down him.

Recently we tried to get some help by inviting to our raid members of another guild, which had already downed Kael. Almost immediately I could see why its members were successful -- their raid leader did not pull his punches. In the middle of a fight I would hear him saying things like "Xibby, don't think I don't see you healing melee -- please do your job and focus on the tank." At times -- like when our Paladin failed repeatedly to engage Thaladred the Darkener, who responded by repeatedly blowing up our warlocks -- voices were raised.

I was impressed by their professionalism, their commitment to high standards, and their leader's willingness to call people out when they made mistakes, but most of my guildmates didn't feel that way when we talked it over afterward in guild chat.

"i’m sorry but my husband dosen’t curse at me and no guy on wow will either" said Darkembrace, a shadowpriest who was also a stay-at-home mom in Virginia with a 3 year old daughter and a 75 pound rottweiler in the IM discussion.

"yeah," said our 18 year old tree druid Algernon, summing up the mood succinctly. "fuk them please never invite them back lol"

That raid passed into the guild's collective memory without further ado but, like an important dream, it kept running through my head. I had always known that raiding is a form of learning. It takes weeks of time and dozens of deaths before a guild-first boss kill, and even more time until a boss is so routinely killable that he is, as we say, “on farm.” But it wasn't until those Kael attempts that I realized just how similar raiding and teaching are.

A 25-person raid is the same size as a class, and like a class its leader can only take it to places that it is willing to go. Teaching, like learning to down a boss, is about helping people grow their comfort zone by getting them to spend time outside of it. The question is how to push people so that they will be ready to learn, instead of ready to tear their hair out.

Raiding has taught me that being a good teacher requires laying down strict guidelines while simultaneously demonstrating real care for your students. The stronger the ties of trust and respect between teacher and student, the more weight they will bear. In the past I've cringed when my raid leaders cheerfully announced that we would spend the next four hours dying over, and over, and over again to a boss who seemed impossible to defeat. But I've trusted them, done my job, and ultimately we have triumphed because they insisted on perseverance. The visiting raid leader who took us through the Kael raid lacked that history with us -- he was too much of a stranger to ask us to dig deep and give big.

A willingness to take risks can also be shored up by commitment and drive. Our guest leader drove my guildies nuts, but impressed me with his professionalism. Does this mean that after graduate school even generous doses of sadism seem unremarkable? Perhaps. But it also indicates that I was willing to work hard to see Kael dead, even if it meant catching some flak. For my guildies, it was a game, and when it stopped being fun they lost interest.

What I learned that night was that I believe in the power of fear and humiliation as teaching methods. Obviously, I don't think they are teaching methods that should be used often, or be at the heart of our pedagogy. But I do think that there are occasions when it is appropriate to let people know that there is no safety net. There are times -- not all the time, or most of the time, but occasionally and inevitably -- when you have to tell people to shut up and do their job. I’m not happy to discover that I believe this, and in some ways I wish I didn’t. But Warcraft has taught me that there is a place for "sink or swim" methods in teaching.

We never did get Kael down. Shortly after our shared guild run, the powers that rule the World of Warcraft decided that the Kael fight was too hard and "nerfed" it -- made him lighter, fluffier, and easier to kill. We’re headed back in on Thursday, but our victory now seems as hollow as it will be inevitable. My guildies will take the nerf and love it, because burning down a boss that used to wipe them out will make them feel like gods. To me it will be a disappointment, because their pleasure in victory will be proof that we were never willing to do what we had to in order to become the kind of people who didn’t need the nerf.

Teaching is about empowering students, and Warcraft has taught me that there is a difference between being powerful and feeling powerful. We had a chance to grow as a guild, but in the end we just couldn't hack it. In the course of all this I learned that I am a person who believes that there are some things in life too important for us to give up just because achieving them might make us uncomfortable.

Anthropologists love to tell stories of their emotional communion with the people they study. This story ends on a darker note, because what I learned from my attempts to kill Kael'thas Sunstrider was that I was not the same kind of person as my guildies -- a fact made even more disconcerting by the fact that we are supposed to be members of the "same" culture. My fieldwork has not taught me to find commonality across cultures, but to see diversity within my own. Playing Warcraft has taught me that I have a dark side when it comes to pedagogy which I wish I didn't have -- I’ve realized that a seam of commitment that surfaced in one place in my biography lies hidden in another. Does this mean my guildies need to care more, or that I need to learn to care less? It’s a question that I try not to ask, because I’m afraid I might not like the answer.

Author/s: 
Alex Golub

Alex Golub is an assistant professor of anthropology at the University of Hawaii at Manoa who blogs at Savage Minds.

The Relevance of the Humanities

The deepening economic crisis has triggered a new wave of budget cuts and hiring freezes at America’s universities. Retrenchment is today’s watchword. For scholars in the humanities, arts and social sciences, the economic downturn will only exacerbate existing funding shortages. Even in more prosperous times, funding for such research has been scaled back and scholars besieged by questions concerning the relevance of their enterprise, whether measured by social impact, economic value or other sometimes misapplied benchmarks of utility.

Public funding gravitates towards scientific and medical research, with its more readily appreciated and easily discerned social benefits. In Britain, the fiscal plight of the arts and humanities is so dire that the Institute of Ideas recently sponsored a debate at King’s College London that directly addressed the question, “Do the arts have to re-brand themselves as useful to justify public money?”

In addition to decrying the rising tide of philistinism, some scholars might also be tempted to agree with Stanley Fish, who infamously asserted that the humanities “cannot be justified except in relation to the pleasure they give to those who enjoy them.” Fish rejected the notion that the humanities can be validated by some standard external to them. He dismissed as wrong-headed “measures like increased economic productivity, or the fashioning of an informed citizenry, or the sharpening of moral perception, or the lessening of prejudice and discrimination.”

There is little doubt that the value of the humanities and social sciences far outstrips any simple measurement. As universities and national funding bodies face painful financial decisions and are forced to prioritize the allocation of scarce resources, however, scholars must guard against complacency. Instead, I argue, scholars in the social sciences, arts, and humanities should consider seriously how the often underestimated value of their teaching and research could be further justified to the wider public through substantive contributions to today’s most pressing policy questions.

This present moment is a propitious one for reconsidering the function of academic scholarship in public life. The election of a new president brings with it an unprecedented opportunity for scholars in the humanities and social sciences. The meltdown of the financial markets has focused public attention on additional challenges of massive proportions, including the fading of American primacy and the swift rise of a polycentric world.

Confronting the palpable prospect of American decline will demand contributions from all sectors of society, including the universities, the nation’s greatest untapped resource. According to the Times Higher Education Supplement’s recently released rankings, the U.S. boasts 13 of the world’s top 20 universities, and 36 U.S. institutions figure in the global top 100. How can scholars in the arts, humanities and social sciences make a difference at this crucial historical juncture? How can they demonstrate the public benefits of their specialist research and accumulated learning?

A report published by the British Academy in September contains some valuable guidance. It argues that collaboration between government and university researchers in the social sciences and humanities must be bolstered. The report, “Punching Our Weight: the Humanities and Social Sciences in Public Policy Making,” emphasizes how expanded contact between government and humanities and social science researchers could improve the effectiveness of public programs. It recommends “incentivizing high quality public policy engagement.” It suggests that universities and public funding bodies should “encourage, assess and reward” scholars who interact with government. The British Academy study further hints that university promotion criteria, funding priorities, and even research agendas should be driven, at least in part, by the major challenges facing government.

The British Academy report acknowledges that “there is a risk that pressure to develop simplistic measures will eventually lead to harmful distortions in the quality of research,” but contends that the potential benefits outweigh the risks.

The report mentions several specific areas where researchers in the social sciences and humanities can improve policy design, implementation, and assessment. These include the social and economic challenges posed by globalization; innovative comprehensive measurements of human well-being; understanding and predicting human behavior; overcoming barriers to cross-cultural communication; and historical perspectives on contemporary policy problems.

The British Academy report offers insights that the U.S. government and American scholars could appropriate. It is not farfetched to imagine government-university collaboration on a wide range of crucial issues, including public transport infrastructure, early childhood education, green design, civil war mediation, food security, ethnic strife, poverty alleviation, city planning, and immigration reform. A broader national conversation to address the underlying causes of the present crisis is sorely needed. By putting their well-honed powers of perception and analysis at the service of the public interest, scholars can demonstrate that learning and research deserve the public funding and esteem that have been waning in recent decades.

The active collaboration of scholars with government will be anathema to those who conceive of the university as a bulwark against the ever encroaching, nefarious influence of the state. The call for expanded university-government collaboration may provoke distasteful memories of the enlistment of academe in the service of the Cold War and the Vietnam War, a relationship which produced unedifying intellectual output and dreadfully compromised scholarship.

To some degree, then, skepticism toward the sort of government-university collaboration advocated here is fully warranted by the specter of the past. Moreover, the few recent efforts by the federal government to engage with researchers in the social sciences and humanities have not exactly inspired confidence.

The Pentagon’s newly launched Minerva Initiative, to say nothing of the Army’s much-criticized Human Terrain System, has generated a storm of controversy, mainly among researchers who fear that scholarship will be placed in the service of war and counterinsurgency in Iraq and Afghanistan and will be ideologically distorted as a result.

Certainly, the Minerva Initiative’s areas of funded research -- “Chinese military and technology studies, Iraqi and Terrorist perspective projects, religious and ideological studies," according to its Web site -- raise red flags for many university-based researchers. Yet I would argue that frustration with the Bush administration and its policies must not preclude a dispassionate analysis of the Minerva Initiative or block recognition of its enormous potential for fostering and deepening links between university research and public policy communities. The baby should not be thrown out with the bathwater. The Minerva Initiative, in much-reformed form, represents a model upon which future university-government interaction might be built.

Cooperation between scholars in the social sciences and humanities and all of the government’s departments should be enhanced by expanding the channels of communication among them. The challenge is to establish a framework for engagement that poses a reduced threat to research ethics, eliminates selection bias in the applicant pool for funding, and maintains high scholarly standards. Were these barriers to effective collaboration overcome, it would be exhilarating to contemplate the proliferation of “Minerva Initiatives” in various departments of the executive branch. Wouldn’t government policies and services -- in areas as different as environmental degradation, foreign aid effectiveness, health care delivery, math and science achievement in secondary schools, and drug policy -- improve dramatically were they able to harness the sharpest minds and cutting-edge research that America’s universities have to offer?

What concrete forms could such university-government collaboration assume? Several immediate steps suggest themselves. First, it is important to build on existing robust linkages. The State Department and DoD already have policy planning teams that engage with scholars and academic scholarship. Expanding both the budgets and the scope of these offices could produce immediate benefits.

Second, the departments of the executive branch of the federal government, especially Health and Human Services, Education, Interior, Homeland Security, and Labor, should devise ways of harnessing academic research on the Minerva Initiative model. There must be a clear assessment of where research can lead to the production of more effective policies. Special care must be taken to ensure that scholarly standards are not compromised.

Third, universities, especially public universities, should incentivize academic engagement with pressing federal initiatives. It is reasonable to envision promotion criteria modified to reward such interaction, whether it takes the form of placements in federal agencies or the production of policy-relevant, though still rigorous, scholarship. Fourth, the presidents of all institutions need to renew the perennial debate concerning the purpose of higher education in American public life. Curricula and institutional missions may need to align more closely with national priorities than they do today.

The public’s commitment to scholarship, with its robust tradition of analysis and investigation, must extend well beyond the short-term needs of the economy or exigencies imposed by military entanglements. Academic research and teaching in the humanities, arts and social sciences play a crucial role in sustaining a culture of open, informed debate that buttresses American democracy. The many-stranded national crisis, however, offers a golden opportunity for broad, meaningful civic engagement by America’s scholars and university teachers. The public benefits of engaging in the policy-making process are, potentially, vast.

Greater university-government cooperation could reaffirm and make visible the public importance of research in the humanities, arts and social sciences.

Not all academic disciplines lend themselves to such public engagement. It is hard to imagine scholars in comparative literature or art history participating with great frequency in such initiatives.

But for those scholars whose work can shed light on, and contribute to the solution of, the massive public conundrums that the nation faces, the opportunity afforded by the election of a new president should not be squandered. Standing aloof is a luxury universities cannot currently afford; the present conjuncture requires enhanced public engagement, and the stakes are too high to stand aside.

Author/s: 
Gabriel Paquette

Gabriel Paquette is a lecturer in the history department at Harvard University.

The Hope of Audacity

I am sick of reading about Malcolm Gladwell’s hair.

Sure, The New Yorker writer has funny hair. It has been big. Very big. It is audacious hair, hair that dares you not to notice it; hair that has been mentioned in far too many reviews. Malcolm Gladwell’s hair is its own thing.

Which is only appropriate, since in his writing, Gladwell has always gone his own way. But he’s been doing it long enough, and so well, and has made so much money, that some folks feel it’s time to trim him down to size. That hair is now seen as uppity.

Gladwell is a mere journalist. He’s not shy, and like many children of academics, he is not intimidated by eggheads. He does none of his own primary research, and instead scours academic journals to find interesting ideas -- he collects experiments and experimenters. He is a translator and a synthesizer, and comes up with catchy, sprightly titled theories to explain what he has seen. Some have called him a parasite. He has called himself a parasite.

It seems to me there’s always been a bit of snarkiness attached to discussions of Gladwell’s work. This is often the case for books that have become commercially successful, which is something that seems particularly to stick in the collective academic craw. There is a weird hostility in the reviews of Gladwell’s books that is directed not at the big-haired guy himself who, like a puppy, nips at the heels of academics and then relishes the opportunity to render their work into fluid, transparent prose, but toward those many people who have made Gladwell famous: his readers. No one matches the caustic condescension of Richard Posner, who said, in a review of Gladwell’s Blink, that “it’s a book for people who don’t read books.”

The reviews of Outliers, Gladwell’s latest book, show that even a New Yorker writer can go too far. People are now attacking Malcolm Gladwell as a kind of brand. The critiques boil down to a few things, one of which is that he doesn’t take into account evidence that refutes his theories. In other words, he’s not doing careful scholarship. But we all know that even careful scholarship is a game of picking and choosing -- it just includes more footnotes acknowledging this. And Gladwell never pretends to be doing scholarship.

Gladwell is also accused of being too entertaining. He takes creaky academic work and breathes Frankensteinian life into it. He weaves anecdotes together, creating a tapestry that builds to an argument that seems convincing. This, some reviewers have claimed, is like perpetrating a fraud on the (non-academic) reading public: because Gladwell makes it so much fun to follow him on his intellectual journey, he’s going to convince people of things that aren’t provably, academically true. He will lull the hoi polloi into thinking they’re reading something serious.

Which is, of course, the most common complaint about Gladwell: He’s not serious enough. He’s having too much fun playing with his ideas. And, really, you can’t be Serious when you’re raking in so much coin. Anyone who gets paid four million bucks for a book that mines academic work -- and not necessarily the stuff that is agreed to be Important -- is going to become a target. His speaking fees are beyond the budgets of most colleges. In this way, his career is now similar to that of David Sedaris, who can command an impressive audience and still be dissed by the literary folks. Everyone who’s anyone knows that you can’t sell a lot of books and be a serious writer. Just ask Jonathan Franzen. Or Toni Morrison.

I don’t see Gladwell as a social scientist manqué, or a philosopher wannabe. Instead, I read him as an essayist. I think of his books as well-written, research-packed, extended essays. Let me show you the evils of imperialism by telling you a story about the time in Burma when I was forced to shoot an elephant. Let’s look at this (bad) academic prose and think about the relationship between politics and the English language. But instead of using his own experiences, he builds on work done by others. He uses a wry, quirky approach and blithely ignores the received wisdom and pieties of academe. He doesn’t seek out the researcher who’s highly regarded within her field; he looks for people who are doing things he finds interesting.

Gladwell reminds me of the kind of student I knew in college, the nerd who takes weird and arcane courses and then rushes from the lecture hall excited about some idea the professor has mentioned in passing and goes straight to the library to pursue it himself. He stays up all night talking about it, and convincing you that even though you were in the same class, and heard the same reference, you have somehow missed something. Maybe not something big, but at least something really, really cool.

Perhaps I have more trust in readers than to believe that they can be so easily bought off by a good story. And I wish that academics, instead of pillorying Gladwell for being good at translating complicated ideas, would study the way he does it and apply some portion of his method to their own work: He makes mini trade books of monographs. Surely this is a lesson worth learning. He uses the narrative art of the magazine writer to animate ideas. He profiles theories the way Gay Talese or Joan Didion did celebrities.

The audacity Gladwell shows in his writing, connecting seemingly disparate things and working hard, yet with apparent effortlessness, to make the ideas engaging, gives me hope for the future of books. It makes me feel better to see folks buying Gladwell rather than the swimmer Michael Phelps’s memoir or vampire novels -- not that there’s anything wrong with that. Yet this same audacity is what gets Gladwell into hot water with academics. He’s not supposed to do this.

Unless you are an aged physicist, you don’t really get to write books that “purport to explain the world.” You can, of course, try to explicate tiny portions of it. Science writers like James Gleick and Jonathan Weiner can go a lot further than most scientists in terms of making arcane principles understandable to the Joe the Plumbers of the reading world, and no one gets bent out of shape. Perhaps it’s because of the assumption that scientists, with a few notable (often British) exceptions, are not supposed to be able to write books that normal people can read. Social scientists and historians are, however, expected to know what is interesting and important about their work and present it to the public. Brand-name thinkers like Susan Sontag and Martha Nussbaum can take on big ideas. But these people are experts; journalists shouldn’t try this at home.

What I love about Gladwell is that his writing is like his hair. You can see it as arrogant or scary (he writes about being stopped more frequently by cops when he had a big afro), or you can see it as playful and audacious. This is why, of course, so many reviews mention it; he has the right hair for his work.

One final, dour complaint about Gladwell has to do with his relentless cheeriness. He thinks that people are basically good, though he understands that sometimes circumstances aren’t. I can’t abide high-brow literary novelists who trash fiction that “cops out” with a happy ending. Maybe I’m hopelessly low-brow: I still love Jane Austen and Shakespeare’s comedies. The academic response to most things is generally: it’s more complicated than that. And sure, much of the time it is. But if something’s artfully crafted, I’m willing to cut the author some slack. I don’t ever expect to be thoroughly persuaded of anything; I’m characterologically skeptical and like to do the thinking on my own. Gladwell’s books invite me into a conversation. I think that’s part of the job of a good book.

For me, reading Malcolm Gladwell’s books is like watching Frank Capra movies. Just because they make you feel good and keep you entertained doesn’t mean that they’re not doing valuable work or tackling hard and real issues and ideas. Sure, someone else could have handled it differently. George Bailey might have finally committed suicide; the bank in Bedford Falls could have asked for a government bailout. But right now, maybe it’s not such a bad thing to read books that are a little more hopeful. And yes, audacious.

Author/s: 
Rachel Toor

Rachel Toor teaches in the MFA program at Eastern Washington University. She writes a monthly column for The Chronicle of Higher Education, and her most recent book is Personal Record: A Love Affair With Running. Her Web site is www.racheltoor.com.

The Why and How of Human Terrain Teams

Inside Higher Ed recently published an interview with Roberto González, an associate professor of anthropology at San Jose State University, on the Human Terrain System (HTS), a U.S. Army program in which social scientists are embedded with military units. The questions were thoughtful and well asked, but the answers bear little resemblance to the work I conducted as a field social scientist deployed by HTS. I would like to explain what the goals of the program are, what we do, and why we do it, as well as try to clarify misperceptions that arise from unfamiliarity with military culture, terminology, planning and practice.

My job in Iraq was to represent the population in order to promote nonlethal planning and operations. When a mission is conceptualized, when course-of-action recommendations have to be made, when decisive points are identified for the commander, my job is to present what the population wants and expects and how it will react, and at all times to promote nonlethal options.

This last portion, the promotion of nonlethal options, is of exceeding importance for two reasons. The first is the nature of my mission, and the overall mission of the HTS – we have an ethical responsibility to bring quality socio-cultural information and nonlethal possibilities to the commander’s attention. This is related to the second imperative, which goes to the heart of counterinsurgency (COIN) doctrine. The three most important elements of COIN are 1) empowering the lowest level (the population), 2) working from the bottom up (again, the population), and 3) recognizing that nonlethal operations accomplish more than lethal ones. In a nutshell, my job is to keep the population, the effects of military operations on the population, and nonlethal options front and center in the commander and command staff’s awareness.

There are a number of ways that an HTT can keep the population and nonlethal options on the front burner. In the case of my team, we used very standard research and analysis methods to get at both primary and secondary open source data. At all times we endeavored to engage in best practices, both in terms of methodology and ethics. We essentially used four basic methods of collection: archival, process observation, participant observation, and semi-structured elite level interviews.

Our archival research had three different purposes. The first was to do our homework about our brigade’s operating environment before we deployed with them to Iraq. The second was then to go through the information on the population already archived by the brigade that we were replacing. The final component was to keep abreast of political, social, religious, and economic events in our operating environment, Iraq, the Middle East, and in some cases the U.S., which could affect the host nation population that we, and the Army, had to interact with on a daily basis. We also conducted process observation and participant observation at a wide variety of meetings and events. At all of these we identified ourselves fully, explained who we were and what we were doing (serving as socio-cultural advisors for the Army), and asked for permission to ask questions and to attribute remarks or not. At all times we used standard, basic protocols for conducting process and participant observation.

When conducting our elite level interviews, part of a four-month-long tribal study and history, we used formal, documented informed consent. The documents were prepared in English, translated into Arabic, and the interview subject retained one copy while I, as research director, retained the other. When requested, anonymity was granted. The Army personnel we worked with never had access to these documents, to the internal ethical review process of the team, or to the raw information of someone’s identity when anonymity was requested. In fact, because of the social science backgrounds of many of the officers we dealt with daily, they not only understood the protocols but respected them. Moreover, on one occasion the protocols actually allowed me to provide necessary information to a battalion commander: The sheikh I had just interviewed had consented to my attributing his information, which allowed me to answer the commander’s questions without feeling boxed in. Ethical and methodological best practices actually enabled me to do my job properly. On another occasion, information that I collected was useful in helping the battalion commander, as it presented a set of nonlethal options for resolving a problem regarding a local mosque.

The results of this four-month study, in combination with data acquired from engaging in participant observation with everyday Iraqis, as well as internally displaced persons, provide important insights regarding Iraqi tribal behavior, Iraqi politics, religion, rule of law, and the stabilization and reconstruction that is being undertaken. The results are being prepared for peer review and publication.

The information we obtained was also packaged and provided to our brigade, the battalions, maneuver companies, as well as the embedded Provincial Reconstruction Team and the U.S. Department of State/U.S. Embassy. Had this information been available when Operation Iraqi Freedom was conceptualized, there would have been a greater chance of the initial stabilization and reconstruction being done in a better informed, more productive, and less lethal manner.

One of the other important points raised by Dr. González – and which I would like everyone to understand -- has to do with Army terminology. I went out on patrol as often as I could. Going on patrol means going out with a combat element, but it does not automatically mean going out to engage in combat or lethal operations. I went out on every mission I could that involved taking humanitarian assistance to the local Iraqis. And here’s the thing to remember – most of these involved going door to door. That’s right: The Army sends soldiers to towns, villages, and settlements to go door to door to deliver food, water, water purifiers, dental prophylaxis, toys, and other items on a regular basis. I also accompanied Civil Affairs teams to conduct assessments of infrastructure, attend meetings, and engage in medical operations among the local population.

In fact, while out on patrol my teammates and I were able to identify several archaeological sites. We brought these to the attention of brigade and battalion staffs, as well as the Cultural Heritage Officer at the U.S. Embassy and the head of the U.S. Army’s Archaeological Unit. We were able to preserve one site that was slated for development. And through collaboration with archaeologists at Penn State, the University of Chicago, Harvard, the Army, and the State Department, we created a comprehensive list and maps of all the sites in our operating environment so that the Army would know where construction could and could not take place.

The hallmark of good human terrain fieldwork lies in the reduction of lethal operations and of casualties inflicted and received. By doing our research, both primary and secondary, we were able to directly or indirectly conceptualize and influence virtually all of our brigade’s problem sets and provide nonlethal options to resolve them. My teammates and I were heavily involved with helping to write the brigade’s campaign plan. Every session always began with the Plans Officer and/or the Line of Effort (LOE) Chief asking, “What does right look like for the Iraqis in our OE [operating environment], and how do we get them there?” Our job was to answer that question by taking our research and packaging it in a way that military personnel could easily and quickly digest. When we did this, we were able to ensure that the Army focused on the three most important aspects of COIN that I outlined above. This all translates into fewer injured or killed locals and, of course, fewer injured or killed American and Coalition Forces.

We do not do targeting or intelligence collection, nor do we engage in any part of lethal and kinetic operations, although we do, like everyone, retain the right to self-defense. Contrary to the program’s most vocal critics, we are not using social science methodology to enable the Army to kill more Iraqis and Afghans. In fact, one of our biggest successes was getting the Shriners Hospital in Boston, as well as a local Boston charity, to agree to treat a burned Iraqi boy and house and feed his family pro bono. When our Commander decided it was better for Iraqis to treat him, we worked with a sister team in another OE to facilitate his access to treatment within the Iraqi Ministry of Health system.

This goes right to another point on terminology: The Army calls everything it engages in “targeting.” For instance, when the Commander goes to have dinner with a sheikh, that is referred to as targeting. This can easily lead to confusion among those who do not work with the military, so we have been encouraging military personnel to use the terms “engage” and “engagement” instead of “target” and “targeting” when referring to nonlethal operations. This is, actually, more than just a matter of semantics. By changing the way the military talks about nonlethal operations, we change the way it thinks about them, which further promotes nonlethal options.

In a nutshell, we are using our methodological skills to help the Army learn how to achieve its goals without having to use force. As someone with extensive methods training in five different disciplines, and who has taught research methods, I can think of no nobler use for these skills than preserving life whenever possible. How many research and teaching academics can say the same about how they use their skills?

There is one set of related items that Dr. González mentions in his interview answers that I would like to address here. Despite what some personnel from the Foreign Military Studies Office wrote, we are not a “CORDS for the twenty-first century.” CORDS (Civil Operations and Revolutionary Development Support), a Vietnam-era initiative, was a full-fledged counterinsurgency program, utilizing both military and civilian advisors who lived with the local populations they were working with and trained them on all aspects of government and governance, including stabilization and reconstruction. Importantly, because CORDS personnel actually lived among the host nation population, they lived and died with them, so, when necessary, they fought with them. Human Terrain personnel do not live with the host nation population, nor do we fight with them. Rather, we live on military bases, go out with a military security escort, and return to base after our engagements. We also are not involved with training the population, and we do not engage in stabilization or reconstruction projects. We are enabling advisors, not actors. The Provincial Reconstruction Teams, which are a State Department initiative, are the closest thing we have today to CORDS. The article that Dr. González mentions was published in the September/October 2006 issue of Military Review. As the first HTT did not deploy until February 2007, it was prepared well in advance of HTS becoming operational and therefore cannot be construed as an accurate representation of HTS or its mission.

Project Phoenix, a separate Vietnam-era program, which too often is confused with, or mistakenly rolled into, CORDS, is also not an applicable historical analog to HTS. It was a program advised by the Central Intelligence Agency, and it largely involved Vietnamese personnel trying to root out Viet Cong political cadres with the help of a small number of civilian advisors – mostly law enforcement personnel, not researchers. Unlike Project Phoenix, HTS is not engaged in the identification and neutralization of targets.

I also want to make it very clear: The U.S. Army’s Human Terrain System is not connected or affiliated with other programs that have adopted the terminology of human terrain. This is important because Dr. González conflates HTS with these other initiatives; it is both inaccurate to confuse them and unfair to HTS to paint us all with the same brush.

While it is absolutely right to be concerned about learning the lessons of the past, the simple truth is I have yet to see or experience any evidence of the neo-colonial counterinsurgency that Dr. González describes. Regardless of whether you supported the politics and/or policies that led us into our current conflicts, as Americans we have a moral responsibility to leave Iraq and Afghanistan in as functional and stable a state of existence as possible.

Regardless of your politics regarding the war, if one has the skills and knowledge to help out, even a little bit, and one chooses not to, what does that say about that individual or organization? This is the question that the many academics who have found it easy to criticize the Human Terrain System -- whether from ignorance, misinformation, or political opposition to the policy decisions that led us into the war in Iraq -- need to ask themselves.

Author/s: 
Adam L. Silverman

Adam L. Silverman holds a doctorate in political science and criminology, master's degrees in religion and international relations, and a bachelor's degree in Middle Eastern studies. He was the 2nd Brigade Combat Team/1st Armored Division field social scientist and socio-cultural advisor assigned to HTT IZ6 and is currently a social science advisor with the Human Terrain System. The ideas and opinions expressed in this essay are his alone and do not necessarily reflect the opinions of the brigade, division, U.S. Army, or the Human Terrain System.

Kass Backwards

Last week Leon Kass, chairman of the President's Council on Bioethics under President Bush, took to the podium to deliver the Jefferson Lecture of the National Endowment for the Humanities -- an event I did not go to, though it was covered by one of IHE's intrepid reporters.

My reluctance to attend suggests that, without noticing it, I have come to accept Kass’s best-known idea, “the wisdom of repugnance.” There is, alas, all too little evidence I am getting any wiser with age -- but my visceral aversion to hearing a Bush appointee talk about human values is inarguable.

As you may recall, Kass wrote in the late 1990s that repugnance at biotechnological developments such as cloning is “the emotional expression of deep wisdom, beyond reason’s power fully to articulate it.” In our rising gorge, he insisted, “we intuit and feel, immediately and without argument, the violation of things that we rightfully hold dear.... Shallow are the souls that have forgotten how to shudder.”

Judged simply as an argument, this is not, let’s say, apodictically persuasive. Anyone who has ever taken an introductory anthropology course, or read Herodotus -- or gone to a different part of town -- will have learned that different groups feel disgust at different things. The affect seems to be hard-wired into us, but the occasions provoking it are varied.

Kass invoked the "wisdom of repugnance" a few years before he joined an administration that treated the willingness to torture as a great moral virtue -- meanwhile coddling bigots for whom rage at gay marriage was an appropriate response to “the violation of things that we rightfully hold dear.”

Now, as it happens, some of us do indeed feel disgust at one of these practices, and not at the other. We also suspect that Kass’s aphorism about the shallowness of souls that have forgotten how to shudder would make a splendid epigraph for the chapter in American history that has just closed.

In short, disgust is not quite so unambiguous and inarguable an expression of timeless values as its champion on the faculty of the University of Chicago has advertised. Given a choice between “deep wisdom” and “reason’s power fully to articulate,” we might do best to leave the ineffable to Oprah.

There is no serious alternative to remaining within the limits of reason. Which means argument, and indeed the valuing of argument -- however frustrating and inconclusive -- because even determining what the limits of reason themselves are tends to be very difficult.

Welcome to modernity. It’s like this pretty much all the time.

The account of Kass's speech in IHE -- and the text of it, also available online -- confirmed something that I would have been willing to wager my paycheck on, had there been a compulsive gambler around to take the bet. For I felt certain that Kass would claim, at some point, that the humanities are in bad shape because nobody reads the “great works” because everybody is too busy with the “deconstruction.”

It often seems like the culture wars are, in themselves, a particularly brainless form of mass culture. Some video game, perhaps, in which players keep shooting at the same zombies over and over, because they never change and just keep coming -- which is really good practice in case you ever have to shoot at zombies in real life, but otherwise is not particularly good exercise.

The reality is that you encounter actual deconstructionists nowadays only slightly more often than zombies. People who keep going on about them sound (to vary references a bit) like Grandpa Simpson ranting about the Beatles. Reading The New Criterion, you'd think that Derrida was still giving sold-out concerts at Che Stadium. Sadly, no.

But then it never makes any difference to point out that the center of gravity for argumentation has shifted quite a lot over the past 25 years. What matters is not actually knowing anything about the humanities in particular -- just that you dislike them in general.

The logic runs something like: “What I hate about the humanities is deconstructionism, because I have decided that everything I dislike should be called ‘deconstructionism.’ ” Q.E.D.!

Kass complained that people in the humanities fail to discuss the true, the good, and the beautiful; or the relationships between humanity, nature, and the divine; or the danger that comes from assuming that technical progress implies the growth of moral and civic virtue. Clearly this is a man who has not stopped at the new books shelf in a library since the elder George Bush was Vice President.

And so last week’s Jefferson lecture was, perhaps, an encouraging moment, in spite of everything. With it, Leon Kass was saying farewell to Washington for, with any luck, a good long while. Maybe now he can spend some time catching up with the range of work people in the humanities have actually been doing. At the very least he could read some Martha Nussbaum.

Then he might even pause to reflect on his own role as hired philosopher for an administration that revived one of the interrogation techniques of the Khmer Rouge. The wisdom of repugnance begins at home.

Author/s: 
Scott McLemee
