Ethnic / cultural / gender studies

New gay studies book grew out of an especially controversial college class

U. of Michigan professor caused major controversy with course called 'How to Be Gay.' A dozen years later, he has written a book on the same topic.

Review of Howard Ball, "At Liberty to Die: The Battle for Death with Dignity in America"

Of the many strange things in Gulliver’s Travels that make it hard to believe anyone ever considered it a children’s book, the most disturbing must be the Struldbruggs, living in the far eastern kingdom of Luggnagg, not covered by Google Maps at the present time.

Gulliver’s hosts among the Luggnaggian aristocracy tell him that a baby is born among them, every so often, with a red dot on the forehead -- the sign that he or she is a Struldbrugg, meaning an immortal. Our narrator is suitably amazed. The Struldbruggs, he thinks, have won the cosmic lottery. Being “born exempt from that universal Calamity of human Nature,” they “have their Minds free and disengaged, without the Weight and Depression of Spirits caused by the continual Apprehension of Death.”

The traveler has no trouble imagining the life he might lead as an immortal, given the chance. First of all, Gulliver tells his audience at dinner, he would spend a couple of hundred years accumulating the largest fortune in the land. He’d also be sure to master all of the arts and sciences, presumably in his spare time. And then, with all of that out of the way, Gulliver could lead the life of a philanthropic sage, dispensing riches and wisdom to generation after generation. (A psychoanalytic writer somewhere uses the expression “fantasies of the empowered self,” which just about covers it.)

But then the Luggnaggians bring him back to reality by explaining that eternal life is not the same thing as eternal youth. The Struldbruggs “commonly acted like Mortals, till about thirty Years old,” one of Gulliver’s hosts explains, “after which by degrees they grew melancholy and dejected, increasing in both till they came to four-score.” The expression “midlife crisis” is not quite the one we want here, but close enough.

From the age of eighty on, “they had not only all the Follies and Infirmities of other old Men, but many more which arose from the dreadful Prospect of never dying.” Forget the mellow ripening of wisdom: Struldbruggs “were not only Opinionative, Peevish, Covetous, Morose, Vain, Talkative, but incapable of Friendship, and dead to all natural Affection.”

It gets worse. Their hair and teeth fall out. “The Diseases they were subject to still continuing without increasing or diminishing,” Gulliver tells us. “In talking they forgot the common Appellation of Things, and the Names of Persons, even of those who are their nearest Friends and Relations. For the same Reason they never can amuse themselves with reading, because their Memory will not serve to carry them from the beginning of a Sentence to the end; and by this Defect they are deprived of the only entertainment whereof they might otherwise be capable.”

It is a vision of hell. Either that, or a prophecy of things to come, assuming the trends of the last few decades continue. Between 1900 and 2000, the average life expectancy in the United States rose from 49 to 77 years; between 1997 and 2007, it grew by 1.4 years. This is not immortality, but it beats dying before you reach 50. The span of active life has extended as well. The boundary markers of what counts as old age keep moving out.

From a naïve, Gulliverian perspective, it is all to the good. But there’s no way to quantify changes in the quality of life. We live longer, but it's taking longer to die as well. Two-thirds of deaths among people over the age of 65 in the United States are caused by three chronic conditions: heart disease, cancer, and stroke. The “life” of someone in a persistent vegetative state (in which damage to the cerebral cortex is so severe and irreversible that cognitive functions are gone for good) can be prolonged indefinitely, if not forever.

More horrific to imagine is the twilight state of being almost vegetative, but not quite, with some spark of consciousness flickering in and out -- a condition of Struldbruggian helplessness and decay. “I grew heartily ashamed of the pleasing Visions I had formed,” says Gulliver, “and thought no Tyrant could invent a Death into which I would not run with Pleasure from such a Life.”

Howard Ball’s book At Liberty to Die: The Battle for Death With Dignity in America (New York University Press) is a work of advocacy, as the subtitle indicates. The reader will find not the slightest trace of Swiftian irony in it. Ball, a professor emeritus of political science at the University of Vermont, is very straightforward about expressing bitterness -- directing it at forces that would deny “strong-willed, competent, and dying adults who want to die with dignity when faced with a terminal illness” their right to do so.  

The forces in question fall under three broad headings. One is the religious right, which Ball sees as being led, on this issue at least, by the Roman Catholic Church. Another is the Republican Party leadership, particularly in Congress, which he treats as consciously “politicizing the right-to-die issue” in a cynical manner, as exemplified by the memo of a G.O.P. operative on “the political advantage to Republicans [of] intervening in the case of Terri Schiavo.” (For anyone lucky enough to have forgotten: In 1998, after Terri Schiavo had been in a persistent vegetative state for eight years, her husband sought to have her feeding tube removed, setting off numerous rounds of litigation, as well as several pieces of legislation that included bills in the U.S. Congress. The feeding tube was taken out and then reinserted twice before being finally removed in 2005, after which Schiavo died. The website of the University of Miami's ethics program has a detailed timeline of the Schiavo case.)

The third force Ball identifies is that proverbial 800-pound gorilla known as the Supreme Court of the United States. Its rulings in Washington v. Glucksberg and Vacco v. Quill in 1997 denied the existence of anything like a constitutionally protected right to physician-assisted death (PAD). States can outlaw PAD -- or permit it, as Montana, Oregon, and Washington do at present. In the epigraph to his final chapter, Ball quotes a Colorado activist named Barbara Coombs Lee: “We think the citizens of all fifty states deserve death with dignity.” But the Supreme Court of the United States will not be making that a priority any time soon.

“The central thesis of the book,” states Ball, “is that the liberty found in the U.S. Constitution’s Fifth and Fourteenth ‘Due Process’ Amendments extends... [to] the terminally ill person's right to choose to die with dignity -- with the passive assistance of a physician -- rather than live in great pain or live a quality-less life.” The typical mode of “passive assistance” would be “to give painkilling medications to a terminally ill patient, with the possibility that the treatment will indirectly hasten the patient’s death.”

Ball notes that a Pew Research Center survey from 2005 showed that an impressive 84 percent of respondents “approved of patients being able to decide whether or not to be kept alive through medical treatment or choosing to die with dignity.”

Now, for whatever it’s worth, that solid majority of 84 percent includes this columnist. If the time for it ever comes, I’d want my doctor to supersize me on the morphine drip without breaking any laws. Throughout Ball's narrative of the successes and setbacks of the death-with-dignity cause, I cheered at each step forward, and felt appalled all over again while reading the chapter he calls “Terri Schiavo’s Tragic Odyssey,” although it did seem like the more suitable adjective would be “grotesque.” Tragedy implies at least some level of dignity.

The author also introduced me to a blackly humorous expression, “death tourist,” which refers to a person "visiting" a state to take advantage of physician-assisted suicide being legal there.

As a member of the choir, I liked Ball's preaching, but it felt like the sermon was missing an index card or two. As mentioned earlier, the book’s “central thesis” is supposed to be that the due-process guarantees in the Constitution extend to the right to death with dignity. And so the reader has every reason to expect a sustained and careful argument for why that legal standard applies. None is forthcoming. The due-process clauses did come up when the Supreme Court heard oral arguments in 1997, but the Court rejected them as inapplicable. This would seem to be the point in the story where that central thesis would come out swinging. The author would show, clearly and sharply, why the Court was wrong to do so. He doesn't. It is puzzling.

Again, it sounds very categorical when Ball cites that Pew survey from 2005 showing 84 percent agreement that individuals had a right to choose an exit strategy if medical care were not giving them a life they felt worth living. But the same survey results show that when asked whether they believed it should be legal for doctors to "assist terminally ill patients in committing suicide," only 44 percent favored it, while 48 percent were opposed. With the matter phrased differently -- surveyors asking if it should be legal for doctors to "give terminally ill patients the means to end their lives" -- support went up to 51 percent, while 40 percent remained opposed. This reveals considerably more ambivalence than the 84 percent figure would suggest.

The notion that a slippery slope will lead from death with dignity to mass programs of euthanasia clearly exasperates Ball, and he can hardly be faulted on that score. A portion of the adult population is prepared to believe that any given social change will cause the second coming of the Third Reich, this time on American soil. (Those who do not forget the History Channel are condemned to repeat fairly dumb analogies.) But the slippery-slope argument will more likely be refuted in practice than through argument. Whether or not the law recognizes it, the right to make decisions about one’s own mortality or quality of life will exist any time someone claims it. One of the medical profession’s worst-kept secrets for some time now is that plenty of physicians will oblige a suffering patient with the means to end their struggle. (As Ball notes, this came up in the Supreme Court discussions 15 years ago.)

And the demand is bound to grow, as more and more of us live long enough to see -- like Gulliver -- that there are worse fates than death. Brilliant legal minds should apply themselves to figuring out how to make an ironclad case for the right to a decent departure from this mortal coil. At Liberty to Die is useful as a survey of some obstacles standing in the way. But in the meantime, people will find ways around those obstacles, even if it means taking a one-way, cross-continental trip to the Pacific Northwest. There are worse ways to go.

New minor in gay studies faces political attacks in Louisiana

University of Louisiana at Lafayette faces backlash over the first minor of its kind in the state.

Georgian Court plan to admit men points to rate at which women's colleges are going coed

Usually when a women's college says it's going coed, it omits the elephant in the room: that it's just one more in a long line to do so. Not Georgian Court.

Putting the black studies debate into perspective (essay)

Intellectual Affairs

For a week now, friends have been sending me links from a heated exchange over the status and value of black studies. It started among bloggers, then spilled over into Twitter, which always makes things better. I'm not going to rehash the debate, which, after all, is always the same. As with any other field, black studies (or African-American studies, or, in the most cosmopolitan variant, Africana studies) could only benefit from serious, tough-minded, and ruthlessly intelligent critique. I would be glad to live to see that happen.

But maybe the rancor will create some new readers for a book published five years ago, From Black Power to Black Studies: How a Radical Social Movement Became an Academic Discipline (Johns Hopkins University Press) by Fabio Rojas, an associate professor of sociology at Indiana University. Someone glancing at the cover in a bookstore might take the subtitle to mean it's another one of those denunciations of academia as a vast liberal-fascist indoctrination camp for recruits to the New World Order Gestapo. I don't know whether that was the sales department's idea; if so, it was worth a shot. Anyway, there the resemblance ends. Rojas wrote an intelligent, informed treatment of black studies, looking at it through the lens of sociological analysis of organizational development, and with luck the anti-black-studies diatribalists will read it by mistake and accidentally learn something about the field they are so keen to destroy. (Spell-check insists that “diatribalists” is not a word, but it ought to be.)

Black studies was undeniably a product of radical activism in the late 1960s and early ‘70s. Administrators established courses only as a concession to student protesters who had a strongly politicized notion of the field’s purpose. “From 1969 to 1974,” Rojas writes, “approximately 120 degree programs were created,” along with “dozens of other black studies units, such as research centers and nondegree programs,” plus professional organizations and journals devoted to the field.

But to regard black studies as a matter of academe becoming politicized (as though the earlier state of comprehensive neglect wasn’t politicized) misses the other side of the process: “The growth of black studies,” Rojas suggests, “can be fruitfully viewed as a bureaucratic response to a social movement.” By the late 1970s, the African-American sociologist St. Clair Drake (co-author of Black Metropolis, a classic study of Chicago to which Richard Wright contributed an introduction) was writing that black studies had become institutionalized “in the sense that it had moved from the conflict phase into adjustment to the existing educational system, with some of its values accepted by that system…. A trade-off was involved. Black studies became depoliticized and deradicalized.”

That, too, is something of an overstatement -- but it is far closer to the truth than denunciations of black-studies programs, which treat them as politically volatile, yet also as well-entrenched bastions of power and privilege. As of 2007, only about 9 percent of four-year colleges and universities had a black studies unit, few of them with a graduate program. Rojas estimates that “the average black studies program employs only seven professors, many of whom are courtesy or joint appointments with limited involvement in the program” -- while in some cases a program is run by “a single professor who organizes cross-listed courses taught by professors with appointments in other departments.”

The field “has extremely porous boundaries,” with scholars who have been trained in fields “from history to religious studies to food science.” Rojas found from a survey that 88 percent of black studies instructors had doctoral degrees. Those who didn’t “are often writers, artists, and musicians who have secured a position teaching their art within a department of black studies.”

As for faculty working primarily or exclusively in black studies, Rojas writes that “the entire population of tenured and tenure-track black studies professors -- 855 individuals -- is smaller than the full-time faculty of my own institution.” In short, black studies is both a small part of higher education in the United States and a field connected by countless threads to other forms of scholarship. The impetus for its creation came from African-American social and political movements. But its continued existence and development has meant adaptation to, and hybridization with, modes of enquiry from long-established disciplines.

Such interdisciplinary research and teaching is necessary and justified because (what I am about to say will be very bold and very controversial, and you may wish to sit down before reading further) it is impossible to understand American life, or modernity itself, without a deep engagement with African-American history, music, literature, institutions, folklore, political movements, etc.

In a nice bit of paradox, that is why C.L.R. James was so dubious about black studies when it began in the 1960s. As author of The Black Jacobins and A History of Negro Revolt, among other classic works, he was one of the figures students wanted appointed as a visiting professor when they demanded black studies courses. But when he accepted, it was only with ambivalence. "I do not believe that there is any such thing as black studies," he told an audience in 1969. "...I only know, the struggle of people against tyranny and oppression in a certain social setting, and, particularly, the last two hundred years. It's impossible for me to separate black studies and white studies in any theoretical point of view."

Clearly James's perspective has nothing in common with the usual denunciations of the field. The notion that black studies is just some kind of reverse-racist victimology, rigged up to provide employment for "kill whitey" demagogues, is the product of malice. But it also expresses a certain banality of mind -- not an inability to learn, but a refusal to do so. For some people, pride in knowing nothing about a subject will always suffice as proof that it must be worthless.

Essay on invisibility and incivility in online communications

With Maria Shine Stewart out of pocket, a guest author offers an alternative point of view on kindness. Me first!

Review of Orin Starn, "The Passion of Tiger Woods"

Intellectual Affairs

On the Friday following Thanksgiving in 2009, Tiger Woods had an automobile accident. For someone who does not follow golf, the headlines that ran that weekend provided exactly as much information as it seemed necessary to have. Over the following week, I noticed a few more headlines, but they made no impression. Some part of the brain is charged with the task of filtering the torrent of signals that bombard it from the media every day. And it did its job with reasonable efficiency, at least for a while.

Some sort of frenzy was underway. It became impossible to tune this out entirely. I began to ignore it in a more deliberate way. (All due respect to the man for his talent and accomplishments, but the doings of Tiger Woods were exactly as interesting to me as mine would be to him.) There should be a word for the effort to avoid giving any attention to some kerfuffle underway in the media environment. “Fortified indifference,” perhaps. It’s like gritting your teeth, except with neurons.

But the important thing about my struggle in 2009 is that it failed. Within six weeks of the accident, I had a rough sense of the whole drama in spite of having never read a single article on the scandal, nor watched nor listened to any news broadcasts about it. The jokes, allusions, and analogies spinning off from the event made certain details inescapable. A kind of cultural saturation had occurred. Resistance was futile. The whole experience was irritating, even a little depressing, for it revealed the limits of personal autonomy in the face of an unrelenting media system, capable of imposing utterly meaningless crap on everybody’s attention, one way or another.

But perhaps that’s looking at things the wrong way. Consider the perspective offered by Orin Starn in The Passion of Tiger Woods: An Anthropologist Reports on Golf, Race, and Celebrity Scandal (Duke University Press). Starn, the chair of cultural anthropology at Duke, maintains that the events of two years back were not meaningless at all. If anything, they were supercharged with cultural significance.

The book's title alludes to the theatrical reenactments of Christ’s suffering performed at Easter during the Middle Ages, or at least to Mel Gibson’s big-screen rendition thereof. Starn interprets “Tigergate” as an early 21st-century version of the scapegoating rituals analyzed by René Girard. From what I recall of Girardian theory, the reconsolidation of social order involves the scapegoat being slaughtered, rather than paying alimony, though in some cases that may be too fine a distinction.

The scandal was certainly louder and more frenetic than the game that Woods seems to have been destined to master. The first image of him in the book shows him at the age of two, appearing on "The Mike Douglas Show" with his father. He is dressed in pint-sized golfing garb, with a little bag of clubs over his shoulder. As with a very young Michael Jackson, the performance of cuteness now reads as a bit creepy. Starn does not make the comparison, but it’s implicit, given the outcome. “This toddler was not to be one of those child prodigies who flames out under unbearable expectations,” Starn writes. “By his early thirties, he was a one-man multinational company…. Forbes magazine heralded Woods as the first athlete to earn $1 billion.”

Starn, who mentions that he is a golfer, is also a scholar of the game, which he says “has always traced the fault lines of conflict, hierarchy, and tension in America, among them the archetypal divides of race and class.” To judge by my friend Dave Zirin’s book A People’s History of Sports in the United States (The New Press), that’s true of almost any athletic pursuit, even bowling. But the salient point about Woods is that most of his career has been conducted as if no such fault lines existed. Starn presents some interesting and little-known information on how golf was integrated. But apart from his genius on the green, Woods’s “brand” has been defined by its promise of harmony: “He and his blonde-haired, blue-eyed wife, Elin Nordegren, seemed the poster couple for a shiny new postracial America with their two young children, two dogs, and the fabulous riches of Tiger’s golfing empire.”

Each of his parents had a multiracial background -- black, white, and Native American on his father’s side; Chinese, Thai, and Dutch on his mother’s. “Cablinasian,” the label Woods made up to name his blended identity, is tongue-in-cheek, but it also represents a very American tendency to mess with the established categories of racial identity by creating an ironic mask. (Ralph Ellison wrote about it in his essay “Change the Joke and Slip the Yoke.”)

But that mask flew off, so to speak, when his car hit the fire hydrant in late 2009. Starn fills out his chronicle of the scandal that followed with an examination of the conversation and vituperation that took place online, often in the comments sections of news articles -- with numerous representative samples, in all their epithet-spewing, semiliterate glory. The one-drop rule remains in full effect, it seems, even for Cablinasians.

“For all the ostensible variety of opinion,” Starn writes about the cyberchatter, “there was something limited and predictable about the complaints, stereotypes, and arguments and counterarguments, as if we were watching a movie we’d already seen many times before. Whether [coming from] the black woman aggrieved with Tiger about being with white women or the white man bitter about supposed black privilege, we already knew the lines, or at least most of them.… We are all players, like it or not, in a modern American kabuki theater of race, where our masks too often seem to be frozen into a limited set of expressions.”

Same as it ever was, then. But this is where the comparison to a scapegoating ritual falls apart. (Not that it’s developed very much in any case.) At least in Girard’s analysis, the ritual is an effort to channel and purge the conflicts within a society – reducing its tensions, restoring its sense of cohesion and unity, displacing the potential for violence by administering a homeopathic dose of it. Nothing like that can be said to have happened with Tigergate. It involved no catharsis. For that matter, it ran -- by Starn’s own account -- in exactly the opposite direction: the golfer himself symbolized harmony and success and the vision of historical violence transcended with all the sublime perfection of a hole-in-one. The furor of late 2009 negated all of that. The roar was so loud that it couldn’t be ignored, even if you plugged your ears and looked away.

The latest headlines indicate that Tiger Woods is going to play the Pebble Beach Pro-Am tournament next month, for the first time in a decade. Meanwhile, his ex-wife has purchased a mansion for $12 million and is going to tear it down. She is doing so because of termites, or so go the reports. Hard to tell what symbolic significance that may have. But under the circumstances, wiping out termites might not be her primary motivation for destroying something incredibly expensive.

Ryan Gosling pick-up line meme reaches academe

Satirical blogs explore whether a Hollywood sex symbol can make academic pick-up lines seem smooth.

Approach and Avoid

In 1939, the French anthropologist Michel Leiris published a memoir called Manhood in which he undertook an inventory of his own failures, incapacities, physical defects, bad habits, and psychosexual quirks. It is a triumph of abject self-consciousness. And the subtitle, “A Journey from Childhood into the Fierce Order of Virility,” seems to heighten the cruelty of the author’s self-mockery. Leiris portrays himself as a wretched specimen: machismo’s negation.

But in fact the title was not ironic, or at least not merely ironic. It was a claim to victory. “Whoever despises himself, still respects himself as one who despises,” as Nietzsche put it. In an essay Leiris wrote when the book was reissued after World War II, he described it as an effort to turn writing into a sort of bullfight: “To expose certain obsessions of an emotional or sexual nature, to admit publicly to certain deficiencies or dismays was, for the author, the means – crude, no doubt, but which he entrusts to others, hoping to see it improved – of introducing even the shadow of the bull’s horn into a literary work.”

By that standard, Leiris made the most broodingly taciturn character in Hemingway look like a total wuss.

The comment about passing along a technique to others -- “hoping to see it improved” -- now seems cringe-making in its own way. Leiris was addressing a small audience consisting mainly of other writers. The prospect of reality TV, online confessionals, or the industrialized production of memoirs would never have crossed his mind. He hoped his literary method -- a kind of systematic violation of the author's own privacy -- would develop as others experimented with it. Instead, the delivery systems have improved. They form part of the landscape Wayne Koestenbaum surveys in Humiliation, the latest volume in Picador’s Big Ideas/Small Books series.

Koestenbaum, a poet and essayist, is a professor of English at the City University of New York Graduate Center and a visiting professor in the painting department of the Yale School of Art. The book is an assemblage of aphoristic fragments, notes on American popular culture and its cult of celebrity, and reflections on the psychological and social dynamics of humiliation – with a few glances at how writing, or even language itself, can expose the self to disgrace. It’s unsystematic, but in a good way. Just because the author never quotes Erving Goffman or William Ian Miller is no reason to think they aren’t on his mind. “I’m writing this book,” he says early on, “in order to figure out – for my own life’s sake – why humiliation is, for me, an engine, a catalyst, a cautionary tale, a numinous scene, producing sparks and showers…. Any topic, however distressing, can become an intellectual romance. Gradually approach it. Back away. Tentatively return.”

The experience of humiliation is inevitable, short of a life spent in solitary confinement, and I suppose everyone ends up dishing it out as well as taking it, sooner or later. But that does not make the topic universally interesting. The idea of reading (let alone writing) almost two hundred pages on the subject will strike many people as strange or revolting. William James distinguished between “healthy mindedness” (the temperament inclined to “settl[ing] scores with the more evil aspects of the universe by systematically declining to lay them to heart or make much of them…. or even, on occasion, by denying outright that they exist”) and “sick souls” (which “cannot so swiftly throw off the burden of the consciousness of evil, but are congenitally fated to suffer from its presence”). Koestenbaum’s readers are going to come from just one side of that divide.

But then, one of the James’s points is that the sick soul tends to see things more clearly than the robust cluelessness of the healthy-minded ever permits. As a gay writer -- and one who, moreover, was taken to be a girl when he was young, and told that he looked like Woody Allen as an adult -- Koestenbaum has a kind of sonar for detecting plumes of humiliation beneath the surface of ordinary life.

He coins an expression to name “the somberness, or deadness, that appears on the human face when it has ceased to entertain the possibility that another person exists.” He calls it the Jim Crow gaze – the look in the eyes of a lynching party in group photos from the early 20th century, for example. But racial hatred is secondary to “the willingness to desubjectify the other person” – or, as Koestenbaum puts it more sharply, “to treat someone else as garbage.” What makes this gaze especially horrific is that the person wearing it can also be smiling. (The soldier giving her thumbs-up gesture while standing next to naked, hooded prisoners at Abu Ghraib.) The smile “attests to deadness ... you are humiliated by the refusal, evident in the aggressor’s eyes, to see you as sympathetic, to see you as a worthy, equal subject.”

Deliberate and violent degradation is the extreme case. But the dead-eyed look, the smirk of contempt, are common enough to make humiliation a kind of background radiation of everyday social existence, and intensified through digital communication “by virtue of its impersonality…its stealth attack.” An embarrassing moment in private becomes a humiliating experience forever if it goes viral on YouTube.

“The Internet is the highway of humiliation,” Koestenbaum writes. “Its purpose is to humiliate time, to turn information (and the pursuit of information) into humiliation.” This seems overstated, but true. The thought of Google owning everyone’s search histories is deeply unsettling. The sense of privacy may die off completely one day, but for now the mass media, and reality TV most of all, work to document its final twitches of agony. “Many forms of entertainment harbor this ungenerous wish: to humiliate the audience and to humiliate the performer, all of us lowered into the same (supposedly pleasurable) mosh pit.”

A study of humiliation containing no element of confession would be a nerveless book indeed. Koestenbaum is, like Leiris, a brave writer. The autobiographical portions of the book are unflinching, though flinch-inducing. There are certain pages here that, once read, cannot be unread, including one that involves amputee porn. No disrespect to amputees intended, and the human capacity to eroticize is probably boundless; but Koestenbaum describes a practice that it never would have occurred to me was possible. I was completely O.K. with not knowing, but it's too late to get the image out of my head now.

Humiliation counts on “shame’s power to undo boundaries between individuals,” which is also something creativity does. That phrase comes from Koestenbaum’s tribute to the late Eve Kosofsky Sedgwick toward the end of the book. He evokes the memory of her friendship at least as much as the importance of her foundational work in queer theory – though on reflection, I’m not so sure it makes sense to counterpose them. Sedgwick’s ideas permeate the book; she was, like Koestenbaum, also a poet; and Humiliation may owe something to A Dialogue on Love, the most intimate of her writings.

But it’s more reckless and disturbing, because the author plays off of his audience's own recollections of humiliation, and even with the reader's capacity for disgust. There’s a kind of crazy grace to Koestenbaum’s writing. He moves like a matador working the bull into ever greater rage -- then stepping out of the path of danger in the shortest possible distance, at the last possible moment, with a flourish.

Author: Scott McLemee (scott.mclemee@insidehighered.com)
