Review of Sara L. Crosby, "Poisonous Muse: The Female Poisoner and the Framing of Popular Authorship in Jacksonian America"

The mythological creature called the lamia is something like a hybrid of mermaid and vampire: a beautiful woman from the waist up, atop a serpent’s body, driven by an unwholesome appetite. The indispensable Brewer’s Dictionary of Phrase and Fable elaborates: “A female phantom, whose name was used by the Greeks and Romans as a bugbear to children. She was a Libyan queen beloved by Jupiter, but robbed of her offspring by the jealous Juno; and in consequence she vowed vengeance against all children, whom she delighted to entice and murder.”

Somewhere along the way, the Libyan queen’s proper name turned into the generic term for a whole subspecies of carnivorous nymph. Nor was the menu limited to children. In some tellings, the lamia could conceal her snaky side long enough to lure unwary human males to their deaths. (A femme fatale if ever there was one.) If the lamia outlived most of the other gods and monsters of antiquity in the Western cultural imagination, I suspect it is in part because of the coincidence that she embodies two aspects of Eden: the beguiling female and the deceiving reptile, merged, literally, into one.

That this overtly misogynistic image might ever have played a part in the political culture of the United States seems improbable -- a little less so in this election year, perhaps, though it remains difficult to picture. And it’s certainly true that the lamia underwent considerable mutation in crossing the Atlantic and finding a place in American literature and party politics. Sara L. Crosby’s Poisonous Muse: The Female Poisoner and the Framing of Popular Authorship in Jacksonian America (University of Iowa Press) follows the lamia’s transformation from the monster known to a classically educated elite to the sympathetic, vulnerable and all-too-human character accepted by the new mass public of early 19th-century America.

The author, an associate professor of English at Ohio State University at Marion, follows the lamia’s trail from antiquity (in Roman literature “she appeared as a dirty hermaphroditic witch who raped young men”) through the poetry of John Keats and on to such early American page-turners as The Female Land Pirate; or Awful, Mysterious, and Horrible Disclosures of Amanda Bannorris, Wife and Accomplice of Richard Bannorris, a Leader in That Terrible Band of Robbers and Murderers, Known Far and Wide as the Murrell Men. (Sample passage: “My whole nature was changed. All the dark passions of Hell seemed to have centered into one; that one into the core of my heart, and that one was revenge! REVENGE!! REVENGE!!!”) There are close readings of stories by Edgar Allan Poe and Nathaniel Hawthorne as well as of the case of Mrs. Hannah Kinney, alleged poisoner of husbands, acquitted after a trial that riveted the country’s newspaper readers.

From this array Crosby builds a layered argument that may be synopsized roughly along these lines:

A standard version of the lamia story is presented by the Athenian author Philostratus in The Life of Apollonius of Tyana. A young philosopher named Menippus falls under the spell of “a foreign woman, who was good-looking and extremely dainty,” and to all appearances very wealthy as well. He prepares to marry her. Unfortunately, the older and wiser philosopher Apollonius intervenes in time to set the young man straight: “You are a fine youth and are hunted by fine women, but in this case you are cherishing a serpent, and a serpent cherishes you.” Menippus resists this advice, but Apollonius has a verbal showdown with the foreign lady, forces her to admit that she is a lamia and places her under his control.

Menippus thus receives instruction on the difference between appearance and reality -- and in time to avoid being eaten. The situation can also be read as a kind of political fable: a wise authority figure intervenes to prevent a naïve young person from succumbing to the deceptive, seductive and destructive powers of a woman. For the figure of the lamia is congruent with a whole tradition of misogynistic attitudes, as expressed by the medieval theologian Albertus Magnus: “What [a woman] cannot get, she seeks to obtain through lying and diabolical deceptions. And so, to put it briefly, one must be on one’s guard with every woman, as if she were a poisonous snake and the horned devil.” (This is only one of several passages Crosby cites in which an authoritative figure maintains that authority itself is endangered unless men with power practice epistemological as well as moral vigilance.)

But with his 1820 poem “Lamia,” John Keats offers a revisionist telling of the story. To wed her human lover, Lamia sacrifices both her venom and her immortality. This time the confrontation with Apollonius makes her vanish (and presumably kills her), and her beloved immediately falls dead from grief. Having been savaged by reviewers and dismissed as a “Cockney poet” by the literary establishment, Keats recasts the story as a defense of beauty and a challenge to authority. The older man’s knowledge is faulty and obtuse, his power callous and deadly. Poe was an ardent admirer of Keats, and his critical writings are filled with expressions of contempt for the cultural gatekeepers of his day; Crosby interprets the title character of “Ligeia” (a very strange short story that Poe himself considered his best work) as “a revamped Romantic lamia” akin to Keats’s.

The continuity is much easier to see in the case of “Rappaccini’s Daughter” by Nathaniel Hawthorne, in which Beatrice (the title figure) is a lamia-like menace to every living thing that crosses her path. This is through no fault of her own; suffice it to say that a man with authority has turned her into a kind of walking biological weapon. Once again, the Philostratean version of the story has been reconfigured. The misogynist vision of the lamia as a force for deception and destruction is abandoned. Her story becomes a fable of oppression, corruption and the illegitimacy of established authority.

These literary reimaginings took shape against a political backdrop that added another layer of significance to the transformation. In the first half century of the United States, citizens “practiced a ‘politics of assent,’ in which a relatively small population of mostly rural voters bowed to the leadership of local elites,” Crosby writes. Editorials and sermons issued Apollonius-like warnings about the need to subdue desire and cultivate virtue. One widely circulated and much-reprinted story told of a daughter who rapidly went from sassing her parents to poisoning them. Clearly the republic’s young females in particular were on the slipperiest of slopes.

“But by the time Andrew Jackson won the presidential election of 1828,” Crosby continues, “the nation was transitioning to a far more raucous and partisan ‘mass democracy,’ characterized by a ‘politics of affiliation’ in which larger populations of newly enfranchised white ‘common men’ identified with national political organizations.” Those organizations issued newspapers and magazines, to which publishers added an enormous flow of relatively cheap books, pamphlets and broadsides.

The old elites (largely associated with the Whig Party) dismissed most of this output as trash, and they may have had a point, if “revenge! REVENGE!! REVENGE!!!” is anything to go by. At the same time, Poe was arguing that, in Crosby’s words, “genius occurred in that space of free exchange between writer and reader” that could open up if Americans could shed their cultural subservience to the Old World. As for Hawthorne, he was a Democratic Party functionary who idolized Jackson, and "Rappaccini's Daughter" was first published in a Democratic Party magazine.

So the basic thematic implications of the “old” (Philostratean) and “new” (Keatsian) lamia stories lined up fairly well with Whiggish and Jacksonian-Democratic cultural attitudes, respectively. For one side, the American people needed guidance from Apollonius the Whig to avoid the madness of excessive democracy (let the French Revolution be a warning) and the lamia-like seductions of the new mass media. For the other, the danger came from corrupt authorities, out to manipulate the citizen into believing the worst about the innocent and moral female sex.

The political allegory took on flesh in the cases of a number of women accused of murdering with poison -- an act of deception and homicide of decidedly lamia-like character. The Boston trial of Hannah Kinney -- whose third husband was found to have died from arsenic poisoning -- is both fascinating in itself and a striking corroboration of the author’s point about the lamia as a sort of template for public narrative. Early newspaper reports and gossip depicted her as a bewitching temptress of questionable morals and undoubted guilt. But as the trial continued, Democratic journalists described her as a pleasant, somewhat matronly woman whose late husband, mentally disturbed, had been trying to get over syphilis with the help of a shady “doctor.” (The arsenic in his system might well have gotten there through quackery or suicide.)

The jury acquitted her, which cannot have surprised the junior prosecuting attorney: “Recent experience has shown how difficult, if not impossible it has been to obtain a verdict of condemnation, in cases of alleged murders by secret poison, when females have been the parties accused, and men were the persons murdered.” By contrast, a number of British women accused of poisoning during the same period were dispatched to the gallows with haste. Factors besides "the lamia complex" may account for the difference, but the contrast is striking even so.

It’s unlikely that many Americans in the 1840s had read The Life of Apollonius of Tyana, or even heard of Keats, for that matter. Cultural influence need not be that direct to be effective; it can be transmitted on the lower frequencies through repurposed imagery and fragments of narrative, through relays and remixes. Perhaps that is what we’re seeing now -- with who knows what archetypes being mashed up on the political stage.


Researcher clashes with publisher over book chapter on open education resources

Psychology instructor withdraws book chapter after refusing to add language that he says a publisher demanded and that he deemed too flattering to the textbook industry.

Essay on Barbara Ehrenreich's 'Living With a Wild God'

Examples of atheist spiritual autobiography are not plentiful, although the idea is not as self-contradictory as it perhaps sounds. A quest story that ends without the grail being located or the ring destroyed may not satisfy most audiences, but it's a quest story even so.

The one example that comes to mind is Twelve Years in a Monastery (1897) by Joseph McCabe, an English Franciscan who spent much of his clerical career grappling with doubts that eventually won out. Twelve Years is framed mainly as a critique and exposé of clerical life, but its interest as a memoir comes in part from McCabe’s struggle to accept the moral and intellectual demands imposed by his growing skepticism. For all of its defects, monasticism offered a career in which McCabe’s talents were recognized and even, within ascetic limits, rewarded. Leaving it meant answering the call of a new vocation: He went on to write on an encyclopedic array of scientific, cultural and historical topics.

McCabe also became the translator and primary English spokesman for Ernst Haeckel, the German evolutionary theorist and advocate of pantheism, which seems to have squared easily enough with the lapsed monk’s atheism. (There may be more than a semantic difference between thinking of God and the universe as identical and believing there is no God, just universe. But if so, it is largely in the eye of the beholder.)

Barbara Ehrenreich’s background could not be more different from Joseph McCabe’s. In Living With a Wild God: A Nonbeliever’s Search for the Truth About Everything (Hachette/Twelve) she describes her working-class family as consisting of atheists, rationalists, and skeptics for at least a couple of generations back. “God is an extension of human personality,” she wrote in her journal as an adolescent, “brought into the world and enslaved as man’s glorifier.” McCabe would have had to do penance for such a thought; in Ehrenreich’s case, it was just dinner-table wisdom -- expressed with precocious verbal finesse, later honed into a sharp instrument of social criticism put to work in Nickel and Dimed, among other books.

Her memoir appeared two years ago, though I’ve only just read it, making this column more rumination than review. The usual blurb-phrases apply: it’s brilliantly written, thought-provoking, and often very funny, taking aphoristic turns that crystallize complex feelings into the fewest but most apt words. For example: “[I]f you’re not prepared to die when you’re almost 60, then I would say you’ve been falling down on your philosophical responsibilities as a grown-up human being.” Or: “As a child I had learned many things from my mother, like how to sew a buttonhole and scrub a grimy pot, but mostly I had learned that love and its expressions are entirely optional, even between a mother and child.”   

So, a recommended read. Additional plaudits for Living With a Wild God won’t count for much at this late date, while its literary ancestry might still be worth a thought. For it seems entirely possible, even likely, that Ehrenreich’s parents and grandparents in Butte, Montana, would have read McCabe -- “the trained athlete of disbelief,” as H.G. Wells called him, in recognition of McCabe’s countless books and pamphlets on science, progress and the benefits of godlessness. Many of them appeared in newsprint editions circulating widely in the United States during the first half of the 20th century, with one publisher reckoning he’d sold over 2.3 million booklets by McCabe between the 1920s and the 1940s.

The inner life that Ehrenreich depicts herself leading as a teenager certainly resembles the world of rationality and order that McCabe evoked -- and that her family took as a given: “[E]very visible, palpable object, every rock or grain of sand, is a clue to the larger mystery of how the universe is organized and put together -- a mystery that it was our job, as thinking beings, to solve.” Ehrenreich took up the challenge as an adolescent with particular rigor. Faced with the hard questions that death raises about the value and point of life, she began taking notes in the expectation of working out the answer: “I think it is best to start out with as few as possible things which you hold to be unquestionably true and start from there.”

The problem, as Descartes discovered and Ehrenreich soon did as well, is that “the unquestionably true” turns out to be vanishingly small. You are left with “I exist” and no path to any more inclusive certainty than that. Descartes eventually posits the existence of God, but arguably bends his own rules in doing so. Ehrenreich does not, and lands in the quicksand of solipsism.

The mature Ehrenreich can see how her younger self’s philosophical conflicts took shape in the midst of more mundane family problems involving alcoholism, career frustration and each parent’s set of inescapable disappointments. (After all, solipsism means never having to say you’re sorry.) But she also recognizes that the quandaries of her teenage prototype weren’t just symptoms: “Somehow, despite all the peculiarities of my gender, age, class, and family background, I had tapped into the centuries-old mainstream of Western philosophical inquiry, of old men asking over and over, one way or another, what’s really going on here?”

In pursuing answers that never quite hold together, she undergoes what sounds very much like the sort of crisis described by the saints and mystics of various traditions. First, there were repeated moments of being overwhelmed by the sheer strangeness and “there”-ness of the world itself. Then, in May 1959, a few months before leaving for college, she underwent a shattering and effectively inexpressible experience that left her earlier musings in ashes. No consciousness-altering chemicals were ingested beforehand; given the circumstances, it is easy to appreciate why the author spent the 1960s avoiding them:    

“[T]he world flamed into life,” she writes. “There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with ‘the All,’ as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, and one reason for the terrible wordlessness of the experience is that you cannot observe fire really closely without becoming part of it.”

This kind of thing could not be discussed without the risk of involuntary commitment, and Ehrenreich herself refers to a medical hypothesis suggesting that ecstatic states may result when “large numbers of neurons start firing in synchrony, until key parts of the brain are swept up in a single pattern of activity, an unstoppable cascade of electrical events, beginning at the cellular level and growing to encompass the entire terrain that we experience as ‘consciousness.’”

Ehrenreich did not take McCabe’s course in reverse -- going from atheism into the waiting arms of an established faith. For that matter, she remains more or less an agnostic, at least willing to consider the possible merits of a polytheistic cosmos. “My adolescent solipsism is incidental compared to the collective solipsism our species has embraced for the last few centuries in the name of modernity and rationality,” she writes, “a worldview in which there exists no consciousness or agency other than our own, where nonhuman animals are dumb mechanisms, driven by instinct, where all other deities or spirits have been eliminated in favor of the unapproachable God....” Whether a nonreligious mysticism can go beyond “modernity and rationality” without turning anti-modern and irrationalist is something we’ll take up in this column on a future occasion.


U of Cincinnati to Start University Press

The University of Cincinnati announced Thursday that it is starting a university press, which it said would focus on social justice and community engagement. The press plans to publish both print and e-books, and also to support creative works in digital media, web-based digital scholarship, multi-authored databases, library special collections and archives. The press will operate in and be supported by the university's library system.

Shimer Will Become Part of North Central College

North Central College is poised to acquire Shimer College under an agreement between the two Illinois institutions that leaders hope will make Shimer, a small four-year Great Books college, its own division within a larger institution.

Shimer, which has an enrollment of about 70, and North Central, an independent liberal arts and sciences college with a combined undergraduate and graduate enrollment of more than 2,900, announced on Thursday that they intend to strike a deal. The two institutions will now move forward with negotiating a final agreement, intended to close at the beginning of March 2017. If successful, the move would create a Shimer Great Books School reporting directly to North Central’s provost in time for the fall 2017 term, said Troy D. Hammond, North Central’s president. The goals are to expand the amenities Shimer can offer to students and to grow its size modestly, he said. But North Central wants to keep Shimer’s identity.

“If we weren’t going to do that, we wouldn’t be having the conversation,” Hammond said. “We recognize the strength and value of what’s unique about Shimer.”

The move toward an acquisition comes a decade after Shimer decided to relocate from Waukegan, Ill., to lease space at the Illinois Institute of Technology in Chicago. Shimer’s lease in Chicago was not expiring, said the college’s president, Susan E. Henking. But the college and its trustees wanted to find a strategy that would preserve it for the future, she said.

“There are some things you can do as a tenant and some things you can do as a kind of partner to another institution,” Henking said. “We have a mission that says we should be small, but that’s a challenge in today’s environment. If we want to keep with our mission of very small classes and the kind of core curriculum we do, we’ve got to find a different structure.”

Shimer faculty would become faculty at North Central, and its students would become North Central students -- a move the colleges said would give them access to North Central’s activities, arts and athletics.

University of Florida, Elsevier explore interoperability in the publishing space

U of Florida connects its institutional repository to Elsevier's ScienceDirect platform to try to increase the visibility of the university's intellectual work.

Review of Terry Eagleton's "Culture"

If ideas are tools -- “equipment for living,” so to speak -- we might well imagine culture as a heavily patched-up conceptual backpack that has been around the world a few times. It has been roughly handled along the way.

The stitches strain from the sheer quantity and variety of stuff crammed into it over the years: global culture, national culture, high culture, popular culture, classical and print and digital cultures, sub- and countercultures, along with cultures of violence, of affluence, of entitlement, of critical discourse …. It’s all in there, and much else besides. How it all fits -- what the common denominator might be -- is anyone’s guess. We could always draw on the useful clarifying distinction between (1) culture as a category of more or less aesthetic artifacts, perhaps especially those that end up in museums and libraries, and (2) culture as the shared elements of a way of life.

The difference is, in principle, one of kind, not of quality, although assumptions about value assert themselves anyway. The first variety is sometimes called “the Matthew Arnold idea of culture,” after that Victorian worthy’s reference, in his book Culture and Anarchy, to “the best which has been thought and said.” Presumably music and painting also count, but Arnold’s emphasis on verbal expression is no accident: culture in his use of the term implies literacy.

By contrast “culture in the anthropological sense” -- as the second category is often called -- subsumes a good deal that can be found in societies without writing: beliefs about the nature of the world, ways of dressing, gender roles, assumptions about what may be eaten and what must be avoided, how emotions are expressed (or not expressed) and so on. Culture understood as a way of life includes rules and ideas that are highly complex though not necessarily transmitted through formal education. You absorb culture by osmosis, often through being born into it, and much of it goes without saying. (This raises the question of whether animals such as primates or dolphins may be said to have cultures. If not, why not? But that means digging through a whole other backpack.)

The dichotomy isn’t airtight, by any means, but it has served in recent years as a convenient pedagogical starting point: a way to get students (among others) to think about the strange ubiquity and ambiguity of culture as a label we stick on almost everything from the Code of Hammurabi to PlayStation 4, while also using it to explain quite a bit. Two people with a common background will conclude a discussion of the puzzling beliefs or behavior of a third party by agreeing, “That’s just part of their culture.” This seems more of a shrug than an explanation, really, but it implies that there isn’t much more to say.

One way to think of Terry Eagleton’s new book, Culture (Yale University Press), is as a broad catalog of the stuff that comes out when you begin unpacking the concept in its title -- arranging the contents along a spectrum rather than sorting them into two piles. In doing so, Eagleton, a distinguished professor of English literature at Lancaster University, follows closely the line of thought opened by the novelist and critic Raymond Williams, who popularized the expression “culture as a whole way of life.” Williams probably derived the concept in turn, not from the anthropologists, but from T. S. Eliot. In distinguishing culture as ordinary (“culture is ordinary” being another famous Williams phrase) from culture as the work that artists, writers, etc. produce, his entire point was to link them: to provoke interest in how life and art communicated, so to speak.

For Williams, the operative word in “culture as a whole way of life” was, arguably, “whole”: something integral, connected and coherent, but also something that could be shattered or violated. Here, too, Eagleton is unmistakably Williams’s student. His assessment of how ideas about culture have taken shape over the past 200 years finds in them a pattern of responses to both industrialism (along with its spoiled heir, consumerism) and the French revolution (the definitive instance of “a whole way of life” exploding, or imploding, under its own strains). “If it is the cement of the social formation,” Eagleton writes, culture “is also its potential point of fracture.”

It may be that I am overemphasizing how closely Eagleton follows Williams. If so, it is still a necessary corrective to the way Williams has slowly turned into just another name in the Cultural Studies Hall of Fame rather than a felt moral and intellectual influence. His emphasis on culture as “a whole way of life” -- expressed with unabashed love and grief for the solidarity and community he knew when growing up in a Welsh mining community -- would sound remarkably anachronistic (if not ideologically totalizing and nostalgically uncritical) to anyone whose cultural reference points are of today’s commodified, virtual and transnational varieties.

And to that extent, Eagleton’s general survey of ideas about culture comes to a sharp point -- aimed directly at how the concept functions now in a capitalist society that, he says, “relegates whole swaths of its citizenry to the scrap heap, but is exquisitely sensitive about not offending their beliefs.”

He continues, in a vein that Williams would have appreciated: “Culturally speaking, we are all to be granted equal respect, while economically speaking the gap between the clients of food banks and the clients of merchant banks looms ever larger. The cult of inclusivity helps to mask these material differences. The right to dress, worship or make love as one wishes is revered, while the right to a decent wage is denied. Culture acknowledges no hierarchies, but the educational system is riddled with them.” This may explain why culture is looking so raggedy and overburdened as a term. Pulled too tight, stretched too thin, it covers too many things that would be difficult to face straight on.


Northern Illinois U Press fights to survive after being deemed 'nonessential'

Supporters of academic publishing worry about what Northern Illinois U may decide about a small press that punches above its weight in scholarship.

Review of Christina Crosby, "A Body, Undone: Living On After Great Pain (A Memoir)"

Somewhere along the way, Nietzsche’s apothegm “That which does not destroy me makes me stronger” lost all the irony and ambiguity it had in context and turned into an edifying sentiment -- a motivational catchphrase, even, on the order of that poster of a cat clinging to a tree branch over the caption “Hang in there, baby!”

“Destroy” is often rendered “kill,” giving it a noirish and edgy sound. Either way, the phrase is uplifting if and only if understood figuratively, as a statement about mental resilience. For when taken literally, it is barely even half true, as a moment’s reflection reveals. A life-threatening virus can make us stronger -- i.e., immune to it in the future -- but a bullet to the brain never will. That truth would not have been lost on Nietzsche, who understood philosophy as a mode of psychology and both as rooted in physiology.

He expected the reader not just to absorb a thought but to test it, to fill in its outlines and pursue its implications -- including, I think, a contradictory thought: Whatever does not kill me might very well leave me wishing it had.

While riding her newly repaired bicycle early in the fall semester of 2003 -- pumping the pedals hard, with the strong legs of someone just entering her 50s and determined not to feel it -- Christina Crosby, a professor of English and feminist, gender and sexuality studies at Wesleyan University, got a branch caught in the front wheel. She went flying from her seat, landing on the pavement chin first and fracturing two vertebrae in her neck. The broken bone scraped her spinal cord. One indication of how fast it all happened is that the reflexes that would have broken her fall never kicked in: her hands were not damaged at all.

“Serious neurological damage started instantly,” Crosby writes in A Body, Undone: Living On After Great Pain (NYU Press); “blood engorged the affected site, and the tissue around the lesion began to swell, causing more and more damage as the cord pressed against the broken vertebrae. I also smashed my chin into tiny pieces, tore open my lips, slashed open my nose, breaking the cartilage, and multiply fractured the orbital bone underneath my right eye.” She had been wearing wire-frame glasses, and the force of the impact drove the metal into the bridge of her nose.

Crosby spent three weeks in the hospital, unconscious in intensive care for most of it, and only found out later, from her lover, Janet, that the neurosurgeons and plastic surgeons “debated who should go first.” The plastic surgeons won. It sounds as if they had proportionately the more hopeful and effective job to do -- piecing together her chin from fragments, reconstructing her mouth, removing the eyeglass frames from her flesh and leaving only a half-moon scar.

The neurological damage was much more extensive and included both paralysis and a loss of sensation from the neck down. In time, Crosby regained limited use of her hands and arms and could begin to overcome the extreme (and dangerous) atrophy that set in following the accident. She was able to return to teaching part time at Wesleyan in 2005.

The author refers to herself dictating the memoir, but it reads very much like a piece of writing -- that is, like something composed in large part through revision, through grappling with the enormous problem of communicating sequences of experience and thought that few readers will have shared. The accident occurred relatively late in her life and without warning; the contrast between her existence before and after the catastrophic event is made even starker by the fact that she cannot remember it happening. “My sense of a coherent self,” she writes, “has been so deeply affronted” that the book in large measure serves as a way to try to put the fragments back together again without minimizing the depth of the chasm she has crossed.

“You become who you are,” Crosby writes, “over the course of a life that unfolds as an ongoing interaction with objects and others, from the infant you once were, whose bodily cartography slowly emerged as you were handled by caregivers whose speech washed over you, to the grown-up you are today, drawn beyond reason to one person rather than another.”

On that last point she has been extraordinarily fortunate in the person she found herself drawn to: the bond she shares with Janet seems like a rope across the abyss, or more like a steel cable, perhaps. (I call her by her first name simply because the author does. The view from Janet R. Jakobsen’s side of things may be read in a thoughtful essay from 2007.) At the same time, A Body, Undone is anything but sentimental about the possibilities of growth and healing. As doctors lowered the dosage of Crosby’s painkillers, new forces imposed themselves on her daily life:

“I feel an unassuageable loneliness, because I will never be able to adequately describe the pain I suffer, nor can anyone accompany me into the realm of pain …. Pain is so singular that it evades direct description, so isolating because in your body alone. Crying, and screaming, and raging against pain are the signs of language undone. … I have no exact account of how pain changes my interaction with my students and my colleagues, but I know there are times when I don’t feel fully present. It’s not that the pain is so bad that it demands all my attention, but rather that it’s so chronic as to act like a kind of screen.”

No pseudo-Nietzschean bromides to be had here. There is also the difficult new relationship with one’s bowels when they cease to be under any control by the brain -- the discovery of a whole terra incognita beyond ordinary feelings of awkwardness and embarrassment. Crosby discusses her reality with a candor that must be experienced to be believed. And the reader is left to face the truth that one’s embodiment (and the world that goes with it) can change utterly and forever, in a heartbeat.


Author discusses new book about class inequality at an elite university

