Standing in line at the drugstore a couple of weeks ago, I spied on the magazine rack nearby this month’s issue of National Geographic – conspicuous as one of the few titles without a celebrity on the cover. Instead it showed a photograph of an infant beneath a headline saying "This Baby Will Live to Be 120."
The editors must have expected disbelief, because there was a footnote to the headline insisting that the claim was not hype: "New science could lead to very long lives." When was the last time you saw a footnote in a popular periodical, on the cover, no less? It seemed worth a look, particularly after the septuagenarian in front of me had opened complex, in-depth negotiations with the pharmacist.
The headline, one learns from a comment on the table of contents, alludes to a traditional Jewish birthday wish or blessing: "May you live to be 120." This was the age that Moses was said to have reached when he died. The same figure appears -- not so coincidentally perhaps -- at an important moment in the book of Genesis. Before sending the Flood, Jehovah announces that man’s lifespan will henceforth peak at 120 years. (I take it there was a grandfather clause for Noah. When the waters recede, he lives another 350 years.)
The cap on longevity, like the deluge itself, is ultimately mankind’s own fault, given our tendency to impose too much on the Almighty’s patience and good humor. He declares, in so many words, that there is a limit to how much He must endure from any single one of us. Various translations make the point more or less forcefully, but that’s the gist of it. Even 120 years proved too generous an offer – one quietly retracted later, it seems. Hence the Psalmist’s lament:
“The days of our years are threescore years and ten; and if by reason of strength they be fourscore years, yet is their strength labor and sorrow; for it is soon cut off, and we fly away.”
Nursing homes are full of people who passed the fourscore marker a while ago. If you visit such places very often, as I have lately, “May you live to be 120” probably sounds more like a curse than a blessing. Not even a funeral obliges more awareness of mortal frailty. There is more to life than staving off death. The prospect of being stranded somewhere in between for 30 or 40 years is enough to make an atheist believe in hell.
Meanwhile, in science…. The medical and biological research surveyed in that NatGeo article promises to do more than drag out the flesh’s “labor and sorrow” a lot longer. The baby on the magazine cover will live his or her allotted span of six score years with an alert mind, in a reasonably healthy body. Our genetic inheritance plays a huge but not absolutely determinate role in how long we live. In the wake of the mapping of the genome, it could be possible to tinker with the mechanisms that accelerate or delay the aging process. It may not be the elixir of youth, but close enough.
Besides treating the same research in greater depth, Ted Anton’s The Longevity Seekers: Science, Business, and the Fountain of Youth (University of Chicago Press) emphasizes how profound a change longevity research has already wrought. It means no longer taking for granted the status of aging as an inescapable, biologically hardwired, and fundamentally irreversible process of general decline. Challenging the stereotypes and prejudices about the elderly has been a difficult process, but longevity engineering would transform the whole terrain of what aging itself entails.
Anton, a professor of English at DePaul University, tells the story in two grand phases. The first bears some resemblance to James Watson’s memoir The Double Helix, which recounts the twists and turns of laboratory research in the struggle to determine the structure of DNA – work for which he and Francis Crick received a Nobel Prize in medicine in 1962. Watson’s book is particularly memorable for revealing science as an enterprise in which personalities and ambitions clash as much as theories ever do. (And with far more rancor, as Watson himself demonstrated in the book’s vicious and petty treatment of Rosalind Franklin, a crystallographer whose contribution he downplayed as much as possible.)
A practitioner of long-form journalism rather than a longevity researcher, Anton writes about conflicts in the field with some detachment, even while remaining aware that the discoveries may change life in ways we can’t yet picture. The initial phase of the research he describes consisted largely of experiments with yeast cells and microscopic worms conducted in the 1990s. Both are short-lived, meaning that the impact of biochemical adjustments to their genetic “thermostats” for longevity would register quickly.
During the second phase of Anton’s narrative, lab research involved more complex organisms. But that was not the most important development. The public began hearing news flashes that scientists had discovered that the key to a longer life was, say, restricted caloric intake, or a chemical called resveratrol found in red wine. Findings presented in scientific journals were reported on morning news programs, or endorsed on Oprah, within days or even hours of publication. Hypotheses became hype overnight.
This generated enthusiasm (more for drinking red wine than restricting calories, if memory serves) as well as additional confidence that biotechnological breakthroughs were on the way. Everybody in longevity research, or almost everybody, started a company and ran around looking for venture capital. Models, evidence, and ideas turned into proprietary information -- with the hurry to get one’s findings into professional journals looking more and more like the rush to issue a press release.
So far, no pharmaceutical has arrived on the market to boost our lifespans as dramatically as those of the worms and yeast cells in the laboratory. “The dustbin of medical breakthroughs,” Anton reminds us, “bears the label ‘It Worked in Mice.’ ” On the other hand, the research has been a boon to the cosmetics industry.
As it is, we’re nowhere near ready to deal with the cumulative effect of all the life-extending medical developments from the past few decades. The number of centenarians in the world “is expected to increase tenfold between 2010 and 2050,” the author notes, “and the number of older poor, the majority of them women,” is predicted “to go from 342 million today to 1.2 billion by that same year.”
But progress is ruthless about doing things on its own terms. Biotech is still in its infancy, and its future course -- much less its side effects -- is beyond imagining. The baby on the magazine cover might well live to see the first centenarian win an Olympic medal. I wish that prospect were more cheering than it is.
Undergraduate students should join professors in selecting the content of courses taught in the humanities.
This is the conclusion I came to after teaching Humanities on Demand: Narratives Gone Viral, a pilot course at Duke University that not only introduced students to some of the critical modes humanists employ to analyze new media artifacts, but also tested the viability of a new, interactive course design. One semester prior to the beginning of class, we asked 6,500 undergraduates -- in other words, Duke’s entire undergraduate student body -- to go online and submit materials they believed warranted examination in the course.
Submissions could be made regardless of whether a student planned on enrolling in the course. In response, hundreds of students from a variety of academic disciplines, including engineering, political science, religion, foreign languages, anthropology, public policy and computer science, submitted content for the class.
This interactive approach, which I call Epic Course Design (ECD) after German playwright Bertolt Brecht’s theory of epic theater, represents a radical break with traditional course-building techniques. Generally, humanities instructors unilaterally choose the content of their syllabuses -- and rightly so. After all, we are the experts. But this solitary method of course construction does not reflect how humanists often actually teach.
Far from being viewed as passive receptacles of instructional data, humanities students are often engaged as active contributors. With this in mind, ECD offers a student-centered alternative to traditional course-building methods. Importantly, ECD does not allow students to dictate the content of a course; it invites them to contribute, with the instructor ultimately deciding which (if any) student-generated submissions merit inclusion on the syllabus.
Nevertheless, when a colleague of mine first heard about my plans to allow students to determine what was to be examined in Narratives Gone Viral, he was deeply skeptical: "But students don’t know what they don’t know," he objected. In my view, that is not a problem -- that is the point; or at least part of it. For crowdsourcing the curriculum not only invites students to submit material they are interested in, but also invites them to choose material they believe they already understand. Student-generated submissions for Narratives Gone Viral included popular YouTube videos like "He-Man sings 4 Non Blondes," "Inmates Perform Thriller" and "Miss Teen USA 2007 - South Carolina answers a Question." While my students were already exceedingly familiar with these videos, they clearly didn’t always see what was at stake in them.
All of these works are worthy of academic scrutiny: the "He-Man" piece is interesting because it confronts preconceived notions of masculinity; "Inmates Perform Thriller" prompts questions of accessibility to social media; "Miss Teen USA" is notable because it reveals how viral videos often appeal to a viewer’s desire to feel superior to others.
I am not proposing that all humanities courses should integrate this approach. What I am suggesting, however, is that ECD represents a viable alternative to more familiar course-building methodologies. This includes classes that do not focus on social media and/or popular culture. Importantly, whether students will be interested in suggesting texts for, say, a course on medieval German literature is not the crucial question; in my view, the crucial question is: Why should we refrain from offering motivated students the opportunity to do so, if they wish?
There was relatively little repetition in student submissions for Narratives Gone Viral, an indication that students were reviewing posts made by their peers, weighing their options, and responding with alternative suggestions.
To put a finer point on the matter, students were not merely submitting course content: they were discussing the content of a course that -- in every traditional sense -- had yet to even begin.
Michael P. Ryan is a visiting assistant professor of German studies and the American Council of Learned Societies new faculty fellow at Duke University.
In "Howl," a blistering poetical rant and perhaps the most important poem of the ’60s counterculture, Allen Ginsberg anatomizes the minds of his generation. They are young men and women who "studied Plotinus Poe St. John of the Cross telepathy and bop kabbalah because the cosmos instinctively vibrated at their feet in Kansas." When students come to our offices to consider studying the humanities, we can all recite the litany of reasons for doing so. It provides them with the critical thinking skills needed for success in any career; it endows them with the cultural capital of the world’s great civilizations; and it helps them explore what it means to be human.
But for those of us who have spent our lives studying the humanities, such reasons are often just the fossilized remains of the initial impulse that set us on our educational journey -- the feeling that Kansas was vibrating at our feet, and that to chart our futures we desperately needed to understand the meaning of that vibration.
The main challenge for the humanities teacher has always been to show how the great works of philosophy, literature, religion, history, and art answer to the good vibrations in our young people. But at the dawn of the 21st century the academic scaffolding of the humanities thwarts this fundamental goal. The central problem is that the Harvard University model of humanistic study dominates academia.
The Harvard model sees the humanities as a set of distinct and extensively subdivided disciplines, overseen by hyper-specialized scholars who produce disciplinary monographs of extraordinary intellectual subtlety and technical expertise. Though the abstruse work produced with this model periodically makes it the butt of media jokes, no one with an appreciation for good scholarship would want to eliminate the rigorous discipline represented by the work of scholars at Harvard and institutions like it. But neither should it be allowed to dominate the agenda of all higher education, which it now incontestably does, to the detriment of both the humanities and the students who want to understand the meaning of their unique vibration.
The disciplining of knowledge was central to the creation of the modern research university. In the second half of the 19th century, Harvard and then schools across the academic landscape dropped their common curriculum, creating instead departments and majors. Beginning with the natural sciences of physics, chemistry, and biology, this flowering of disciplines issued in countless discoveries and insights with repercussions far beyond the university. Flush with this success, the triumph of knowledge production, together with the 19th-century scientific methodology that was its seed, spread to the examination of society. The newly invented social sciences -- economics, sociology, anthropology and the like -- grabbed hold of the explosive new problems that followed in the wake of modern industrial life. But at the same time they marginalized the traditional questions posed in the humanities. The social sciences raised "humanistic" questions within the strictures of 19th-century positivist assumptions about scientific "objectivity," and they have been doing so, despite post-modern blows dealt to claims of objectivity, ever since.
As the natural and social sciences divided the world between themselves the humanities threatened to become a mere leftover, a rump of general reflections and insights that lacked the rigor of the special sciences. Eager to be properly scientific themselves, and thereby forestall such a humiliating fate, the humanities disciplined themselves. They sought to emulate the success of the sciences by narrowing their intellectual scope, dividing and subdividing their disciplines into smaller and ever smaller scholarly domains, and turning themselves into experts.
The norm became the creation of inward-looking groups of experts who applied a variety of analytic approaches to sets of increasingly technical problems. In short, the humanities found themselves squeezed by the demands for professionalization and disciplinization, the need to become another regional area of study analogous in form, if not in content, to the other special sciences. And the humanities have been content to play this disciplinary game ever since.
In the last 30 years, the rise of Theory promised to breathe a new, post-modern life into this disciplinary game. By the mid-20th century, the sterility of old-fashioned explication de texte was becoming apparent. The linguistic turn opened up a new way for the humanists to ape the rigor of the sciences while simultaneously extending their scholarly turf. In their zeal for technical rigor, they discovered to their delight that texts marvelously shift shape depending upon the theoretical language used in their analyses. Into the moribund body of the humanities flowed the European elixirs of psychoanalysis, phenomenology and hermeneutics, structuralism and post-structuralism, all of which boasted technical vocabularies that would make a quantum physicist blush. With these languages borrowed from other disciplines, the great books of the Western tradition looked fresh and sexy, and whole new fields of scholarship opened up overnight.
At the same moment, however, scholars of the humanities outside the graduate departments of elite universities suddenly found themselves under-serving their students. For the impulse that drives young people to the humanities is not essentially scholarly. The cult of expertise inevitably muffles the jazzy, beating heart of the humanities, and the students who come to the university to understand their great vibration return home unsatisfied. Or worse, they turn into scholars themselves, funneling what was an enormous intellectual curiosity through the pinhole of a respectable scholarly specialty.
Indeed, their good vibrations fade into a barely discernible note, a song they recall only with jaded irony, a sophisticated laugh at the naiveté of their former selves, as if to go to school to learn the meaning of their own lives were an embarrassing youthful enthusiasm. The triumph of irony among graduate students in the humanities, part of the déformation professionnelle characteristic of the Harvard virus, exposes just how far the humanities have fallen from their original state. As they were originally conceived, the humanities squirm within the research paradigm and disciplinary boxes at the heart of the Harvard model.
The term "humanities" predates the age of disciplinary knowledge. In the Renaissance, the studia humanitatis formed part of the attempt to reclaim classical learning, to serve the end of living a rich, cultivated life. Whether they were contemplative like Petrarch or engaged like Bruni, Renaissance humanists devoted themselves to the study of grammar, rhetoric, logic, history, literature, and moral philosophy, not simply as scholars, but as part of the project of becoming a more complete human being.
Today, however, the humanities remain entrenched in an outmoded disciplinary ideology, wedded to an academic model that makes it difficult to discharge this fundamental obligation to the human spirit. Despite the threat of the Great Recession, the rise of the for-profit university, and a renewed push for utility, the humanities continue to indulge their fetish of expertise and drive students away. Some advocate going digital, for using the newest techno and cyber techniques to improve traditional scholarly tasks, like data-mining Shakespeare. Others turn to the latest discoveries in evolutionary psychology to rejuvenate the ancient texts. But both of these moves are inward looking — humanists going out into the world, only to return to the dusty practices that have led the humanities to their current cul-de-sac. In so doing, colleges and universities across the country continue to follow the Harvard model: specialize, seek expertise, and turn inward.
When Descartes and Plotinus and Poe and St. John of the Cross created their works of genius, they were responding not to the scholar’s task of organizing and arranging, interpreting and evaluating the great works of the humanistic tradition, but rather to their own Kansas. Descartes and Rousseau were latter-day Kerouacs, wandering Europe in search of their souls. These men and women produced their works of genius through a vibrant, vibrating attunement to the needs of their time.
The Humanities! The very name should call up something wild. From the moment Socrates started wandering the Greek market and driving Athenian aristocrats to their wits’ end, their place has always been out in the world, making connections between the business of living and the higher reaches of one’s own thought, and drawing out implications from all that life has to offer. The genius of the humanities lies in the errant thought, the wild supposition, the provocation -- in Ginsberg’s howl at society. What this motley collection of disciplines is missing is an appreciation of the fact that the humanities have always been undisciplined, that they are essentially non-disciplinary in nature. And if we want to save them, they have to be de-disciplined and de-professionalized.
De-disciplining the humanities would transform both the classroom and the curriculum. Disengaging from the Harvard model would first and foremost help us question the assumption that a scholarly expert in a particular discipline is the person best suited to teaching the subject. The quality that makes a great scholar — the breadth and depth of learning in a particular, narrow field — does not make a great teacher; hungry students demand much more than knowledge. While the specialist is hemming himself in with qualifications and complications, the broadly-educated generalist zeros in on the vital nub, the living heart of a subject that drives students to study.
While a scholarly specialist is lecturing on the ins and outs of Frost’s irony, the student sweats out his future, torn between embracing his parents’ dream of having a doctor in the family or taking the road less traveled and becoming a poet. The Harvard model puts great scholars in charge of classrooms that should be dominated by great teachers. And if the parents who are shelling out the price of a contemporary college education knew their dollars were funding such scholarly hobbyhorses, they would howl in protest.
De-disciplining the humanities would also fundamentally change the nature of graduate and undergraduate education. At the University of North Texas Department of Philosophy and Religious Studies, located in the Dallas Metroplex, we are training our graduate students to work with those outside their discipline — with scientists, engineers, and policy makers — to address some of the most pressing environmental problems the country faces. We call it field philosophy: taking philosophy out into the world to hammer out solutions to highly complex and pressing social, political, and economic problems. Graduate students participate in National Science Foundation grants and practice the delicate skill of integrating philosophic insights into public policy debates, often in a "just-in-time" manner. In class they learn how to frame and reframe their philosophical insights into a variety of rhetorical formats, for different social, political, economic purposes, audiences and time constraints.
At Calumet College of St. Joseph, an urban, Roman Catholic commuter college south of Chicago that serves underprepared, working-class Hispanic, African-American, and Anglo students, we are throwing the humanities into the fight for social justice. Here the humanities are taught with an eye toward creating not a new generation of scholars, but a generation of humanely educated citizens working to create a just society. At Calumet, students are required to take a social justice class.
In it they learn the historical and intellectual roots of Catholic social justice teaching within the context of performing ten hours of community service learning. They work in a variety of social service fields (serving children, the elderly, the homeless, and others), which exposes them to the real-life, street-level experience of social challenges. Before, during, and after, students bring this experience back to the classroom to deepen it through reflective papers and class discussion.
High-level humanistic scholarship will always have a place within the academy. But to limit the humanities to the Harvard model, to make scholarship rather than, say, public policy or social justice, the highest ideal of humanistic study, is to betray the soul of the humanities. To study the humanities, our students must learn textual skills, the scholarly operations of reading texts closely, with some interpretive subtlety. But the humanities are much more than a language game played by academic careerists.
Ultimately, the self-cultivation at the heart of the humanities aims to develop the culture at large. Unless they end up where they began -- in the marketplace, alongside Socrates, questioning, goading, educating, and improving citizens -- the humanities have aborted their mission. Today, that mission means finding teachers who have resisted the siren call of specialization and training undergraduate and graduate students in the humanities in the art of politics.
The humanist possesses the broad intellectual training needed to contextualize social problems, bring knowledge to bear on social injustice, and translate disciplinary insights across disciplines. In doing so, the humanist helps hold together an increasingly disparate and specialized society. The scholasticism of the contemporary academy is anathema to this higher calling of the humanities.
We are not all Harvard, nor should we want to be.
Chris Buczinsky is head of the English program at Calumet College of St. Joseph in Whiting, Indiana. Robert Frodeman is professor of philosophy and director of the Center for the Study of Interdisciplinarity at the University of North Texas.
Books abound about student disengagement. We read about students’ apathy and indifference to the world around them. Data, sadly, support these claims. Youth voting rates are low, especially when President Obama isn’t on the ballot, and while there is some partaking in community activities, critics have noted that some of this engagement is the product of high schools "mandating" volunteerism as a graduation requirement.
My experiences – both as a political scientist and as a dean of the school of liberal arts at the Savannah College of Art and Design – suggest that we administrators and professors doth protest too much. Give our students a compelling text and topic, and they will engage.
I recently visited a philosophy class in which Plato’s Republic was assigned. The students were tackling Book Six, where questions spill off the pages about who should rule, and what qualities make for a viable ruler. Can a "rational" person, removed from impulses and passions, command and lead? How can, or should one remove oneself from temptation and emotion? Can the rational and emotive be separated? Do citizens trust those who are like them? How much of leading and governing is about the rational, and how much is about appearances and images?
As the professor and I raised these questions, I noticed immediately that the students had done the reading. We administrators read about how today’s students do not read. But these students – all of whom were non-liberal arts majors – had immersed themselves in the text. They were quoting passages and displaying keen interest, both in the text itself and the questions that were being raised. It is not surprising that Plato enlivened the classroom. But these future artists and designers recognized the power of the text. They appreciated how the words had meaning, and the questions were worth exploring.
Second, this experience, and others like it, gave me pause. We administrators may need to tweak our conceptions of our students. Sure, Academically Adrift is an important book, and yes, the data show that the degree of reading comprehension has declined. But we should not misconstrue that data as tantamount to disengagement, nor should we assign fewer readings simply because there are data that show many students do not complete reading assignments. This recommendation – of assigning less reading and teaching it in greater depth – was one of the suggestions made by José Antonio Bowen, author of Teaching Naked, in his dynamic and imaginative keynote address at this year’s annual meeting of the Association of American Colleges and Universities.
The point here is not to debate Bowen’s recommendation – that is for another time and place. Similarly, I am well aware that this experience in Philosophy 101 may be unique, and is dubiously generalizable. (I should add that encountering students who are excited about discussing big ideas also occurs in other classrooms -- photography and art history, for example, that I have visited as well.)
This enthusiasm is not a recipe for assigning Plato in every class, although that is an idea that most definitely would generate discussion. That written, I believe that we should reconsider how we administrators and educators think about student engagement. It is more than knowledge about civics and current events. It is bigger and deeper than service learning, or a passion to work in one’s community.
Provide students with a compelling text and a professor who knows how to raise thought-provoking questions, and students will ponder, debate and imagine the world in new and different ways. They will learn how to think critically and creatively. Cultivating that form of student engagement is no easy task, but it begins by exposing students to great texts and great ideas. Engagement is more than a form of political participation. It is the core of the liberal arts.
Robert M. Eisinger is dean of the School of Liberal Arts at the Savannah College of Art and Design.
When I was young I trained as an actor and as a reader of poetry, particularly metered verse. I’m accustomed to delivering keynotes and making other kinds of presentations. I also have experience as a singer — I was in a reggae band in Britain when I was young, though perhaps this isn’t the best testament to my abilities. As soon as I left, the band had two top-20 hits in a row.
I felt passionate about narrating this book because it is not only an analysis of such things as the vulnerability of the education system and the easy accessibility of guns, it is also a deeply personal account of my experience before, during, and after the rampage attack on the campus. Though I reported the English department’s concerns about Seung-Hui Cho to many units on campus, and though he did eventually seek help from campus counseling, we still experienced at his hands the worst college shooting in history. I wrote the book because it seemed inevitable that other attacks would occur, especially if we as a nation didn’t learn from the errors and missteps of the past.
It was my responsibility to utter the words I had written. Honest, open communication is the only meaningful gift we have to give to those who lost loved ones in the attack. At the end of the book I apologize to the victims' families for not being able to prevent this horror. How could someone say these words on my behalf? It’s not the kind of responsibility you can delegate.
It wasn’t until it was confirmed I could narrate the book that I realized I didn’t know if I could do it. Although I had read excerpts from it during readings and keynotes, reading the book aloud all the way through was a different matter altogether. What would I do when I reached the part about how I learned the perpetrator was an English major with whom I had worked? How would I get through the chapter "A Boy Named Loser" ("Loser" was Cho’s own name for himself) — a chapter in which I describe Cho’s agonized, menacing silence as he sat in my office, wearing his reflective sunglasses indoors?
But there was no point in focusing on what ifs. Best take the bull by the horns, my late mother would have said, her remembered voice always a source of consolation. We would be recording for six hours a day for about a week. In readiness, I bought honey-and-lemon throat lozenges and made a flask of strong mint tea. To warm up my voice on the 10-minute drive to the studio, I sang songs from The Sound of Music: "Edelweiss," "My Favorite Things" and "I Have Confidence in Me" — which I didn’t. Nevertheless, I sang with gusto, trusting in the power of Rodgers and Hammerstein and the fact that my car was soundproof.
I must have looked more eccentric than usual as I drove along Blacksburg’s winding country roads doing an impersonation of Fraulein Maria. But it made the process less daunting — as if I were still the homely-looking Anglo-Jamaican girl who used to belt out songs like "Singin’ in the Rain" as she trudged through a downpour on the way home from convent school; as if I were still the person I was before tragedy almost felled me like a tree, and, for a time at least, robbed me of the ability to sing at all.
Originally, I was supposed to travel to Maryland or DC to do the recording — a four-and-a-half-hour drive from Blacksburg. It would mean staying in a hotel far from the comforts of home. I didn’t look forward to it. But Bruce Kitovich, the producer Audible assigned me, went out of his way to find a studio here in Blacksburg. It was a thoughtful thing to do, and it allowed me to meet Earl Norris, musician, owner, and operator of Four Loud Barks studio. As it turned out, Earl’s wife and my husband knew each other from way back. As soon as I entered Earl’s studio I felt at ease.
According to Bruce, it’s not as unusual as it used to be to have authors read their own work. Not surprisingly, perhaps, this is especially true for memoir. If the recording is straightforward — and it often is for nonfiction — Audible’s in-house team checks the finished recording for quality control and decides which sentences need to be re-recorded. It’s a relatively speedy and efficient process that takes a matter of weeks.
We began recording the book on Sunday, January 13, a mere three weeks after the heartbreaking attack at Sandy Hook Elementary in Connecticut. Early the next morning, my husband’s mother died. Mama Edna was a lovely woman — the kind of mother-in-law you hope you will be blessed with. Now I was even more concerned that I wouldn’t be able to get through the recording without becoming emotional. Surprisingly, however, the process was one of the most calming experiences I have ever had.
Something strange happens when you record your own book. The relationship you have to your own words shifts and alters. You deliver a sentence in a particular way, stumble, then reread it with a completely different emphasis, one that can catch you by surprise. You hadn’t realized that was what you meant when you wrote the sentence, but it’s suddenly clear that of course that’s what you were trying to say. You are speaker and auditor, author and interpreter. You hear words anew.
I sat alone in Earl’s room-sized studio, a ribbon microphone a few inches from my mouth. I had been experiencing severe back pain for several weeks, so I sat in a chair with my elbows propped up by fat green pillows. This brought the book closer to me and meant I didn’t have to bear its physical weight while I read. Through the headphones my voice came back to me as not quite mine, as if someone else — a close relative, my mother, perhaps? — were speaking. It’s a strange sensation. Having done many radio interviews, I was accustomed to the aural intimacy of exceptionally sensitive headphones, but it was different this time. I was able to read the book out loud all the way through because it was a disembodied voice doing the reading — a projection and personification of sorts.
It occurs to me now that this process is much like the writing process we engage in as poets and novelists. When we teach creative writing, we talk about finding our voices, not simply because we want to assert our own identities, but because the voice is the guide leading us to the next place. It finds us when we’re lost and puts us back on the path towards revelation. Or at least we hope it does. Though I was in the Four Loud Barks studio on the other side of Earl’s garage while Earl was in the basement of his house several rooms away, I wasn’t alone. Not only did I have my own voice to keep me company, I had Earl’s voice, too, coming through the headphones. He listened intently to what I read and made sure it sounded O.K.
Today — April 16, 2013 — marks the sixth anniversary of the Virginia Tech shootings. In interviews I am often asked whether I have been able to move on from what happened. I try to explain that you don’t move on completely from calamities like this. What you can do with the help of friends and loved ones, however, is find a way to reconcile yourself to what has happened — or maybe, if you’re lucky, a way back to laughter again. In my case, I had to find a path toward forgiveness not only of the perpetrator but also of myself for not being able to prevent such a terrible tragedy from occurring in my beloved community. It is a long, arduous journey — one I have to admit I am still on. I am grateful for the voices that accompany us, grateful that they serve to remind us of the world’s steadfast, indestructible beauty.