When I was young I trained as an actor and as a reader of poetry, particularly metered verse. I’m accustomed to delivering keynotes and making other kinds of presentations. I also have experience as a singer — I was in a reggae band in Britain when I was young, though perhaps this isn’t the best testament to my abilities. As soon as I left, the band had two top-20 hits in a row.
I felt passionate about narrating this book because it is not only an analysis of such things as the vulnerability of the education system and the easy accessibility of guns, it is also a deeply personal account of my experience before, during, and after the rampage attack on the campus. Though I reported the English department’s concerns about Seung-Hui Cho to many units on campus, and though he did eventually seek help from campus counseling, we still experienced at his hands the worst college shooting in history. I wrote the book because it seemed inevitable that other attacks would occur, especially if we as a nation didn’t learn from the errors and missteps of the past.
It was my responsibility to utter the words I had written. Honest, open communication is the only meaningful gift we have to give to those who lost loved ones in the attack. At the end of the book I apologize to the victims' families for not being able to prevent this horror. How could someone say these words on my behalf? It’s not the kind of responsibility you can delegate.
It wasn’t until it was confirmed I could narrate the book that I realized I didn’t know if I could do it. Although I had read excerpts from it during readings and keynotes, reading the book aloud all the way through was a different matter altogether. What would I do when I reached the part about how I learned the perpetrator was an English major with whom I had worked? How would I get through the chapter "A Boy Named Loser" ("Loser" was Cho’s own name for himself), in which I describe Cho’s agonized, menacing silence as he sat in my office, wearing his reflective sunglasses indoors?
But there was no point in focusing on what ifs. Best take the bull by the horns, my late mother would have said, her remembered voice always a source of consolation. We would be recording for six hours a day for about a week. In readiness, I bought honey-and-lemon throat lozenges and made a flask of strong mint tea. To warm up my voice on the 10-minute drive to the studio, I sang songs from The Sound of Music: "Edelweiss," "My Favorite Things" and "I Have Confidence in Me" — which I didn’t. Nevertheless, I sang with gusto, trusting in the power of Rodgers and Hammerstein and the fact that my car was soundproof.
I must have looked more eccentric than usual as I drove along Blacksburg’s winding country roads doing an impersonation of Fraulein Maria. But it made the process less daunting — as if I were still the homely-looking Anglo-Jamaican girl who used to belt out songs like "Singin’ in the Rain" as she trudged through a downpour on the way home from convent school; as if I were still the person I was before tragedy almost felled me like a tree, and, for a time at least, robbed me of the ability to sing at all.
Originally, I was supposed to travel to Maryland or DC to do the recording — a four-and-a-half hour drive from Blacksburg. It would mean staying in a hotel far from the comforts of home. I didn’t look forward to it. But Bruce Kitovich, the producer Audible assigned me, went out of his way to find a studio here in Blacksburg. It was a thoughtful thing to do, and it allowed me to meet Earl Norris, musician, owner, and operator of Four Loud Barks studio. As it turned out, Earl’s wife and my husband knew each other from way back. As soon as I entered Earl’s studio I felt at ease.
Apparently, according to Bruce, it’s not as unusual as it used to be to have authors read their own work. Not surprisingly, perhaps, this is especially true for memoir. If the recording is straightforward — and it often is for nonfiction — Audible’s in-house team checks the finished recording for quality control and decides which sentences need to be re-recorded. It’s a relatively speedy and efficient process that takes a matter of weeks.
We began recording the book on Sunday January 13, a mere three weeks after the heartbreaking attack at Sandy Hook Elementary in Connecticut. Early the next morning, my husband’s mother died. Mama Edna was a lovely woman — the kind of mother-in-law you hope you will be blessed with. Now I was even more concerned I wouldn’t be able to get through the recording without becoming emotional. Surprisingly, however, the process was one of the most calming experiences I have ever had.
Something strange happens when you record your own book. The relationship you have to your own words shifts and alters. You deliver a sentence in a particular way, stumble, then reread it with a completely different emphasis, one that can catch you by surprise. You hadn’t realized that was what you meant when you wrote the sentence, but it’s suddenly clear that of course that’s what you were trying to say. You are speaker and auditor, author and interpreter. You hear words anew.
I sat alone in Earl’s room-sized studio, a ribbon microphone a few inches from my mouth. I had been experiencing severe back pain for several weeks, so I sat in a chair with my elbows propped up by fat green pillows. This brought the book closer to me and meant I didn’t have to bear its physical weight while I read. Through the headphones my voice came back to me as not quite mine, as if someone else — a close relative, my mother, perhaps? — was speaking. It’s a strange sensation. Having done many radio interviews, I was accustomed to the aural intimacy of exceptionally sensitive headphones, but it was different this time. I was able to read the book out loud all the way through because it was a disembodied voice doing the reading — a projection and personification of sorts.
It occurs to me now that this process is much like the writing process we engage in as poets and novelists. When we teach creative writing, we talk about finding our voices, not simply because we want to assert our own identities, but because the voice is the guide leading us to the next place. It finds us when we’re lost and puts us back on the path towards revelation. Or at least we hope it does. Though I was in the Four Loud Barks studio on the other side of Earl’s garage while Earl was in the basement of his house several rooms away, I wasn’t alone. Not only did I have my own voice to keep me company, I had Earl’s voice, too, coming through the headphones. He listened intently to what I read and made sure it sounded O.K.
Today -- April 16, 2013 -- marks the sixth anniversary of the Virginia Tech shootings. In interviews I am often asked whether or not I have been able to move on from what happened. I try to explain that you don’t move on completely from calamities like this. What you can do with the help of friends and loved ones, however, is find a way to reconcile yourself to what has happened — or maybe, if you’re lucky, a way back to laughter again. In my case, I had to find a path toward forgiveness not only of the perpetrator but also of myself for not being able to prevent such a terrible tragedy from occurring in my beloved community. It is a long, arduous journey — one I have to admit I am still on. I am grateful for the voices that accompany us, grateful that they serve to remind us of the world’s steadfast, indestructible beauty.
“Before the Freedom of Information Act,” Henry Kissinger told a gathering of diplomats in Turkey in March 1975, “I used to say at meetings, ‘The illegal we do immediately; the unconstitutional takes a little longer.’ But since the Freedom of Information Act, I'm afraid to say things like that.”
Not that afraid, obviously. The Machiavellian quip got a laugh at the time, according to the official transcript -- and clearly it merits a spot in any future collection of familiar quotations, alongside Kissinger’s remark about power being the ultimate aphrodisiac. For now, it serves as the epigraph to a press release from WikiLeaks announcing the opening of the Public Library of U.S. Diplomacy, with its first collection consisting of more than 1.7 million diplomatic cables from 1973 to ’76.
All of the material was routinely (if belatedly) declassified after 25 years, per U.S. law, and has been available from the National Archives and Records Administration. WikiLeaks made the collection searchable and is “housing” it on servers presumably beyond the reach of Big Brother. Now they can’t be reclassified.
As announcements from WikiLeaks go, it’s all fairly underwhelming. But it does make an important revelation -- however unintentional -- by reminding the public that three years have passed since the group last made a world-shaking release of information. The leaks, it seems, have been plugged. Secret documents are staying secret. Even the most ardent admirer of Bradley Manning will be understandably reluctant to share his fate. While it is too soon to pronounce WikiLeaks dead, it does appear to be in a coma.
Castronovo, a professor of English and American studies at the University of Wisconsin at Madison, links the “Cablegate” of 2010 to a Revolutionary War-era incident through the concept of “a new kind of network actor” distinct from “the traditional person of liberal democracy.” The case in question was the Thomas Hutchinson affair of 1773, when letters by the governor of the Massachusetts Bay Colony somehow found their way into the hands of the Sons of Liberty, who then circulated them via newspaper and pamphlet.
Hutchinson had borne the brunt of serving His Majesty during the Stamp Act riots a few years earlier, and was in office during the Boston Massacre. In his correspondence he referred to the need for “abridgement of what are called English liberties” among the unruly colonial subjects, which was just so much gasoline on the fire.
The source of the leak was one Benjamin Franklin, colonial postmaster. Franklin later insisted that this ethical lapse was committed in an attempt (alas! unsuccessful) to reduce American hostility towards Parliament and the Crown by documenting that the real source of trouble was someone much lower in the chain of command. Castronovo treats this claim with greater suspicion than have some historians -- and not just because Franklin was such a master of irony, pseudonymous commentary, and the fake-out.
Franklin was also a node in multiple correspondence networks, and understood perfectly well how porous they could be. Alongside the official channels of communication between Court and colony, there were informal but durable long-distance connections among merchants, officials, publishers, and so on. A letter by someone within such a network tended to have, so to speak, an implicit “cc” or “bcc” field.
“More significant than the sending and receipt of private letters between individuals,” writes Castronovo, the activity of these epistolary networks “encompassed a range of public activities, including the recitation of letters aloud, the printing of handwritten letters in newspapers, the transmission of pamphlets, and the sending of circular letters by local governments....” Such communications might be “opened by third parties and forwarded without permission, shared in social circles and reprinted in newspapers.”
By transmitting Hutchinson’s letters to figures within his own circles who were in contact with the more hot-headed American revolutionary circles, Franklin was creating a political weapon against the authorities. He was, in effect, both a whistleblower and Julian Assange at the same time.
Having put it that way, however, I must immediately backtrack to say that the analogy is not Castronovo’s point at all. “At issue,” he writes, “is how communication spreads and metastasizes, how ideas proliferate and take root, how views and opinions propagate themselves.”
The network in each case – epistolary or digital – is not just a medium or tool that individuals use to communicate or act. In it, rather, “individual agency becomes unmoored from stable locations and is set adrift along an interconnected web of tendril-like links and nodes.” This is a perspective derived from the work of Bruno Latour, among others. It rejects the familiar way of thinking of society as consisting of distinct individuals who interact and so create networks. Instead -- to put things one way – it’s networks all the way down. Society emerges from a teeming array of networks that overlap and intersect, that get knotted together or fray with use.
Franklin’s catalytic intervention in the American crisis of 1773 was effective by virtue of his ability to channel communication from one network to another. It was also effective because it was done quietly: he advanced the revolutionary process involving “a public interlinked and excited by expressions of dissent” without making himself known. “In a perhaps uncharacteristic move,” Castronovo says, “Franklin refuses to occupy the center [of public discussion], instead preferring to sit back in the shadows where, after all, the shadowy work of espionage gets done.”
But the state – however much it may use networks of its own – insists on ascribing public action to individuals possessing stable and legible identities. By 1774, the Privy Council knew about Franklin’s role in the matter and summoned him to a hearing in London, where he was denounced, in humiliating terms, for more than an hour.
Bradley Manning, of course, faces worse – while the coiner of that witticism about operating illegally and unconstitutionally has never endured the consequences of his actions. What does that imply for a Latourian theory of social ontology? I don’t know, but it surely demonstrates that not all networks are equal before the law.
With so much focus on higher education's obligations to job preparation, the humanities are perpetually playing defense, especially in public higher education. We academic defenders of the humanities generally take one of two lines: we argue that 1) our majors ARE work force preparation -- we develop strong analytical skills, good writing, problem-solving, etc., or 2) we have no need to justify what we teach because the value of the humanities, the study of what makes us human, is self-evident.
These arguments over the value of degrees in the humanities run parallel to a set of arguments I find myself making as part of a role I occupy, as a board member for my state council for the humanities. The National Endowment for the Humanities allocates about a third of its funding through the state councils, and the councils in turn fund humanities initiatives at the state level.
State humanities councils such as mine (Rhode Island's) re-grant our NEH allocation as well as the money we raise locally to community humanities projects. We've funded research on communities of Cape Verdean longshoremen in Providence, oral histories of Second World War vets in hospice care, talk-back events at local theaters, seashore sound archives, a documentary film about a female 19th-century life-saving lighthouse-keeper, and lots of fascinating digital work, from archiving to app development. All the projects must involve humanities scholars — some of those scholars are affiliated with universities, and others aren’t. All of it aims at helping Rhode Islanders to understand ourselves, our histories, and our many cultures.
When economic times are tough, an agency such as the NEH is vulnerable unless legislators understand and value the role of the humanities in a strong democracy -- just as university humanities programs are vulnerable in state funding contexts when legislators, boards of trustees, or voters don't have a clear understanding of the value of the humanities in the culture and in the workplace.
In a career spent in higher education in the humanities, most of it at a liberal arts college, I rarely had to justify teaching what I taught. The value of an English major was self-evident to my colleagues and my students. Sure, the occasional parent would squeak, "But how will she make a living?" But I never hesitated to reassure the anxious check-writers of the value of our product. Having worked in the worlds of both journalism and Washington nonprofits, I knew how many good jobs demanded only a bachelor's degree, writing skills, research and analytic abilities, and common sense.
But then came the Great Recession and what many are calling the end of the higher education bubble. Questions about tuition increases, student debt, and colleges' lack of accountability (that is, the paucity of data on employment for recent graduates) get attached, in public perception, to the unemployment rate and to a re-emergence of the old post-Sputnik fears that the nation is not training enough folks in STEM fields.
Organizations such as the Association of American Colleges and Universities have been proactive in making the case for liberal learning as preparation for good citizenship, pointing to their surveys of employers. Those surveys have found that employers believe that the skills colleges should focus on improving are: written and oral communication; critical thinking and analytic reasoning; the application of knowledge and skills in real-world settings; complex problem solving; ethical decision making, and teamwork skills. These skills are not exclusive to the humanities, but they certainly line up with the student learning outcomes in humanities instruction at my institution.
It's not as if defenders of the values of a liberal arts education are ignoring economic realities: many liberal arts colleges are adding business majors, humanities fields are requiring internships and experiential learning, and colleges and universities are scrambling to make contact with successful alumni and to gather post-graduation employment data.
There's nothing wrong with linking liberal arts education in general, and the humanities in particular, to work. The humanities are inextricably linked to work and to U.S. civic life. When Lyndon Johnson signed legislation to bring the NEH into existence in 1965, it was in a context in which the federal government was pushed to invest in culture, as it had in science. NEH's account of its own history explains that the head of the Atomic Energy Commission told a Senate committee: "We cannot afford to drift physically, morally, or esthetically in a world in which the current moves so rapidly perhaps toward an abyss. Science and technology are providing us with the means to travel swiftly. But what course do we take? This is the question that no computer can answer."
Through my role in public humanities, I have come to understand that the humanities are what allow us to see ourselves as members of a civic community. Public history, public art, shared cultural experiences make us members of communities. This link has not been stressed enough in defense of the academic humanities. It's past time to make this important connection -- to help our boards of trustees, our communities, and our legislators to know what the humanities bring to civil society and give to students as they enter the workforce.
The first lecture I ever gave as a teaching assistant was on Death of a Salesman. My topic was work -- how Willy's job is his identity. I pointed to a student I knew in the 150-student lecture hall and told him that his surname, Scribner, probably indicated the employment of some ancestor of his, a "scrivener," like Bartleby. Then I asked who else had last names that might have indicated a job. We had Millers and Coopers and Smiths, and many more.
When those students' ancestors worked as barrel-makers or at their forges, they worked those jobs for life, and their sons afterward did the same. But how many of us do the job our parents did? How many of our students will do the same job in their 30s that they will do in their 20s? Narrow ideas about work force preparation will not prepare our students for the work of the rest of their lives. Each job they take will train them in the skills they need to succeed in that particular industry. But a broad, liberal education will have been what made them people worth hiring, people who have learned the value of curiosity, initiative, problem-solving. Students in STEM fields and students in arts, social sciences, and humanities all will become members of communities, and a good background in the humanities will enrich their membership.
I loved the humanities as an English professor. But it was only when I became involved in public humanities that I began to understand their value not just for individuals but for communities. That's the public good. And that's why we cannot afford to let a narrow rhetoric of work force preparation push the humanities from our curriculums or defund the work of the National Endowment for the Humanities.
Paula M. Krebs is dean of the College of Humanities and Social Sciences at Bridgewater State University, in Massachusetts, and a member of the board of directors of the Rhode Island Council of the Humanities.
Teaching with PowerPoint has been an exercise in frustration for me. I find that my course preparation takes twice as long as it should, and the results are more often than not unsatisfying. It also makes me feel muffled and absent from the classroom. Maybe this is a function of my poor PowerPoint form, of being a latecomer to a technology that younger faculty use with more ease and panache. In a way, it’s not surprising that I would struggle with it. Although I’m young and pretty tech-savvy at 43, I can’t associate PowerPoint with my lived experiences as a learner. I spent my whole life as a student, from kindergarten through graduate school, plucking words out of the air to put them in my notebook, or following along as my teachers scribbled on the blackboard. The most technology-forward moments involved the occasional projection of transparencies in science classes.
Last semester I decided to conduct an experiment. For years, even before becoming a PowerPoint user, my chalkboard form had suffered from a lack of discipline and focus. What if I really rededicated myself to it? I decided to make writing on the chalkboard my primary method and PowerPoint my secondary tool. The outcome of the exercise was fantastic. I felt like I was waking up from being half-asleep as a teacher.
One of the things I liked the most about the experience was how using the chalkboard freed me to be more responsive to the needs of my students. Although I always came to class with an outline of notes to write on the board, I knew that it was changeable and schematic, subject to revision by student comments and questions. If you compared my paper notes with what actually went on the chalkboard you’d discover all kinds of emendations and additions. The chalkboard encouraged me to be more attentive to classroom conversations, to be more confident about changing my script.
Using the chalkboard also encouraged me to package or process information for my students in more versatile ways. I could come to class and write bullet points on the board as a starting point, then while interacting with my students, proceed to annotate with symbols (asterisks, arrows, underlining). If they still didn’t get it, I could erase and diagram, or erase and do a flow chart. The chalkboard is dynamic, changeable, sensitive, immediate, and completely in the classroom moment. It models note taking and underlines the value of trial and error thinking and brainstorming, skills that are vital to analytical thinking.
I also appreciated the chalkboard because it is an embodied kind of learning. It syncs the bodies of the students to the movement of the body of the instructor. The fact that there is no PowerPoint file to download or pass out, and that the eraser is eventually coming around, means that the class gets in a rhythm of following the movements of the instructor. There is a ritual of collective focus and activity. The instructor has to be much more physically present because writing on the chalkboard requires choreography, gesture and tempo. This is of practical value but there’s also something deeper. In an existence increasingly defined by the virtual, it is important to reassert physical presence.
At the end of class, I sometimes looked at the board before erasing it. So this is what had happened in class in the last hour! I could see the vague outlines of my original plan overlaid with symbols of emphasis and additions that had emerged through classroom conversations. Here it was: the exciting record of a collaborative enterprise between teacher and students. The board recorded an event that could never be repeated in precisely the same way, even if I used the same notes to try to do so.
All of this may seem ridiculous if you teach in a pedagogical ecosystem where chalkboards are still prominent. On my campus, it seems like everyone uses PowerPoint. The situation is so pervasive that once I noticed that student pens only went up when the PowerPoint was projected on screen. If I wrote a series of items on the board, not very many students wrote them down. In their minds, PowerPoint was the chalkboard and the chalkboard was just a piece of furniture. All my colleagues, in talking about course preparation, use the word PowerPoint: I was up late preparing my PowerPoints … I left my PowerPoint at home … I couldn’t finish my PowerPoint today in class.
In my circles you can’t use the word "blackboard" as a synonym for chalkboard because everyone will assume you’re referring to our learning management system. This last detail is probably the most symbolically telling: in spite of hundreds of years of use, and its iconic stature as a symbol of the classroom, the word "blackboard" has been hollowed out by a corporation.
The problem with educational technology when it becomes institutionalized and naturalized is that it easily becomes a crutch rather than an instrument to enhance community and interaction between human beings. What is brilliant about José Bowen’s well-known "Teaching Naked" concept is that it affirms technology as a tool for enhancing a humanistic classroom interaction. Interest in PechaKucha and Prezi, screen projection formats and templates that discard the stale formulas of conventional PowerPoint, underscores that instructors and presenters everywhere recognize that we need to allow for creativity and responsiveness in our use of educational technology. We are at our best as teachers when we question the tools we are given and reinvent them. This happens every day in thousands of classrooms when innovative teachers bend PowerPoint to their will, instead of the opposite. The real software behind any instructional technology is the instructor; don’t underestimate her ability to elevate a rudimentary tool or ruin a promising and sophisticated one.
I’m not arguing against PowerPoint tout court. Heck, I plan on continuing to use it as one tool among others. I am just suggesting that the old chalkboard still has something to teach us. If you haven’t tried it recently, you should. It’s the latest thing and you don’t have to plug it into an outlet or find a network to use it.
Christopher Conway is associate professor of modern languages at the University of Texas at Arlington, where he teaches courses in modern Latin American literature and culture.