Tamerlan Tsarnaev was in my College Writing I class at Bunker Hill Community College in the spring of 2007. My pinhole view of his life, including a couple of e-mails about why he missed some classes, adds nothing to either the pathological or the geopolitical debates about the bombs Tamerlan and his brother are accused of setting off two weeks ago at the 2013 Boston Marathon.
What I can tell you is that I’ve felt like crying most of the time since Bloody Friday, the Friday after the Marathon Monday bombings that killed three and wounded 264, when police shut down Boston and Cambridge. Disclaimer 1: Of course the dead and the injured and their families are the only focus of our love and prayers. I have no words. This is a column about education reform -- or the lack of it.
Everyone I know in Boston, in every profession, reported feeling about the same. I now know that these feelings have a name: secondary trauma. You don’t have to be one of the injured to feel numb or want to cry.
How to treat myself for secondary trauma? I had no idea that was a skill I'd learn and need at a community college.
Hydrate – lots of water. Fresh air. No caffeine. Breathe. Have a good cry. J.S. Bach, always. Keep in mind that the national policy debate about the central issue for community colleges, completion, makes no mention I’ve heard of secondary trauma expertise as necessary professional development. Here’s my bookmarked reference web site, Trauma Stewardship.
Here’s my list of student primary traumas I’ve been second to, in a few short years: murder, rape, shootings; sudden and prolonged homelessness; memories of wars in Somalia, Eritrea, El Salvador, the Congo; a father killed in the civil war in Mali; a student for whom I was buying a sandwich at 5 p.m. saying, “I guess you could tell I haven’t eaten since yesterday.” Domestic violence. Stories from veterans of the wars in Iraq and Afghanistan. All but a few arise from teaching, remember, College Writing I. To this list, I can now add a terrorist attack. Perhaps ribbons for each trauma, as in the military, would cause the completion critics to consider trauma a factor.
Let me be perfectly clear. Withering completion accountability is fine by me. The solutions just need a load factor for the days that community college teachers need a good cry.
Disclaimer 2: The worst days of my own silver-, no, platinum-spooned life are miles from the everyday trauma of the millions of students in community colleges, and the secondary traumas of their professors. I do not teach full-time. With occasional slippage, I am a generally happy and optimistic person. I have family, friends, health and, more, health insurance, food, Dana-Farber Cancer Institute and the love of Friends Meeting Cambridge for three years of cancer that my wife survived. (Thank you STEM disciplines.) My trauma requires no help.
My point for this column is that at the nation’s 1,200 community colleges, thousands of instructors have a traditional workload, unopposed by any of our unions, of four and five classes a semester with classes of 20, 30 and more students all subject to the primary traumas I’ve described.
I have no words for how these colleagues survive. I have plenty of words, for another day, for the policy makers, legislators, trade associations, and union chiefs who won’t admit to these traumas while whining about low community college completion rates.
The 1 a.m. Friday bomb explosion and shootout that killed Tamerlan was about a mile from my home. My wife heard the bomb and the gunfire that I slept through. By morning, Cambridge was shut down, and we were ordered to stay at home. After a day with helicopters chopping overhead and Doppler-effecting sirens in all directions, my wife and daughter heard the shooting Friday evening when police arrested Tamerlan’s brother, again about a mile from our home. I didn’t hear the gunfire.
I am learning, too, about relative secondary trauma rankings on my Emotional/Trauma Richter Scale (patent pending). What I can tell you is that my urge to cry last week, and even now, is higher by a bit on my E/T Richter scale reading than when Cedirick Steele, a student in that same class that spring of 2007, was shot seven times and killed. I learned Cedirick’s death was called a premeditated random murder. The shooters planned to kill someone, it didn’t matter who. Perhaps tertiary trauma is when we discover a new term for something too terrible to be true. (Click here for my report on Cedirick’s killing.)
Here’s what I don’t understand in my rankings. I knew Cedirick very well. I wouldn’t have recognized Tamerlan on the street. He missed most classes and didn’t complete the course. Why do I feel sadder after Bloody Friday than I did right after Cedirick’s death?
I didn’t make the Tamerlan connection until late Friday morning. I hadn’t known the suspects’ names when I went to bed Thursday. The cat woke me up Friday morning about 5:30 a.m. with a left paw “Breakfast!” to the nose.
I let the dog out in the yard and looked out the front door. No newspaper. Odd but ok. I fed the cats, made coffee, changed the laundry, put out breakfast for my wife. Still no newspaper. Not ok. Another 15 minutes, and I would call in the missed delivery. I had another cup of coffee and read a book. My wife was asleep. I hadn’t turned on the radio. Still no paper.
Then, the day began. A text message from someone at work. “The MBTA is closed. How can I get to work? Do you know what’s going on?” I had no idea. Another text message. Bunker Hill Community College closed for citywide emergency. I turned on the radio and learned why no newspaper delivery that morning. My neighborhood was the news. Police were looking for the suspects right here. And the news said that one of the suspects had gone to Bunker Hill Community College.
In the next hour, friends e-mailed. Did I know this student? “No,” I said. After the third e-mail, something stirred. I put “Tamerlan” in the search box of my computer. There he was on a class list from 2007, along with two innocuous e-mails about missing class. As a comedy and to raise money for students like mine, two years ago, I ran -- well, completed -- the Boston Marathon. (My report.) Oh, can I see the blocks to the finish line where the bombs went off. I guess all this factors into my E/T Richter Scale, terrorist bombing versus premeditated random murder.
Now, the Iraq tank-driving student in that same class graduated from Dartmouth last spring, and he is on his plan, teaching at-risk high school students.
Of course that cheers us up on a bad day. We, the people, have to chuck the way we mistake such stories for success. Along with head-in-the-sand union chiefs, policy makers and too many education trade associations, do we let ourselves believe that these feel-good, albeit individually triumphant, community college to Ivy League stories are progress? I did, for years.
Back to my secondary trauma professional development. Our refusal as a nation to face down the truth about the lives of so many students and their traumas every day in so many of our schools and colleges? The trauma professionals would call our refusal denial and avoidance. An unhealthy strategy.
On the E/T Richter scale, though, my urge to cry was lower this week than it was back in 2011, when I was called to testify at the third trial of Cedirick’s murderers. (Click here for my report on the trial.) On the morning of my testimony, the Suffolk County Victim/Witness Advocate sat me down and asked how I felt. Did she really want to know? She did. I said I’d felt like crying about Cedirick every day since she’d called three weeks before, to ask me to testify. Normal, she said. My education on secondary trauma began. After the trial, she made me go see a trauma counselor.
After the trial, four years after Cedirick’s random, premeditated murder, at last, I had a good cry. Today, I’ll help any student I can. And I’ll say a prayer again, and again, for the three dead and the 264 injured at the Boston Marathon Massacre.
Wick Sloane writes the Devil's Workshop column for Inside Higher Ed. Follow him on Twitter at @WickSloane.
"Mad Men" returns to cable television this coming Sunday, continuing its saga of mutable identities and creative branding at a New York advertising firm during the 1960s. Or at least one assumes it will still be set in the ‘60s. How much narrative time lapses between seasons varies unpredictably. Like everything else about the show, it remains the network’s closely guarded secret. Critics given an early look at the program must agree to an embargo on anything they publish about it. This makes perfect sense in the context of the social world of "Mad Men" itself: the network is, after all, selling the audience’s curiosity to advertisers.
A different economy of attention operates in Mad Men, Mad World: Sex, Politics, Style & the 1960s, a collection of 18 essays on the program just published by Duke University Press. It’s not just a matter of the editors and contributors all being academics, hence presumably a different sort of cultural consumer from that of the average viewer. On the contrary, I think that is exactly wrong. Serialized narrative has to generate in its audience the desire for an answer to a single, crucial question: “And then what happens?” (Think of all the readers gathered at the docks in New York to get the latest installment of a Dickens novel coming from London.)
Of course, the contributors to Mad Men, Mad World write with a host of more complex questions in mind, but I don’t doubt for a second that many of the papers were initially inspired by weekend-long diegetic binge sessions, fueled by the same desire driving other viewers. At the same time, there’s every reason to think that the wider public is just as interested in the complex questions raised by the show as any of the professors writing about it. For they are questions about race, class, gender, sexuality, politics, money, happiness, misery, and lifestyle – and about how much any configuration of these things can change, or fail to change, over time.
Many of the essays serve as replies to a backlash against "Mad Men" that began in the third or fourth season, circa 2009, as it was beginning to draw a much larger audience than it had until that point. The complaint was that the show, despite its fanatical attention to the style, dress, and décor of the period, was simple-mindedly 21st century in its attitude toward the characters. It showed a world in which blunt expressions of racism, misogyny, and homophobia were normal, and sexual harassment in the workplace was an executive perk. Men wore hats and women stayed home. Everyone smoked like a chimney and drank like a fish, often at the same time. Child abuse was casual. So was littering.
And because all of it was presented in tones by turns ironic and horrified, viewers were implicitly invited to congratulate themselves on how enlightened they were now. Another criticism held that "Mad Men" only seemed to criticize the oppressive arrangements it portrayed, while in reality allowing the viewer to enjoy them vicariously. These complaints sound contradictory: the show either moralistically condemns its characters or inspires the audience to wallow in political incorrectness. But they aren’t mutually exclusive by any means. What E.P. Thompson called “the enormous condescension of posterity” tends to be a default setting with Americans, alternating with periods of maudlin nostalgia. There’s no reason the audience couldn’t feel both about the "Mad Men" vision of the past.
See also a comment by the late Christopher Lasch, some 20 years ago: “Nostalgia is superficially loving in its re-creation of the past, but it invokes the past only to bury it alive. It shares with the belief in progress, to which it is only superficially opposed, an eagerness to proclaim the death of the past and to deny history’s hold on the present.”
At the risk of conflating too many arguments under too narrow a heading, I’d say that the contributors to Mad Men, Mad World agree with Lasch’s assessment of progress and nostalgia while also demonstrating how little it applies to the program as a whole.
Caroline Levine’s essay “The Shock of the Banal: Mad Men's Progressive Realism” provides an especially apt description of how the show works to create a distinct relationship between past and present that’s neither simply nostalgic nor a celebration of how far we’ve come. The dynamic of "Mad Men" is, in her terms, “the play of familiarity in strangeness” that comes from seeing “our everyday assumptions just far enough removed from us to feel distant.” (Levine is a professor of English at the University of Wisconsin at Madison.)
The infamous Draper family picnic in season two is a case in point. After a pleasant afternoon with the kids in a bucolic setting, the parents pack up their gear, shake all the garbage off their picnic blanket, and drive off. The scene is funny, in the way appalling behavior can sometimes be, but it’s also disturbing. The actions are so natural and careless – so thoughtless, all across the board – that you recognize them immediately as habit. Today’s viewers might congratulate themselves for at least feeling guilty when they litter. But that’s not the only possible response, because the scene creates an uneasy awareness that once-familiar, “normal” ideas and actions came to be completely unacceptable – within, in fact, a relatively short time. It eventually became the butt of jokes, but the famous “Keep America Beautiful” ad from about 1970 -- the one with the crying Indian -- probably had a lot to do with it. (Such is the power of advertising.)
The show's handling of race and gender can be intriguing and frustrating. All the powerful people in it are straight white guys in ties, sublimely oblivious to even the possibility that their word might not be law. "Mad Space" by Dianne Harris, a professor of architecture and art history at the University of Illinois at Urbana-Champaign, offers a useful cognitive map of the show's world -- highlighting how the advertising firm's offices are organized to demonstrate and reinforce the power of the executives over access to the female employees' labor (and, often enough, bodies), while the staid home that Don Draper and his family occupy in the suburbs is tightly linked to the upper-middle-class WASP identity he is trying to create for himself by concealing and obliterating his rural, "white trash" origins. A handful of African-American characters appear on the margins of various storylines -- and one, the Drapers' housekeeper Carla, occupies the especially complex and fraught position best summed up in the phrase "almost part of the family." But we never see the private lives of any nonwhite character.
In "Representing the Mad Margins of the Early 1960s: Northern Civil Rights and the Blues Idiom," Clarence Lang, an associate professor of African and African-American studies at the University of Kansas, writes that "Mad Men" "indulges in a selective forgetfulness" by "presuming a black Northern quietude that did not exist" (in contrast to the show's occasional references to the civil rights movement below the Mason-Dixon line). Lang's judgment here is valid -- up to a point. As it happens, all of the essays in the collection were written before the start of the fifth season, in which black activists demonstrate outside the firm's building to protest the lack of job opportunities. Sterling Cooper Draper Pryce hires its first African-American employee, a secretary named Dawn. I think a compelling reading of "Mad Men" would recognize that the pace and extent of the appearance of nonwhite characters on screen is a matter not of the creators' refusal to portray them, but of their slow arrival on the scene of an incredibly exclusionary social world being transformed (gradually and never thoroughly) by the times in which "Mad Men" is set.
There is much else in the book that I found interesting and useful in thinking about "Mad Men," and I think it will be stimulating to readers outside the ranks of aca fandom. I’ll return to it in a few weeks, with an eye to connecting some of the essays to new developments at Sterling Cooper Draper Pryce. (Presumably the firm will have changed its name in the new season, given the tragic aftermath of Lane Pryce’s venture in creative bookkeeping.)
When things left off, it was the summer of 1967. I have no better idea than anyone else when or how the narrative will pick up, but I really hope that Don Draper creates the ad campaign for Richard Nixon.
Of the many strange things in Gulliver’s Travels that make it hard to believe anyone ever considered it a children’s book, the most disturbing must be the Struldbruggs, living in the far eastern kingdom of Luggnagg, not covered by Google Maps at the present time.
Gulliver’s hosts among the Luggnaggian aristocracy tell him that a baby is born among them, every so often, with a red dot on the forehead -- the sign that he or she is a Struldbrugg, meaning an immortal. Our narrator is suitably amazed. The Struldbruggs, he thinks, have won the cosmic lottery. Being “born exempt from that universal Calamity of human Nature,” they “have their Minds free and disengaged, without the Weight and Depression of Spirits caused by the continual Apprehension of Death.”
The traveler has no trouble imagining the life he might lead as an immortal, given the chance. First of all, Gulliver tells his audience at dinner, he would spend a couple of hundred years accumulating the largest fortune in the land. He’d also be sure to master all of the arts and sciences, presumably in his spare time. And then, with all of that out of the way, Gulliver could lead the life of a philanthropic sage, dispensing riches and wisdom to generation after generation. (A psychoanalytic writer somewhere uses the expression “fantasies of the empowered self,” which just about covers it.)
But then the Luggnaggians bring him back to reality by explaining that eternal life is not the same thing as eternal youth. The Struldbruggs “commonly acted like Mortals, till about thirty Years old,” one of Gulliver’s hosts explains, “after which by degrees they grew melancholy and dejected, increasing in both till they came to four-score." The expression “midlife crisis” is not quite the one we want here, but close enough.
From the age of eighty on, “they had not only all the Follies and Infirmities of other old Men, but many more which arose from the dreadful Prospect of never dying.” Forget the mellow ripening of wisdom: Struldbruggs “were not only Opinionative, Peevish, Covetous, Morose, Vain, Talkative, but incapable of Friendship, and dead to all natural Affection.”
It gets worse. Their hair and teeth fall out. “The Diseases they were subject to still continuing without increasing or diminishing,” Gulliver tells us. “In talking they forgot the common Appellation of Things, and the Names of Persons, even of those who are their nearest Friends and Relations. For the same Reason they never can amuse themselves with reading, because their Memory will not serve to carry them from the beginning of a Sentence to the end; and by this Defect they are deprived of the only entertainment whereof they might otherwise be capable.”
It is a vision of hell. Either that, or a prophecy of things to come, assuming the trends of the last few decades continue. Between 1900 and 2000, the average life expectancy in the United States rose from 49 to 77 years; between 1997 and 2007, it grew by 1.4 years. This is not immortality, but it beats dying before you reach 50. The span of active life has extended as well. The boundary markers of what counts as old age keep moving out.
From a naïve, Gulliverian perspective, it is all to the good. But there’s no way to quantify changes in the quality of life. We live longer, but it's taking longer to die as well. Two-thirds of deaths among people over the age of 65 in the United States are caused by three chronic conditions: heart disease, cancer, and stroke. The “life” of someone in a persistent vegetative state (in which damage to the cerebral cortex is so severe and irreversible that cognitive functions are gone for good) can be prolonged indefinitely, if not forever.
More horrific to imagine is the twilight state of being almost vegetative, but not quite, with some spark of consciousness flickering in and out -- a condition of Struldbruggian helplessness and decay. “I grew heartily ashamed of the pleasing Visions I had formed,” says Gulliver, “and thought no Tyrant could invent a Death into which I would not run with Pleasure from such a Life.”
Howard Ball’s book At Liberty to Die: The Battle for Death With Dignity in America (New York University Press) is a work of advocacy, as the subtitle indicates. The reader will find not the slightest trace of Swiftian irony in it. Ball, a professor emeritus of political science at the University of Vermont, is very straightforward about expressing bitterness -- directing it at forces that would deny “strong-willed, competent, and dying adults who want to die with dignity when faced with a terminal illness” their right to do so.
The forces in question fall under three broad headings. One is the religious right, which Ball sees as being led, on this issue at least, by the Roman Catholic Church. Another is the Republican Party leadership, particularly in Congress, which he treats as consciously “politicizing the right-to-die issue” in a cynical manner, as exemplified by the memo of a G.O.P. operative on “the political advantage to Republicans [of] intervening in the case of Terri Schiavo.” (For anyone lucky enough to have forgotten: In 1998, after Terri Schiavo had been in a persistent vegetative state for eight years, her husband sought to have her feeding tube removed, setting off numerous rounds of litigation, as well as several pieces of legislation that included bills in the US Congress. The feeding tube was taken out and then reattached twice before being finally removed in 2005, after which Schiavo died. The website of the University of Miami's ethics program has a detailed timeline of the Schiavo case.)
The third force Ball identifies is that proverbial 800-pound gorilla known as the Supreme Court of the United States. Its rulings in Washington v. Glucksberg and Vacco v. Quill in 1997 denied the existence of anything like a constitutionally protected right to physician-assisted death (PAD). States can outlaw PAD -- or permit it, as Montana, Oregon, and Washington do at present. In the epigraph to his final chapter, Ball quotes a Colorado activist named Barbara Coombs Lee: “We think the citizens of all fifty states deserve death with dignity.” But the Supreme Court of the United States will not be making that a priority any time soon.
“The central thesis of the book,” states Ball, “is that the liberty found in the U.S. Constitution’s Fifth and Fourteenth ‘Due Process’ Amendments extends... [to] the terminally ill person's right to choose to die with dignity -- with the passive assistance of a physician -- rather than live in great pain or live a quality-less life.” The typical mode of “passive assistance” would be “to give painkilling medications to a terminally ill patient, with the possibility that the treatment will indirectly hasten the patient’s death.”
Ball notes that a Pew Research Center survey from 2005 showed that an impressive 84 percent of respondents “approved of patients being able to decide whether or not to be kept alive through medical treatment or choosing to die with dignity.”
Now, for whatever it’s worth, that solid majority of 84 percent includes this columnist. If the time for it ever comes, I’d want my doctor to supersize me on the morphine drip without breaking any laws. Throughout Ball's narrative of the successes and setbacks of the death-with-dignity cause, I cheered at each step forward, and felt appalled all over again while reading the chapter he calls “Terri Schiavo’s Tragic Odyssey,” although it did seem like the more suitable adjective would be “grotesque.” Tragedy implies at least some level of dignity.
The author also introduced me to a blackly humorous expression, “death tourist,” which refers to a person "visiting" a state to take advantage of physician-assisted suicide being legal there.
As a member of the choir, I liked Ball's preaching, but it felt like the sermon was missing an index card or two. As mentioned earlier, the book’s “central thesis” is supposed to be that the due-process guarantees in the Constitution extend to the right to death with dignity. And so the reader has every reason to expect a sustained and careful argument for why that legal standard applies. None is forthcoming. The due-process clauses did come up when the Supreme Court heard oral arguments in 1997, but it rejected them as inapplicable. This would seem to be the point in the story where that central thesis would come out swinging. The author would show, clearly and sharply, why the Court was wrong to do so. He doesn't. It is puzzling.
Again, it sounds very categorical when Ball cites that Pew survey from 2005 showing 84 percent agreement that individuals had a right to choose an exit strategy if medical care were not giving them a life they felt worth living. But the same survey results show that when asked whether they believed it should be legal for doctors to "assist terminally ill patients in committing suicide," only 44 percent favored it, while 48 percent were opposed. With the matter phrased differently -- surveyors asking if it should be legal for doctors to "give terminally ill patients the means to end their lives" -- support went up to 51 percent, while 40 percent remained opposed. This reveals considerably more ambivalence than the 84 percent figure would suggest.
The notion that a slippery slope will lead from death with dignity to mass programs of euthanasia clearly exasperates Ball, and he can hardly be faulted on that score. A portion of the adult population is prepared to believe that any given social change will cause the second coming of the Third Reich, this time on American soil. (Those who do not forget the History Channel are condemned to repeat fairly dumb analogies.) But the slippery-slope argument will more likely be refuted in practice than through argument. Whether or not the law recognizes it, the right to make decisions about one’s own mortality or quality of life will exist any time someone claims it. One of the medical profession’s worst-kept secrets for some time now is that plenty of physicians will oblige a suffering patient with the means to end their struggle. (As Ball notes, this came up in the Supreme Court discussions 15 years ago.)
And the demand is bound to grow, as more and more of us live long enough to see -- like Gulliver -- that there are worse fates than death. Brilliant legal minds should apply themselves to figuring out how to make an ironclad case for the right to a decent departure from this mortal coil. At Liberty to Die is useful as a survey of some obstacles standing in the way. But in the meantime, people will find ways around those obstacles, even if it means taking a one-way, cross-continental trip to the Pacific Northwest. There are worse ways to go.