Trigger warnings in the classroom have been the subject of tremendous debate in recent weeks, but it’s striking how little the discussion has contemplated what actual trigger warnings in actual classrooms might plausibly look like.
The debate began with demands for trigger warnings by student governments with no power to compel them and suggestions by administrators (made and retracted) that faculty consider them. From there the ball was picked up mostly by observers outside higher ed who presented various arguments for and against, and by professors who repudiated the whole idea.
What we haven’t heard much of so far are the voices of professors who are sympathetic to the idea of such warnings talking about what they might look like and how they might operate.
As it turns out, I’m one of those professors, and I think that discussion is long overdue. I teach history at Hostos Community College of the City University of New York, and starting this summer I’m going to be including a trigger warning in my syllabus.
I’d like to say a few things about why.
To start off, I think it’s important to be clear about what trigger warnings are, and what purpose they’re intended to serve. Such warnings are often framed — and not just by critics — as a “you may not want to read this” notice, one that’s directed specifically at survivors of trauma. But their actual purpose is considerably broader.
Part of the confusion arises from the word “trigger” itself. Originating in the psychological literature, the term can be misleading in a non-clinical context, and indeed many people who favor such warnings prefer to call them “content warnings” for that reason. It’s not just trauma survivors who may be distracted or derailed by shocking or troubling material, after all. It’s any of us, and a significant part of the distraction comes not from the material itself but from the context in which it’s presented.
In the original cut of the 1933 version of the film "King Kong," there was a scene (depicting an attack by a giant spider) that was so graphic that the director removed it before release. He took it out, it’s said, not because of concerns about excessive violence, but because the intensity of the scene ruined the movie — once you saw the sailors get eaten by the spider, the rest of the film passed by you in a haze.
A similar concern provides a big part of the impetus for content warnings. These warnings prepare the reader for what’s coming, so their attention isn’t hijacked when it arrives. Even a pleasant surprise can be distracting, and if the surprise is unpleasant the distraction will be that much more severe.
I write quite a bit online, and I hardly ever use content warnings myself. I respect the impulse to provide them, but in my experience a well-written title and lead paragraph can usually do the job more effectively and less obtrusively.
A classroom environment is different, though, for a few reasons. First, it’s a shared space — for the 75 minutes of the class session and the 15 weeks of the semester, we’re pretty much all stuck with one another, and that fact imposes interpersonal obligations on us that don’t exist between writer and reader. Second, it’s an interactive space — it’s a conversation, not a monologue, and I have a responsibility to encourage that conversation as best I can. Finally, it’s an unpredictable space — a lot of my students have never previously encountered some of the material we cover in my classes, or haven’t encountered it in the way it’s taught at the college level, and don’t have any clear sense of what to expect.
For all these reasons, I’ve concluded that it would be sound pedagogy for me to give my students notice about some of the challenging material we’ll be covering in class — material relating to racial and sexual oppression, for instance, and to ethnic and religious conflict — as well as some information about their rights and responsibilities in responding to it. Starting with the summer semester, as a result, I’ll be discussing these issues during the first class meeting and including a notice about them in the syllabus.
My current draft of that notice reads as follows:
Course Content Note
At times this semester we will be discussing historical events that may be disturbing, even traumatizing, to some students. If you ever feel the need to step outside during one of these discussions, either for a short time or for the rest of the class session, you may always do so without academic penalty. (You will, however, be responsible for any material you miss. If you do leave the room for a significant time, please make arrangements to get notes from another student or see me individually.)
If you ever wish to discuss your personal reactions to this material, either with the class or with me afterwards, I welcome such discussion as an appropriate part of our coursework.
That’s it. That’s my content warning. That’s all it is.
I should say as well that nothing in these two paragraphs represents a change in my teaching practice. I have always assumed that if a student steps out of the classroom they’ve got a good reason, and I don’t keep tabs on them when they do. If a student is made uncomfortable by something that happens in class, I’m always glad when they come talk to me about it — I’ve found we usually both learn something from such exchanges. And of course students are still responsible for mastering all the course material, just as they’ve always been.
So why the note, if everything in it reflects the rules of my classroom as they’ve always existed? Because, again, it’s my job as a professor to facilitate class discussion.
A few years ago one of my students came to talk to me after class, distraught. She was a student teacher in a New York City junior high school, working with a social studies teacher. The teacher was white, and almost all of his students were, like my student, black. That week, she said, one of the classes had arrived at the point in the semester given over to the discussion of slavery, and at the start of the class the teacher had gotten up, buried his nose in his notes, and started into the lecture without any introduction. The students were visibly upset by what they were hearing, but the teacher just kept going until the end of the period, at which point he finished the lecture, put down his papers, and sent them on to math class.
My student was appalled. She liked these kids, and she could see that they were hurting. They were angry, they were confused, and they had been given nothing to do with their emotions. She asked me for advice, and I had very little to offer, but I left our meeting thinking that it would have been better for the teacher to have skipped that material entirely than to have taught it the way he did.
History is often ugly. History is often troubling. History is often heartbreaking. As a professor, I have an obligation to my students to raise those difficult subjects, but I also have an obligation to raise them in a way that provokes a productive reckoning with the material.
And that reckoning can only take place if my students know that I understand that this material is not merely academic, that they are coming to it as whole people with a wide range of experiences, and that the journey we’re going on together may at times be painful.
It’s not coddling them to acknowledge that. In fact, it’s just the opposite.
Angus Johnston teaches history at Hostos Community College and is the proprietor of the website studentactivism.net.
I hadn’t intended to write one of these letters, ever. I thought that loyalty was part and parcel of being a colleague; however, I wasn’t put on the course schedule after two decades of teaching here.
You let me discover this by myself – with no explanation. And the timing could not have been worse. My spouse is unemployed; our child is in college. We may have to leave our home.
I know: There are hard times all over. Why should it – or could it – be different for my family?
When nonrenewals happen, one’s imagination runs wild. If there was some perceived deficiency for which I was nonrenewed, it’s probably better to know, though my self-esteem is currently flattened. And if it were simply an error, it would seem natural that an error could be quickly fixed. Instead, I am in limbo.
If my nonrenewal was (as someone close to me suggested) due to adjunct activism, that could be devastating – but true. “Oh, now I understand why that topic was important to you,” a family member said.
Alternatively, you may not be mulling over any of this. As a distant member of the busy department, I am probably not on your radar. Perhaps the department never really knew me fully as a teacher or scholar. The few times I tried to discuss my own intellectual life or community activities or writing, tenured colleagues appeared uninterested. A friend was even told: “Don’t talk about your ideas to colleagues too much.”
Like others in academia, you may assert that responsibility for sustaining or creating positions lies above or beyond – the dean’s office, the provost, the VPs, the president, the board of trustees, even trends around the country.
But while I am wondering how I will meet next year’s expenses and pursue what I consider my vocation, I am also wondering if you could help stem the erosion of positions. You might be able to do this: if not for my generation, then for the next. You do have the power.
Perhaps you can show me that my bad-day comparison of the role of adjuncts in the university “family” to that of forgotten kids in the homes of the distracted rich is not valid. Perhaps you can show me that the fierce battles you fight elsewhere in the university arena and within your scholarly discipline can be fought for less visible colleagues. Perhaps you can go to the mat for your department as a whole and possibly the future of your … our … academic discipline.
Some people think instructors of a certain age have lost their currency, in every meaning of the word. I may find it hard to buy groceries and may need to take out a loan to buy required health insurance – I lack that currency – but I never lost my intellectual currency. If you think your adjuncts are stagnant or too tired to excel, do something. Evaluate, provide in-service … and be prepared to discover that you might be wrong.
An energetic, dedicated colleague with 40 years as an adjunct was extremely depressed in the fall. I had never seen her as anything other than capable and charismatic. Nonrenewed. No perceived deficiency in her skills – rather, new colleagues, new chair.
Another colleague has left the country, tired of not knowing how she would pay her bills.
I am now down at least one-third of my anticipated $30,000 income in a good year for teaching 10 to 13 courses annually at various schools. Ultimately, there is no Machiavelli guide to being an adjunct, though one might try to be strategic.
Personally, I rolled with the course assignments and never fussed when things didn’t go my way. It has been suggested to me by someone outside of academia that too smooth an employee may be perceived as disengaged. Want two classes? Get one … or expect two, then get one, if that. Always be prepared to be “bounced,” no matter what your load. Risk overload at multiple schools rather than not being able to pay bills. Teach morning, noon, night, weekend, online.
Some may be thinking: Get a real job? Jobs are not abundant in my region. Publishing? Dwindling. Libraries? Shrinking. Bookstores? Nonexistent. Human services? Despite rhetoric about our society’s mental health needs, few openings.
Alt-ac jobs on campus or lectureships at two-year schools? Have tried. Private high schools? Few slots, no go.
Someone said recently: I can’t imagine why an adjunct would keep at it after three years. I tried to find other paths. Ironically, every time I have applied for a full-time job that has not come through, full-time and part-time colleagues have said, “But you don’t really need the job. You have a spouse.” Is this the 21st century?
A well-meaning friend offered that a door shutting might mean a window opening. It feels, to me, like the door is shutting and the windows are painted shut.
Exit strategy and career plan are, of course, ultimately one’s own responsibility.
While I figure out what I can for myself: Can there please be forward thinking in colleges or universities on how to cultivate, advance or utilize existing talent without strategies that boot talented instructors out – deliberately or accidentally – in our maturity? Other industries value retention and experience.
And when it comes to classroom management, literacy acquisition, writing skills, minority outreach: Believe me, adjuncts can enter a campus discussion, given the chance.
Those on this path should be careful. After a termination, one may end up vulnerable while sick – or even dead – or, as I sense myself becoming, dejected. And as the case of Mary-Faith Cerasoli recently retaught me, I may be one illness or mishap away from the street.
This century may see things getting worse for adjuncts. In the unsolicited words of a former full-timer who left for greener pastures, “Don’t get caught” in the part-time pool.
But one could get caught.
Or set free at the absolutely worst moment.
The author has been a college instructor for more than 20 years.
In 1869, Charles W. Eliot, a professor at the Massachusetts Institute of Technology, wrote an essay in The Atlantic Monthly entitled “The New Education.” He began with a question on the mind of many American parents: “What can I do with my boy?” Parents who were able to afford the best available training and did not think their sons suited for the ministry or a learned profession, Eliot indicated, sought a practical education, suitable for business “or any other active calling”; they did not believe that the traditional course of study adopted by colleges and universities 50 years earlier was still relevant. Less than a year later, Eliot became president of Harvard. Among the reforms he initiated were an expansion of the undergraduate curriculum and substantial improvement in the quality and methods of instruction in the law school and the medical school.
The debate between advocates of traditional liberal learning and partisans of a more “useful” education, Michael Roth, the president of Wesleyan University, reminds us, has deep roots in American soil. In Beyond the University (Yale University Press), he provides an elegant and informative survey of the work of important thinkers, including Benjamin Franklin, Thomas Jefferson, Ralph Waldo Emerson, W.E.B. DuBois, Jane Addams, William James, John Dewey, and Richard Rorty, who, despite significant differences, embraced liberal education because it “fit so well with the pragmatic ethos that linked inquiry, innovation, and self-discovery.” At a time in which liberal learning is under assault, Roth draws on the authority of these heavyweights to argue that “it is more crucial than ever that we not abandon the humanistic frameworks of education in favor of narrow, technical forms of teaching intended to give quick, utilitarian results.”
Most of Beyond the University is devoted to claims by iconic intellectuals about the practical virtues of liberal learning, which Roth endorses (with occasional qualifications). Exhibiting a “capacious and open-ended” understanding of educational “usefulness,” Roth indicates, Thomas Jefferson opted for free inquiry at his university in Charlottesville, Va., to equip citizens in the new republic to think for themselves and take responsibility for their actions. Ralph Waldo Emerson resisted education as mere job training; but, he indicated, it should impart knowledge to develop individuals willing and able to use what we now call “critical thinking” to challenge the status quo.
Acknowledging that different people need different kinds of educational opportunities, W.E.B. DuBois nonetheless insisted that the final product of training “must be neither a psychologist nor a brick mason, but a man.” Liberal learning, Jane Addams emphasized, inculcates “affectionate interpretation,” which prepares individuals not only to defend themselves against those with different points of view, but to empathize with others and act in concert with them. And John Dewey, the most influential philosopher of education in the 20th century, looked to a liberal education, according to Roth, to help students learn the lessons of experiment and experience, by trying things out and assessing the results, by themselves and with others, and, then, if appropriate, revising their behavior.
Roth’s approach – a reliance on the authority of seminal thinkers – is not without problems. As he knows, the nature of higher education – and its perceived roles and responsibilities – has changed dramatically since colleges focused on liberal learning. In 1910, only 9 percent of students received a high school diploma; few of them went on to college. These days, about 40 percent of young men and women get a postsecondary degree. Undergraduate, master’s, and doctoral degrees, moreover, are now required, far more than in the days of Emerson and Eliot, for entry into the most prestigious, and high-paying, professions. Jamie Merisotis, president of the Lumina Foundation, is surely right when he asserts that “to deny that job skills development is one of the key purposes of higher education is increasingly untenable” – and that integration of specific skills into the curriculum can help graduates get work and perform their assigned tasks well.
Roth does not specify how liberal learning might “pull different skills together in project-oriented classes.” Nor does he adequately address “the new sort of criticism” directed at liberal learning. A liberal arts education, many critics now claim, does not really prepare students to love virtue, be good citizens, or recognize competence in any field. As Roth acknowledges, general education, distribution requirements, and free electives are not effective antidotes to specialization; they have failed to help establish common academic goals for students. And, perhaps most disturbingly, doubt has now been cast on the proposition that the liberal arts are the best, and perhaps the only, pathway to “critical thinking” (the disciplined practice of analyzing, synthesizing, applying, and evaluating information).
President Roth may well be right that liberal learning “will continue to be a fundamental part of higher education” if (and, he implies, only if) it rebalances critical thinking and practical exploration. The key question, it seems to me, is how to rebalance, while preserving the essence of liberal learning, at a time in which higher education in general and, most especially, the humanities are under a sustained attack by cost-conscious advocates of an increasingly narrow vocationalism, who are certain to be unpersuaded by the testimony of long-dead intellectuals. The task is all the more daunting, moreover, because it will have to be carried out by proponents and practitioners of the liberal arts, many of whom, unlike Michael Roth, are now in despair, in denial, or have lost faith.
Glenn C. Altschuler is the Thomas and Dorothy Litwin Professor of American Studies at Cornell University.