Trigger warnings in the classroom have been the subject of tremendous debate in recent weeks, but it’s striking how little the discussion has contemplated what actual trigger warnings in actual classrooms might plausibly look like.
The debate began with demands for trigger warnings by student governments with no power to compel them and suggestions by administrators (made and retracted) that faculty consider them. From there the ball was picked up mostly by observers outside higher ed who presented various arguments for and against, and by professors who repudiated the whole idea.
What we haven’t heard much of so far are the voices of professors who are sympathetic to the idea of such warnings talking about what they might look like and how they might operate.
As it turns out, I’m one of those professors, and I think that discussion is long overdue. I teach history at Hostos Community College of the City University of New York, and starting this summer I’m going to be including a trigger warning in my syllabus.
I’d like to say a few things about why.
To start off, I think it’s important to be clear about what trigger warnings are, and what purpose they’re intended to serve. Such warnings are often framed — and not just by critics — as a “you may not want to read this” notice, one that’s directed specifically at survivors of trauma. But their actual purpose is considerably broader.
Part of the confusion arises from the word “trigger” itself. Originating in the psychological literature, the term can be misleading in a non-clinical context, and indeed many people who favor such warnings prefer to call them “content warnings” for that reason. It’s not just trauma survivors who may be distracted or derailed by shocking or troubling material, after all. It’s any of us, and a significant part of the distraction comes not from the material itself but from the context in which it’s presented.
In the original cut of the 1933 version of the film "King Kong," there was a scene (depicting an attack by a giant spider) that was so graphic that the director removed it before release. He took it out, it’s said, not because of concerns about excessive violence, but because the intensity of the scene ruined the movie — once you saw the sailors get eaten by the spider, the rest of the film passed by you in a haze.
A similar concern provides a big part of the impetus for content warnings. These warnings prepare the reader for what’s coming, so their attention isn’t hijacked when it arrives. Even a pleasant surprise can be distracting, and if the surprise is unpleasant the distraction will be that much more severe.
I write quite a bit online, and I hardly ever use content warnings myself. I respect the impulse to provide them, but in my experience a well-written title and lead paragraph can usually do the job more effectively and less obtrusively.
A classroom environment is different, though, for a few reasons. First, it’s a shared space — for the 75 minutes of the class session and the 15 weeks of the semester, we’re pretty much all stuck with one another, and that fact imposes interpersonal obligations on us that don’t exist between writer and reader. Second, it’s an interactive space — it’s a conversation, not a monologue, and I have a responsibility to encourage that conversation as best I can. Finally, it’s an unpredictable space — a lot of my students have never previously encountered some of the material we cover in my classes, or haven’t encountered it in the way it’s taught at the college level, and don’t have any clear sense of what to expect.
For all these reasons, I’ve concluded that it would be sound pedagogy for me to give my students notice about some of the challenging material we’ll be covering in class — material relating to racial and sexual oppression, for instance, and to ethnic and religious conflict — as well as some information about their rights and responsibilities in responding to it. Starting with the summer semester, as a result, I’ll be discussing these issues during the first class meeting and including a notice about them in the syllabus.
My current draft of that notice reads as follows:
Course Content Note
At times this semester we will be discussing historical events that may be disturbing, even traumatizing, to some students. If you ever feel the need to step outside during one of these discussions, either for a short time or for the rest of the class session, you may always do so without academic penalty. (You will, however, be responsible for any material you miss. If you do leave the room for a significant time, please make arrangements to get notes from another student or see me individually.)
If you ever wish to discuss your personal reactions to this material, either with the class or with me afterwards, I welcome such discussion as an appropriate part of our coursework.
That’s it. That’s my content warning. That’s all it is.
I should say as well that nothing in these two paragraphs represents a change in my teaching practice. I have always assumed that if a student steps out of the classroom they’ve got a good reason, and I don’t keep tabs on them when they do. If a student is made uncomfortable by something that happens in class, I’m always glad when they come talk to me about it — I’ve found we usually both learn something from such exchanges. And of course students are still responsible for mastering all the course material, just as they’ve always been.
So why the note, if everything in it reflects the rules of my classroom as they’ve always existed? Because, again, it’s my job as a professor to facilitate class discussion.
A few years ago one of my students came to talk to me after class, distraught. She was a student teacher in a New York City junior high school, working with a social studies teacher. The teacher was white, and almost all of his students were, like my student, black. That week, she said, one of the classes had arrived at the point in the semester given over to the discussion of slavery, and at the start of the class the teacher had gotten up, buried his nose in his notes, and started into the lecture without any introduction. The students were visibly upset by what they were hearing, but the teacher just kept going until the end of the period, at which point he finished the lecture, put down his papers, and sent them on to math class.
My student was appalled. She liked these kids, and she could see that they were hurting. They were angry, they were confused, and they had been given nothing to do with their emotions. She asked me for advice, and I had very little to offer, but I left our meeting thinking that it would have been better for the teacher to have skipped that material entirely than to have taught it the way he did.
History is often ugly. History is often troubling. History is often heartbreaking. As a professor, I have an obligation to my students to raise those difficult subjects, but I also have an obligation to raise them in a way that provokes a productive reckoning with the material.
And that reckoning can only take place if my students know that I understand that this material is not merely academic, that they are coming to it as whole people with a wide range of experiences, and that the journey we’re going on together may at times be painful.
It’s not coddling them to acknowledge that. In fact, it’s just the opposite.
Angus Johnston teaches history at Hostos Community College and is the proprietor of the website studentactivism.net.
There has been extensive hand-wringing about what can be done to help young graduates succeed in today’s tough labor market – especially in the spring, as high school seniors decide on their college offers, and college seniors prepare to graduate and face the world. Unemployment and underemployment rates among recent college graduates in the United States – largely a result of the recession’s lingering damage – are too high. And we’ve all seen the headlines questioning the value of college and the surveys that show employers bemoaning the “preparedness gap.”
But I am full of optimism.
As a university president, I spend far too much time among skilled, talented, motivated young people to be anything but hopeful about the future of higher education and the capabilities of the millennial generation – those born roughly between the early 1980s and the early 2000s. And honestly, surveys by my institution, Bentley University, of recruiters and students don’t reflect these headlines.
It’s perplexing. Is there really such a disconnect between this generation and good jobs? And if there is, let’s figure out how to resolve it instead of repeatedly touting the problem. So we chose to dig a little deeper and try to uncover the real issues. How do key stakeholders actually view the preparedness issue? And, more important, what will it take to ensure that millennials are fully prepared to succeed in the workplace?
We commissioned KRC Research to conduct a comprehensive preparedness survey of over 3,000 stakeholders, including employers, higher education leaders, students, parents, and recent college graduates. The survey found consensus in surprising places -- from rating recent graduates’ level of workforce preparedness to defining exactly what preparedness means.
One of the most interesting sets of findings revealed that businesses are conflicted about the skills they want in their new employees and, consequently, are sending mixed messages to the marketplace. A majority of business decision-makers and corporate recruiters say that hard and soft skills are equally important for success in the workplace. (Hard skills are tangible ones, such as a student’s technical and professional skills, while soft skills are interpersonal ones, such as communication, teamwork and patience.)
Yet when asked to assess the importance of a comprehensive set of individual skills, business leaders put soft skills at the top of their list and industry and job-specific skills at the bottom; only 40 percent of employers say that the latter are important to workplace success. But while employers say soft skills are vital to long-term career success, they prefer to hire candidates with the industry-specific skills needed to hit the ground running, even if those candidates have less potential for future growth.
In the face of such conflicting information from employers, how should students and educators respond? Should they emphasize soft skills or hard skills?
The answer: This is a false choice. Students don’t need to – and shouldn’t have to – choose between hard and soft skills. It’s important for colleges to arm students with both skill sets -- whether a student is majoring in business or literature. By developing curriculums that fuse liberal arts and professional skills and by providing hands-on learning experiences, we can give our students the range of skills that are critical for the modern workplace.
This “fusion” was one of the popular solutions tested in the survey, and many schools are doing it already. Brandeis University, a private university with a liberal arts focus, says that its new undergraduate business program is already one of its most popular majors. (Brandeis points out that most of its business majors are double majors.) At West Virginia University, the College of Business and Economics and the School of Public Health have partnered to create a dual-degree program that will infuse business skills into the field of public health. At Georgetown’s McDonough School of Business, students in the freshman “Ethics of Entrepreneurship” seminar take on a semesterlong project designed to help them flex their critical thinking and writing muscles in a global and social framework.
Bentley has also adopted several strategies to ensure we are preparing our students for success. Virtually every student here majors or minors in business, while simultaneously pursuing a core of arts and sciences courses that focus on expanding and inspiring traditional “business” thinking. We recently expanded on our popular liberal studies major, an optional second major combined with a business major, by launching six-credit “fusion” courses co-taught by business and arts and sciences faculty. Combinations include a management course (Interpersonal Relations in Management) with an English course (Women and Film) to explore how women are perceived in film and how this can affect management styles; and a global studies course (U.S. Government and Politics) with an economics course (Macroeconomics) to teach how politics and economics work together and to demonstrate that understanding both is often essential to doing either one well.
All this study must be combined with hands-on, “experiential” learning – the pathway to hard skills. This is where business organizations can play an important role. Santander, the multinational bank, created a scholarship program to support academic, research, and technological projects – we are proud to be one of the 800 institutions in its program. Corporate partners can also help shape curriculums to teach skills as they are actually practiced in the workplace. EY LLP (formerly Ernst and Young) worked closely with us to merge accounting and finance for freshmen and sophomores, since those disciplines are inextricably linked in the business environment.
These strategies aim to equip students with both hard and soft skills, and they can be adopted and adapted by many colleges. A challenge in higher education is that some academic models can be so discipline-specific that students miss out on cross-disciplinary opportunities to integrate their knowledge. But it doesn’t have to work this way.
Like other colleges and universities that are innovating and experimenting, we are seeing returns on this curricular investment. One way to measure this: our survey of the Class of 2013 shows that 98 percent of responding graduates are employed or attending graduate school full time (this includes information from 95 percent of the class). Retention, number and availability of internships and repayment of student debt are also key metrics.
I encourage my higher education colleagues to refocus their attention on the ways we can work together to strengthen our education models. Millennials, a group that includes our current students, are counting on us to prepare them for successful careers and life. And in the long run, it is an economic imperative that we do so.
Gloria Cordes Larson is president of Bentley University.
Americans don’t like cheaters. Yet when it comes to how we learn and what we’re able to do with our acquired knowledge, a game is being played, and many will find themselves systematically locked out of opportunity.
This is not about students cheating on tests or principals downplaying ineffective teaching strategies. Nor is it about the latest argument concerning higher education — that college is too expensive and there’s no guarantee of gainful employment. It is a national reckoning of how much we’re willing to tolerate regarding class, status and the suppression of economic mobility. This issue demands that we take responsibility for the way that our educational decisions play out in our lives and throughout our communities. Until we take ownership of these things, we will continue to play a fool’s game of winners and losers.
For the vast majority of Americans — myself included — a college education remains the key to an engaging, financially viable life. Nothing should be done to disrupt this trusted vehicle by zeroing in on the undergraduate degree solely as preparation for a first job whose “of-the-moment” skills and knowledge are likely to be eclipsed in short order in a rapidly changing economy.
I am a first-generation college student. My father, while I was growing up, was an assembly line worker making wooden boxes and a cook at a hospital. My mother did not work outside the home.
It was their conviction that I would receive the education that those who traditionally succeeded in America -- generation after generation -- already enjoyed.
And that was a liberal arts education. My parents didn’t really understand what a liberal arts education was, but they knew they wanted it for me. I was not about to be cheated out of an education that would carry me through a lifetime of self-inquiry, engagement and changing job opportunities.
But today’s stormy economy, and with it, constant rhetoric from self-appointed critics dubbing the liberal arts “useless” as opposed to training for a first job, cause people to have doubts. The liberal arts — even as a complement to vocational education as in Germany where the technical economy is thriving — are glibly declared without value.
Is it not suspicious, however, that at the very time more and more aspiring students from challenging, non-middle-class backgrounds seek higher education, those who have already achieved, often on the basis of a liberal education, want to redefine the rules?
This is the game. It’s as close as America gets to hereditary power. And it is won in two relatively simple steps: redefine the very notion of student success on the basis of landing that first job; and keep those without privilege away from the liberal arts — a historical source of power and mobility in the middle-class culture that defines higher education. There is no doubt that those who are trying to tear down the traditional undergraduate degree would not permit their own children to be limited to a strictly vocational education.
So, while we are encouraged to fret over college costs, the supposed unmarketability and uselessness of the philosophy major, or something else similarly distracting, we’re letting the great equalizer of the college degree and a trusted path to leadership get away from us.
Some institutions are not waiting for everyday Americans to catch on. At the University of Baltimore, for example, a state institution committed to open-access admissions at the undergraduate level, a rigorous examination of the challenges faced by and resources available to its students for success has been taking place for the past year. The goal is to provide students the academic and non-academic interventions that help them complete a career-oriented college-level course of study at a reasonably low cost and in a reasonable amount of time. These students didn’t grow up believing that they have all the time in the world to mature. They were not told every day that they are “great.” Many are first generation college students and come from nontraditional pathways to the university. There is a mix of ages. Urgency defines these students’ ambition. The university has no intercollegiate athletics and its residence facilities are minimal. The city often serves both its students’ residential and social life.
But in its effort to increase student success, the university is not forgoing its historic commitment to the applied liberal arts. It offers a relatively modest number of majors that are preparatory to a range of careers — business, criminal justice, human sciences and management, digital communications, simulation and digital entertainment, psychology, jurisprudence, and integrated arts. All of these are taught within a liberal arts context. At its curricular core, the university has always been about a productive and imaginative intersection of theory and practice, defined by the applied liberal arts in the service of employment.
For example, all publication design majors are required to complete Visual/Verbal Rhetoric, and all digital communication majors are required to take Rhetoric of Digital Communications. Both courses are grounded in rhetorical theory, from Aristotle and Burke to McLuhan, Toulmin and Barthes. Students analyze and apply aesthetic and rhetorical theory to visual products -- advertisements and other graphic-design materials, television shows and movies, public relations and marketing. The infusion of the liberal arts into the fundamentals of applied courses of study began when these programs were introduced, in many cases decades ago. This deliberate infusion is further strengthened by locating the practical programs within the Yale Gordon College of Arts and Sciences. Here is a university where open-access students are not about to be intellectually shortchanged, even as they face the imperative to transition from study to work.
When our nation’s founding fathers made their original commitment to higher education, they envisioned a useful liberal arts education that would permit citizens to participate productively over a lifetime in the social, political and economic arenas of democracy. There was no dichotomy between the liberal arts and employment in the distinctively American college education.
Today, we are putting that meritocratic potential at risk. Ralph Waldo Emerson asserted nearly two centuries ago in his essay "Self-Reliance" that a distinction of the American people — a key to their inventiveness and advancement — is the ability to entertain two seemingly contradictory notions at once. Those who would drive a wedge between the liberal arts and jobs are destroying that distinction and limiting human potential. A culture of inherited privilege is still doggedly hanging on, seeking to supplant the individual’s talent and ambition. The walls are still up and being defended under the seductive guise of a narrow education for the first job. And a lot of folks are being cheated as college success is redefined. For when college success is redefined, so is life success.
William G. Durden is president emeritus of Dickinson College and a newly appointed research professor in the Johns Hopkins School of Education and operating partner at Sterling Partners. This essay is adapted from a talk he gave at the University of Baltimore.
Most of my faculty colleagues agree that Writing Across the Curriculum (WAC), in which the task of teaching writing is one assigned to all professors, not just those who teach English or composition, is an important academic concept. If we had a WAC playbook, it would sound something like this: students need to write clear, organized, persuasive prose, not only in the liberal arts, but in the sciences and professional disciplines as well. Conventional wisdom and practical experience tell us that students’ ability to secure jobs and advance in their careers depends, to a great extent, on their communication skills, including polished, professional writing.
Writing is thinking made manifest. If students cannot think clearly, they will not write well. So in this respect, writing is tangible evidence of critical thinking — or the lack of it -- and is a helpful indicator of how students construct knowledge out of information.
The WAC playbook recognizes that writing can take many forms: research papers, journals, in-class papers, reports, reviews, reflections, summaries, essay exams, creative writing, business plans, letters, etc. It also affirms that writing is not separate from content in our courses, but can be used as a practical tool to apply and reinforce learning.
More controversial — and not in everyone’s playbook -- is the idea that teaching writing skills cannot be delegated to a few courses, e.g., first-year composition courses, literature courses, and designated “W” (writing-intensive) courses. Many faculty agree with the proposition that writing should be embedded throughout the curriculum in order to broaden, deepen and reinforce writing skills, but many also take the “not in my back yard” approach to WAC.
We often hear the following refrains when faculty discuss students and writing. Together they compose a familiar song (sung as the blues):
1. “I’m not an English teacher; I can’t be expected to correct spelling and grammar.”
2. “I don’t have time in class to teach writing — I barely have enough time to teach content.”
3. “Why should students be penalized for bad writing if they get the correct answer?”
4. “Mine isn’t supposed to be a ‘W’ course, so I’ll leave the writing to others.”
5. “There is no way to work writing into the subject matter of my course.”
6. “They hate to read and write and won’t take the time to revise their work.”
7. “I don’t have a teaching assistant and don’t want to do a lot of extra correcting—I have enough to do.”
8. “Our students come to college with such poor writing skills that we can’t make up for years of bad writing.”
9. “They never make the corrections I suggest; I see the same mistakes over and over again, so why bother?”
10. “They’re seniors, and they still can’t write!”
Much has been written about WAC, and I add my voice to the multitudes because I recently came to a realization, watching my students texting before class began: students spend hours every day reading and practicing writing — bad writing. How many hours are spent sending and reading tweets, texts and other messages in fractured language? It made me wonder: is it even possible to swim against this unstoppable tide of bad writing? One of my colleagues argues that students cannot write well because they don’t read. I think that students do read, but what they spend their time reading is not helpful in learning how to write. (That, however, is a discussion for another day.)
I’m not sure that all students can be taught to improve their writing, but I am sure that it is one of the most important things we can attempt to teach. What difference does it make if students know their subject matter and have excellent ideas if no one can get past their sloppy and disorganized writing?
Let us consider (with annoying optimism) those sad faculty refrains.
“I’m not an English teacher; I can’t be expected to correct spelling and grammar.”
But we are college professors; we know more about writing than our students do. What you could do, if you don’t want to make corrections yourself or are stymied by the magnitude of a particular writing problem (where to begin?), is circle areas for revision and require the student to submit the work to the tutoring or writing center before a grade will be given. (You can even allow several opportunities for revision, depending on your tolerance for pain.) You can designate a certain number of points in your rubric to writing mechanics, letting students know that their grades will be affected by their writing; human nature being what it is, students pay more attention when they know they will be graded.
Most important, we can all emphasize that writing is important in our disciplines and that students will be judged in the workplace on the basis of their writing skills. We can all convey the message that polished prose matters to us and to professionals in our field — so much so that we are taking points off for sloppy work.
“I don’t have time in class to teach writing — I barely have enough time to teach content.”
Do you have time to assign minute papers at the beginning or end of each class, asking students to summarize three things they learned, or pose a question related to the day’s work, or answer one question based on the previous reading assignment? These papers are short and easily graded; they help students internalize and reinforce content. They each can be worth a few points, based on quality. If assigned on a regular or irregular basis (like a pop quiz), you may even get students to keep up with the reading and pay more attention in class. Minute papers encourage students to organize their thoughts; I discovered that students who could not speak coherently in class sometimes produced thoughtful short essays. Writing can be used in many ways to learn content and improve fluency and writing proficiency.
“Why should students be penalized for bad writing if they get the correct answer?”
Bcuz omg in the workplace they will be penalized for it. Ignoring student errors is like ignoring the piece of spinach in someone’s teeth; it may seem kind not to say anything, but no one really benefits. We can assign more writing in our courses, but if it is never graded, it may improve fluency but not accuracy — and confirm bad writing habits. Take a guess: over four years, what percentage of written assignments at your institution is graded for writing mechanics as well as content?
“Mine isn’t supposed to be a ‘W’ course, so I’ll leave the writing to others.”
Leaving WAC to others is like leaving voting to others. If WAC is viewed as an institutional playbook, it implies that everyone is part of the team and plays a position. All courses should be writing courses with a small w if not a big W; that is the only way to convey the message that what students learn in Composition 101 is relevant to success in their upper-level psychology course or business minor. Furthermore, since each discipline has its own rhetoric, it is particularly important for students to practice the specific types of writing they will be asked to produce in their careers. They will not be exposed to professional writing in their first-year seminars and English composition courses.
“There is no way to work writing into the subject matter of my course.”
Physicists, pathologists, geologists, mathematicians, dentists, lab technicians, engineers, architects, web designers, curators, forensic anthropologists and others have to explain things in writing; in an algebra course, for example, students could explain their reasoning on a given problem. No matter what the field, the ability to organize information in writing is a key professional asset, whether writing is used in a patient history, business contract or gallery brochure. We can invent ways to bring theory into practice by creating opportunities for students to write in the language of their careers.
“They hate to read and write and won’t take the time to revise their work.”
Yes, for many of our students, academic reading and writing seem to be unnatural acts. Some students, for example, seem much more themselves, much more authentic and engaged, on the soccer or football field.
One day in late autumn, on a perfect, still, golden afternoon, I stopped to watch the football team practice. The camaraderie, the sense of purpose, the sheer joy were poignant, as I pictured these young men paying mortgages and sitting in cubicles. Our job is to coach them safely into their futures, into different green pastures. Part of the playbook for that is to insist that they improve their writing skills so that their writing does not undercut their potential — even if they are not there yet, not fully ready to commit to academic work.
My other thought that afternoon was, can we make learning as engaging and authentic as sport? We each have to answer this question in our own way. In my law classes, for example, I ask students to write legal memorandums using the IRAC method: “You are a junior associate in the firm of Flake, Moss and Marbles, and your senior partner wants you to research and write a memo on the case of Madame X, who… .” The IRAC method not only structures the memo for students (they summarize the facts of the case, Identify the legal issues, cite the relevant Rules of law, Analyze the problem based on the facts and law, and draw a Conclusion on the likely outcome of the case), but allows them to role-play a real-world situation. They complete a series of these short writing exercises, with a rubric to guide them, and have several opportunities to revise their work.
For a formal or high-stakes writing assignment, scaffolding is essential; students will perform better when the structure of the writing assignment is broken down into components, which, when assembled, produce a coherent whole. The IRAC method has a built-in scaffold, but other writing assignments can be structured into a series of elements or steps. It is a mistake to assume that students know how to organize a paper or report; let them know what you are looking for, break down the structure into elements, and if you have a good sample of what you expect, hand it out. (Save your students’ work for this purpose.)
In my mediation class, students are asked to draft an agreement based on a mediation role-play they have participated in. The agreements follow a structured blueprint. They are peer-edited, revised by the student (with a writing tutor, if necessary) and then corrected by me. Students are given model agreements from past years and have three opportunities to revise their work prior to grading. Last semester, 18 of 19 students revised their work and received As on the agreements. The agreements were polished and professional and reinforced the content taught in the course.
I believe that we can devise meaningful and engaging ways for students to write in all courses; the challenge is to explain to students why they are doing it. Writing should be like driver’s ed in students’ minds -- a practical skill that is essential to their future success. Without that connection, writing will seem more like juggling: nice if you can do it, but not an essential life skill.
“I don’t have a teaching assistant and don’t want to do a lot of extra correcting — I have enough to do.”
Most of us don’t have teaching assistants, but we do have students for peer editing, and writing or tutoring centers with support staff. Some degree programs have upper-class peer mentors who can help students with writing in the discipline. Consider ways to form a writing partnership, using the resources available to you. Personally, I prefer that students take responsibility for their revisions by seeking out support services. Somehow, it doesn’t seem kosher to make all these corrections, have students incorporate them into their next draft, and then grade my own language, saying “good word choice,” “nicely written,” or “well organized!” I like to circle areas for improvement, making general comments, not specific corrections.
“Our students come to college with such poor writing skills that we can’t make up for years of bad writing.”

Some students will make little progress in improving their writing, for a variety of reasons. But if we accept students into our institutions, we should provide opportunities for them to improve their writing skills, even if some students are the proverbial horses who won’t drink. If students practice and are graded on their writing in only a few courses, they learn: 1) that in most courses they can get a decent grade without decent writing, and 2) that writing is relevant only in a few contexts. If we insist that career preparation includes the process of writing and revision, and we all assign meaningful writing exercises that students can revise and improve, the rest is up to them.
“They never make the corrections I suggest; I see the same mistakes over and over again, so why bother?”
When students start losing points, they tend to sit up and take notice. I’ve found that many mistakes are careless ones — what I call a document dump, turning in a first draft with no proofreading. If you hand back a draft and deduct points for writing errors, you will see more effort to correct those mistakes. Why should students devote time to an ungraded exercise when they can spend their time on something that will affect their grades? If sloppy writing has no impact on their grades, it makes sense for students not to internalize your corrections or prioritize revisions.
“They’re seniors, and they still can’t write!”
If we can agree on the value of a WAC playbook, not just in theory but in our daily practice; find ways to weave writing into all of our courses, not as busywork but as a meaningful part of the content we teach; assess student writing and promote it as an essential career skill; and allow students to revise their work, since revision is the heart and soul of the writing process, then we are less likely to encounter seniors who have not practiced or improved their writing skills over four years. Our playbook should read that all courses, from now on, are writing courses with a small w.
Ellen Goldberger is director of the Honor Scholars Program and teaches law, leadership and conflict resolution courses at Mount Ida College.
When I began my career as a faculty member many decades ago, I had the good fortune to find myself in an especially distinguished department at an especially eminent research university. It was the custom of this department to gather for a faculty luncheon once a week and then to proceed to a departmental seminar in which we heard either from a visiting colleague or one of our own members. In the discussion period following the talk, questions generally had more to do with the ongoing research of the questioner than with the research of the speaker. Since all members of the department tended to be engaged in consequential research, the overall quality of the discussion was high -- although proceedings tended to take on a somewhat predictable, ritualized character.
To be sure, department members were sincerely interested not only in their own research, but also in the research of their colleagues, and would often engage in conversation on these matters. This was known as discussing one’s “work.” Teaching was not considered a part of such “work,” even though many members of the department were dedicated, effective teachers. Teaching was basically a private matter between a faculty member and his or her students. I had the distinct sense that it would not be to my professional advantage to engage in discussion about my teaching; indeed, I sensed that it might be the conversational equivalent of a burp.
Back in the 1950s, the sociologist Alvin Gouldner did some interesting work on the culture of faculty members and academic administrators at a liberal arts college. He was following up on Robert Merton’s general idea about the social significance of “latent,” as opposed to “manifest,” roles – that is, how roles not recognized explicitly, and not carrying official titles, might be of central importance in social life. In the academic context, manifest roles would include those of “dean,” “faculty member,” “student,” etc. The latent roles that Gouldner found especially important were those of “cosmopolitan” and “local”: roles that were not consciously recognized by overt labels, but which were consequential to the actual culture and social organization of the institution.
Cosmopolitans were those whose primary focus was their profession, as opposed to the institution where they were employed. Thus, a faculty member in this category would, for example, take a job at a more prestigious university that was stronger in his or her own field, even if it meant a lower salary. (Gouldner’s research was carried out at a time when it was apparently conceivable for a liberal arts college to offer a larger salary than a research university). Locals, on the other hand, were loyal first and foremost to the institution; they were usually not productive as scholars. At the time of Gouldner’s study, administrators generally fell into the category of locals.
Much has changed since that time. There has been a general move toward cosmopolitanism on the part of administrators, who have developed professional associations of their own and are more likely to go from one institution to another. As for faculty, their world has seen a widening gap between elite cosmopolitans and indentured locals -- adjuncts tied to low-paying jobs relatively close to home, not the kind of locals who have been given any reason to develop institutional loyalty.
A question, then, for faculty members today is how best to balance concern for their profession with concern for their institution. One promising way is to care seriously and deeply for one’s students – since they are, after all, a major part of one’s vocation, in addition to paying most of the bills. And this means taking a more intentional, sophisticated approach to teaching.
To be sure, different institutions have different missions. Research universities, in particular, are crucial to the advancement of knowledge and must thus concern themselves with leading-edge science and scholarship. Even here, however, not all graduate students are themselves headed for major research universities -- far from it. Thus, graduate faculties in research universities are coming to feel responsible for preparing students for the future careers they will actually have. In part, this will mean exploring possibilities beyond the academy. It will also mean creating effective programs for preparing graduate students as teachers for a wide range of students.
The development of such programs has been a focus for the Teagle Foundation in recent years. This has involved supporting universities in their efforts to expose graduate students to what cognitive psychology has taught us about learning; to the pedagogical approaches and styles that have proven most effective; and to the forms of assessment most relevant to the improvement of teaching. More generally, it means leading faculty to feel that they are not only a community of scholars, but also a community of teachers.
It has been suggested that the preparation of graduate students for teaching would be well-served if there were different faculty “tracks,” with some department members being primarily responsible for preparing researchers while others are primarily responsible for preparing teachers. While it is certainly true that not all members of a department have to make the same kind of contribution to the overall success of the program, formalizing such a separation between research and teaching would simply reinforce the caste system already in place -- not to mention the fact that many distinguished researchers are also exceptional teachers and that student engagement in research is an important teaching strategy. So, while there might be some value in having a pedagogical specialist (or more) on the roster, it is not desirable to have a tracking system that segregates teaching from research.
Here, then, is the general goal: just as faculty members would never think of being unaware of what peers are doing in the same field of research, so they should feel a comparable impulse to be aware of what their colleagues are doing in their areas of teaching. And thus, the world of higher education can become even more a true community.
Judith Shapiro is president of the Teagle Foundation and former president of Barnard College.