The Dreaded Grade Appeal

During a routine conversation about the semester, curriculum, and student population, a colleague of mine burst in with a frustrated comment about grade appeals. He thinks that we’re seeing more formal grade complaints than in past years. A dozen contacts at community colleges and universities seem to agree; we’re seeing more and more students going to the administration to complain about individual assignment grades, course policies, and final course grades. On a bad week, I will see more students in my office wrangling over assignment grades than students truly hoping to improve their academic performance. It’s depressing. Like many of my academic friends, I want to blame the generational divide for what looks like an increase in the number of grade appeals. After watching “I Love the 80’s” every night for a week, I want to wail and cry, mumbling that this new generation just doesn’t understand. They have no sense of what’s appropriate. They don’t respect authority. And their sense of entitlement is overwhelming. That, my friend, is what’s causing this increase in grade appeals across the nation.

Maybe. Maybe not.

When I turn off VH1 for a moment, I start to sort out some of what’s underneath this blanket statement that it’s us against them. Yes, the new Millennial students have a different sense of hierarchy than middle-aged folks like me. In the 70’s and 80’s, most administrators of businesses hid behind heavy doors and left customers to talk to counter staff or receptionists. Today, many businesses are transparent. The Internet allows customers to find out the name of the owner of even the largest business and with a click, e-mail them directly about a concern. In forums and chat rooms, anonymous posters can reveal an opinion about anything at any time. No one knows the poster’s age, gender, level of education, culture, or social status. In a way, this is the most democratic of processes. Of course this may have been one of many reasons why our traditional authoritative structure has shifted and changed in the last few decades. And this might explain the occasional “That’s just your opinion” response I receive when I return an essay to a student with comments and a rubric. After all, in the online world, all opinions seem to be of equal value. For the less experienced student, having one’s roommate, boyfriend, or role-playing forumites reading one’s work may be just as useful as having a trained tutor or instructor take the time to critically read and make suggestions. Maybe.

And maybe my students’ increased comfort with exposing their ideas online (or elsewhere) could help convince them that there is no hierarchy in knowledge — just fantastic bits and pieces of wisdom gleaned through online forums and blogs. Sewn together, this patchwork may seem just as valuable as the scholarly journal that is edited and produced by Ph.D.'s at a respected institution. And my students’ cauldron of original thought is available at 3 a.m. with the click of a button.

Sometimes I agree with colleagues who feel that the recession has not only forced students to feel desperate to get a degree, but also encouraged our administrators to reach farther and farther out to recruit students to support programs developed decades ago. And maybe we are approaching less qualified students. But I also know that I love teaching. And one reason for that is the occasional surprise brought on by what we would have called an “unqualified” student who suddenly becomes interested in a subject, changes his or her major, and pursues a certificate or degree — something that no one could have predicted. Lives are changed and generations feel the impact. For that I will slog through the stack of papers that simply restate the same lukewarm opinions again and again. After all, hidden in that towering stack (or the next stack) may be the paper that reveals an “Aha!” moment for a student whom others may see as “unqualified.” This is the reward that goes beyond the student.

I do think what is behind the increase in grade appeals is more complex than a generational split. Some of the reasons for students’ grade appeals are age-old. Yes, our institutions are more transparent and administrators are more available. Yes, our administrators may be under increasing pressure from students, parents, and the community to provide a certificate or degree to a student where a high-school diploma may have sufficed 10 or 20 years ago. And yes, our digital-native students may have more confidence questioning authority or structures that seemed unapproachable years ago. Still, according to a few administrators I’ve worked with, the complaints are often the same — vague class requirements, uneven enforcement of policies, and poor communication head the list.

After serving on a formal grade appeals panel at my community college, I vowed to simplify my own class policies and put into place some very comprehensive (and visible) statements on difficult topics like plagiarism and academic dishonesty. Not only do I state verbally and in writing what is necessary to pass my course — I now quiz my students so that I can reassure my administrators that on the first day of class, out of 24 students, 24 demonstrated that they understood the most important class policies and requirements. Of course this won’t guarantee that I won’t suffer a formal grade appeal later that semester; still, it gives me some confidence that not only will I be able to show that my requirements were clear, but that the student had at one point reiterated those requirements to me.

Why the push to avoid grade appeals? Like other not-yet-tenured instructors, I realize that no matter how positive my reviews, if I receive too many grade appeals, I may not be given tenure. And my adjunct friends have it even worse. Complaints and grade appeals often mean not being offered work the next semester. And for those seeking full-time work, this can be the black mark that means no interview when the next full-time position becomes available. Experienced colleagues may see a certain number of complaints and grade appeals as healthy; they often indicate an instructor who is rigorously teaching the curriculum. Still, those of us who have been in education for some time understand how multiple grade appeals will be viewed by the administration. Reviewing one’s materials for clarity, spelling out expectations in many formats, and attempting to minimize miscommunication would have a positive impact on one’s teaching in any case.

This last year, I also talked to colleagues at length about how they handled attendance, absences, make-up work, and late work for their courses. I then altered my own policies to reward students for their attendance and hard work (the carrot) rather than punish them for a lack of attendance and missed work (the stick). Rather than assign a specific percentage for attendance and then take away points when students are not present, I now give students points for a short quiz given at the start of each class. I still have strict requirements for passing the course, but my mentors assured me that this small change would help students perceive me as fair and less cynical. In just one semester, I experienced a significant drop in the number of students who marched into my associate dean’s office to complain about my teaching (or grading).

One great read on grade appeals is Marcia Ann Pulich’s “Student Grade Appeals Can Be Reduced,” published in 1983 in Improving College and University Teaching. Although it’s dated, many of the concepts are still applicable. In short, Pulich advises professors to communicate grading policies clearly and stick to them. She advocates a simple grading method and recommends that professors check to see that students understand individual grades and how they relate to their final course grade.

An experienced colleague I know uses a simple computation for final grades — each assignment is worth points that add up to 1,000. Students can clearly see how they’re doing at any stage in the course. I weight grades instead, stating the total percentages for each area on my syllabus. I’ve also had support staff at my college add my name as a student to my Blackboard sites for all my courses. I then load in some grades for assignments and project this view on an overhead several times during the semester so that I can explain in detail how each assignment’s percentage feeds into the final grade. Since this corresponds visually to the percentages listed on my syllabus, students often have fewer questions and complaints later in the semester.
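For readers who like the arithmetic spelled out, here is a minimal sketch in Python of a weighted scheme of the kind I describe. The category names, weights, and scores below are hypothetical examples, not the ones on my actual syllabus:

# A minimal sketch of a weighted final-grade computation.
# Category names, weights, and scores are hypothetical examples.
weights = {"essays": 0.50, "quizzes": 0.20, "participation": 0.10, "final_exam": 0.20}
scores = {"essays": 85, "quizzes": 92, "participation": 100, "final_exam": 78}  # 0-100 averages

final_grade = sum(weights[area] * scores[area] for area in weights)
print(f"Final grade: {final_grade:.1f}%")  # 86.5 with these sample numbers

My colleague’s 1,000-point system is even simpler: add up the points earned and divide by 1,000.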

Pulich advocates concrete responses to students’ inquiries. She states that on an essay, the instructor’s comments should justify a lower grade. I also use a customized rubric that shows how a student fares in a number of areas, including content, logic, structure, and mechanics. There’s no mystery to this rubric; in fact, students have already seen this instrument before they’ve completed their written work. Before we get started on that particular assignment, I not only show them sample student essays, but I also grade an essay (with comments and a rubric) in class on an overhead. This helps students understand what’s most important in their own work. They also feel less frustrated later if they don’t receive a perfect grade.

Like Pulich, I believe that some misunderstandings between student and instructor can be avoided by clear, concrete responses in verbal and written communication. In my early teaching days, I might have written to a student, “I’m concerned about your recent rough draft. Please see me immediately.” Today I would write, “I am giving this paper a zero because outside sources are not cited. If I don’t hear from you by Friday, September 25th, I will consider this a case of plagiarism and you will be failed in this course. If you contact me before Friday, September 25th, I will allow you to rewrite this material for your final draft without a late penalty.” I then copy the e-mail to myself, print out a copy of the e-mail to deliver to the student in person at our next class meeting, and wait for a reply. If the student replies by e-mail, I keep a copy of that message in a digital folder for the course and reply, reiterating my instructions. Perhaps this sort of rigidity isn’t necessary with upper-level courses and graduate students; however, in my area (developmental- and transfer-level English), providing deadlines and penalties ensures that I get a response from the student, helps them understand exactly what they must do to succeed, and protects me in case there are questions later.

Pulich suggests being clear about course policies — including vague categories like “participation.” Depending on the professor and the course, “participation” might mean speaking up in class; in other courses, it might mean simply attending, being on time, and not leaving early. If students’ grades are impacted by “participation,” this must be carefully spelled out in writing to avoid misunderstandings later. She also advocates grading “blind” — that is, without a student’s name on typed-up work. This keeps a professor from playing favorites, and even where favoritism is not a problem, it helps students see the grading process as more fair. With my hybrid and online courses, this is easy. When I use the assignment feature on Blackboard, I am often grading without a student’s name visible. With materials from traditional face-to-face courses, I often flip the first page of the essay over and start reading from that point. In both cases, I consult a rubric (customized for that assignment) again and again during a second read. This keeps me on target with the original assignment requirements.

Last, Pulich writes that one should be “human but fair.” Enforcing due dates and applying rules about late work (no credit, partial credit) for everyone keeps students from doing a slow burn and running to my administration as soon as class is over. This generation is surprisingly bold about sharing with other students information about the grades they’ve received and how an instructor has treated them. If I make an exception for one student, I can assume it will be common knowledge among my college’s student population almost instantly. But being “fair” is much easier than being “human.”

One strategy I’ve started to employ is an empathy line in e-mail replies to students’ requests. When students e-mail me with terrible news about their personal lives (a friend’s father died, they locked themselves out of their car, they broke up with their significant other) and ask to make up a quiz or turn in an essay late without a late penalty, I immediately reply with a sympathetic statement. I follow up with a comment reiterating my course policies and list something they can do to be prepared for the next assignment. In past years, I might simply have responded, “No. Please refer to my course policies.” Today, however, I respond with, “I’m so sorry that you’re having problems with your car. My course policies, however, state that students won’t be allowed to make up quizzes if they’re not in class. Do review Chapter Four so that you’ll be ready for the quiz on Wednesday. I’ll look forward to seeing you then.” Interestingly, the core information is exactly the same — “No.” But how I frame it makes the student feel heard and gives him or her a sense of control over some part of his or her life.

This strategy, combined with my change in attendance rules, has gone a long way toward improving my reputation with students. And the number of students who have complained has dropped over 90 percent in two semesters. I can’t say that my fear of being criticized by students is less, but I do feel more confident that the degree of caring that I have for my students is somehow more visible. In my last stack of student evaluations, one student wrote that she was upset she wasn’t allowed to make up a quiz on a day that she was late for class, but also stated, “The instructor was always willing to help students in her office and was understanding — even if she couldn’t really change the rules. She seems to actually care about her students as people.” Other students commented that my grading was “tough,” but that I was a good instructor. To me it is the perfect balance. I’ll never be one of the fun, popular instructors whom students try to befriend through social networking sites, but I feel more and more convinced that most of the students who pass my course are truly prepared for the next one. That good feeling surpasses the feeling of making my students happy in the short run.

Nothing I do will guarantee that a student of mine won’t march into my dean’s office to complain. But providing clear course materials in a number of formats, defining and quantifying areas that will be graded, spelling out deadlines and penalties in course materials and e-mail communication, packaging a “no” with empathy, and testing students to ensure that they understand integral issues like academic dishonesty and plagiarism will give me confidence if I’m ever brought before a formal grade appeals panel.

Shari Dinkins

Shari Dinkins is an assistant professor at Illinois Central College.

Communication Breakdown

Not long ago, a woman I know got a phone call from a sibling who reported that one of their sisters had died a few hours earlier. It was painful news, if not unexpected given the sister’s long illness; the call was part of a narrative of grief that had been taking shape for a while. But in telling me about it, she also noted an odd and slightly awkward detail. She’d actually learned the news a bit earlier, on Facebook and via Blackberry, where it had been announced in a “status report” from her sister’s daughter.

My friend kept that part of the story to herself when dealing with relatives. As someone with a professional interest in information and communication technology, she’s very open-minded and curious about the way people use the tools now available, and this was no exception. But it was impossible to get around the sense that some breach of tradition was involved. It hadn’t bothered her, but she felt sure that any other member of her family much older than her niece would feel at least somewhat appalled.

An individual’s death is a rip in the social fabric. And communication among those closest to the deceased involves more than transmission of the news. It is a process of patching up what remains of that fabric, a reinforcement of bonds. By some implicit rule, we take it as a given that family will get the news before it is available to a world of strangers. Not that things always happen that way, of course, but the exceptions are felt as such. (A man opens a newspaper and learns that his son was killed a few days earlier.... This is the stuff of melodrama – a situation implying circumstances so complex it would take a whole movie to explain.)

But now the grammar of social relationships is changing in ways that remain difficult to understand. Exactly what is happening when you share the pain of the death of a loved one with the world of your “Facebook friends,” that cloudiest known category of human connection? Anyone who wants to get all curmudgeonly about this should feel free to wail away. Yet doing so does not answer the question. Telling the world that my “status” is grief is not something I would be inclined to do. But it would be morally stupid to question the pain of anyone who finds this appropriate -- or to doubt whether they, too, are trying to reweave some part of the web of everyday life. The boundaries between private and public, between intimate and overt communication, are never absolute or fixed in any case.

Today those boundaries are blurrier. Maybe poets (the “antennae of the race,” as Ezra Pound put it) will be able to make sense of what it means for the human condition. I made the mistake recently of hoping that the social sciences would help. Some of my best friends are social scientists, so no offense intended, but reading a new book from the MIT Press called New Tech, New Ties: How Mobile Communication is Reshaping Social Cohesion was really not all that encouraging.

The author, Rich Ling, is identified on the cover as a senior researcher at the Norwegian telecommunications company Telenor and an adjunct research scientist at the University of Michigan. His methodology primarily involved following people around in public as they talked on their cell phones. I believe this brings the difference between ethnography and eavesdropping to an all-time minimum. Not quite half of the book is devoted to rehearsing the conceptions of social ritual worked out by Émile Durkheim, Erving Goffman, and Randall Collins. The rest is based on field notes, often supplemented by guesses about what the person on the other end of the phone call might have been saying.

Ling’s thesis, in short, is that mobile communication devices strengthen social connections through something akin to an interaction ritual. The argument hovers between insight and truism for quite a while before coming to rest on the obvious. Cell phones and text messaging create “a tightening in the individual’s social network that augurs against those who are marginally known to us and in favor of those who are familiar.” This is inarguable. Most of us do tend to speed-dial people we already know. (Plugging in the numbers of complete strangers might seem like a good idea after several bottles of whiskey, but not otherwise, and especially not the next morning.)

Unfortunately for the elegance of the whole enterprise, the main thrust of Durkheimian notions of social ritual is that they create or consolidate a sense of shared identity among people who do not necessarily have any close connection. This is even true of the sort of small-scale, face-to-face encounters described by Goffman and Collins. Their point is that even seemingly casual exchanges tend to follow established patterns that bind participants together by virtue of the fact that the routines are commonly accepted.

A contrarian (or really, just about anyone not employed as a researcher at a large telecommunications company) might well point out that mobile devices actually tend to dissolve social ritual. Any degree of formality -- let alone any expectation of shared attention by people sharing a common space -- is now precarious. The solemnity of a funeral has no guarantee against the vivacious force of a calypso ringtone.

The author discovers from his extensive observations that some people do try to mitigate the disruption that cell phones bring to “copresent” interactions. They may lower their voices, or practice certain gestures to indicate that they are sorry to be interrupting things. But evidence from my own corner of the global village would suggest this is not quite universal. It may be that Norwegians are more reserved than Americans.

So does it follow that codes of interaction are simply disappearing as reticence itself vanishes? Such is a common enough complaint, but things are not necessarily so straightforward as that.

The ubiquity of mobile communication devices means that the behaviors associated with them are more or less inescapable. As irritating or incomprehensible as those behaviors may be, our options for responding are limited. There is no sanctioned code for interacting with someone bellowing endearments into a Bluetooth at a coffee shop, or typing messages into a Blackberry in the front row while you are reading a paper at a conference. A few years ago, I proposed shooting people who talk on cell phones in libraries, if only with a taser; but in spite of generating considerable enthusiasm, this idea never really caught on.

In the absence of rules for confrontation, then, the rule is that confrontation must be avoided. Durkheim wrote that any given social order obliges us to “submit to rules of action and thought that we have neither made nor wanted and that sometimes are contrary to our inclinations and our most basic instincts.” To put this another way: Might as well get used to it....

Scott McLemee

Lessons From V.I. Lenin and Father Roderick

I have just given the first examination of the semester. The results are poor, and I am upset. I return the tests and begin my standard pep talk. I tell them that their grades are low because they prepared inadequately. They missed too many classes; on most Fridays, more people are absent than in attendance. They do not know how to take notes.

Sometimes students leave their notebooks behind after class, and when I notice, I read them. I am surprised how often the notebook has no name on it. The notes themselves make for depressing reading. An entire week of complicated and well-thought-out lectures has been reduced to a single page of semi-coherent jottings. I can imagine a chronic absentee copying these notes and further reducing them to three or four sentences. Perhaps if this student lends the copied notes to another and this student to still another, my lectures will eventually be reduced to a single word.

As I warm to my task, I continue to harp on the notes. Students come up after class and question my grading with the explanation that what they had written was what they had in their notes. I say that their notes are theirs, not mine, and what they have in them and what I said may be two different things.

Once, I was lecturing about the workings of a capitalist economy according to Karl Marx. Marx tells us that our economic system is based upon the “accumulation of capital,” the process in which employers exploit their workers to make profits, which are then plowed back into the business so that it can expand in the face of stern competition. Utilizing the story from the Old Testament in which Moses receives the stone tablets from God on which are written the commandments the Jews must obey, Marx says that for the capitalists, “Accumulate! Accumulate! That is Moses and the prophets.” This is such a famous phrase and so well sums up the behavior of business firms that I repeated it a dozen times.

As I said it, I wrote it on the blackboard. But because it is physically painful for me to write, I sometimes did not write out the word “Accumulate” and just wrote the letter A. On the final test of the semester, I had a list of simple fill-in questions. One of them, worth two points, said, “_____, _____! That is Moses and the prophets.” All that the student had to do was write the word “Accumulate” two times on the appropriate spaces. As students turned in their exams, I started to mark them. I noticed that a number of students had answered this fill-in by writing the letter A twice. This began to infuriate me, so when I noticed that the student who had just handed in her exam had done this, I called her back to my desk before she left the room. I pointed to the two As and asked, “What is this?” She looked and without missing a beat told me, “That’s what I have in my notes.”

I rant on about preparation. Preparation must be ongoing, I say. I appeal to the athletically inclined. Can you become a good basketball player or wrestler without practicing? Students will sometimes advise me that they are going to miss an upcoming class. They ask, “Will you be covering anything important today?” Yes, today and every day. Or they will ask, “Do we have to read the parts of the textbook assigned but not covered in class?” Yes, I chose the book to complement the lectures, not substitute for them. Why would I come to class if I had nothing to say? Why would I pick a book I thought was unimportant?

By this time the students are getting angry with me. No one cares much for criticism, no matter how true, and especially if the critic’s voice is, perhaps unintentionally, tinged with sarcasm. So, to defuse their hostility and to make my points less abstractly, I tell them two stories, one about Lenin and one about my old teacher, Father Roderick.

Lenin is a favorite of mine, a man of iron will and determination, who once said that he could not listen to Beethoven’s Appassionata Sonata because it made you want to hug people when what you needed to do was crack them over the head. Nowadays, I have to identify the great Russian revolutionary. Even before the collapse of the Soviet Union, I had a student write on an examination in a comparative economic systems class that the Bolshevik revolution took place in 1967! In any case, I told my students, Lenin had a facility for languages, which he studied during his years in exile and in prison. An admirer asked him how he approached learning a language. Lenin replied that it was simple. First, you learned all of the nouns. Then you learned all of the verbs. Finally, you learned all of the rules of grammar. Just learn everything, and you’ll have it. No tricks. No shortcuts. Just hard work.

Father Roderick gets a longer story. He was my first college history teacher. Few students liked him. Not only was he an impossibly hard grader, but he was also extraordinarily boring. College folklore had it that he had fallen asleep during one of his own lectures. I can still see him pointing with a yardstick at a map of Europe and droning out in his monotone, “By this time, Spain was a third-rate power.” As I am talking to my class, I begin to daydream about those classes from so long ago. There was something about Father Roderick that I liked. Maybe it was because he seemed oblivious to his inadequacies as a teacher. He never seemed to notice our numbed looks, and he never reacted to the audible groans that emanated from us at least once in every class. Perhaps it was because, at a faculty-student “tea” one afternoon, he told me that Eisenhower had been a lousy president. Father Rod was a liberal, and that was all right with me. As was the fact that he was a sports fan. He had been the school’s athletic director, though not a good one, having forgotten to pay the baseball team’s tournament fee the one year the team had been invited to play.

I explain that Father Rod’s tests were devilishly difficult. They consisted of three parts. Part One was a long matching exercise in which all of the terms were so obscure that it was not unusual for some students to recognize not a single one of them. Some of the items were drawn from textbook footnotes and picture captions. Part Two consisted of the “Threes”; we would be asked to give three reasons for this, or to name three of these, and so forth. Further, Rod was enamored with threes, as I suppose all priests are. Part Three required us to write a 500-word essay in answer to a question breathtaking in its generality. One went something like this: “Discuss the political, social, cultural, and economic aspects of the decline of the Roman Empire.”

We had 50 minutes to complete the examination. It was said that Father Rod had not given an A in a long time. And no wonder. You needed 90 percent for an A, and given that you were bound to lose at least 7 to 10 points on the essay, you had no chance for one. Plus, he never rounded a grade up. If you scored 89.9 percent, you got a B. In my first class, I missed an A by a fraction of a point. I became determined to get an A the following term. In fact, I achieved the unprecedented distinction of earning three As in his classes, unprecedented, no doubt, because I am certain that no one ever took four of his classes.

I tell my class about how I succeeded. I explain that I had decided upon a Leninist strategy. Before each test, I rewrote my lecture notes in complete sentences and with insights gathered from the readings. The act of writing the notes helped me understand the material much better. Next, I took the notes and the textbook and made a list of every name, date, and important term in them, including those in the footnotes and picture captions. I then wrote a definition for each of these, a time-consuming task since I might have several hundred entries. But again the act of constructing the definitions greatly aided the learning. I combed through the notes and book one more time, recording every possible “three” I could find, in preparation for Father’s obsession with the Trinity. And last, I made a short list of possible essay questions and wrote out at least an outline answer for each one. I was ready.

My strategy worked. I got an A, something like 97 percent. As news of this spread, classmates began to ask me for help in boosting their grades. Before the next exam and for the next two semesters, students would gather in a dormitory study room and take notes while I lectured from my preparatory materials. Everyone paid attention, because I now knew what would be on Father Roderick’s tests, and my lectures would probably be the difference between a good and a bad grade for my listeners. No one dared interfere with my presentation lest he be shouted down by the others: “Let Mike talk. He has the key to the course.”

By the end of the story, at least the students are smiling. Perhaps a few leave the room with a new resolve. It always seems that the grades improve on the next examination. But most likely it is I who have learned the most. I have put what I learned from Lenin and Father Roderick to work in my teaching. I enter class well prepared. I have come to know the material so fluently that I no longer need notes. I can talk for any length of time about a wide variety of subjects. I can teach in large lecture halls to 200 students or in small seminars. I can do most of the classroom work myself or involve the students in projects of self-learning and discussion. I have had classes in my office, in my living room, in dormitory rooms, and outdoors. I can handle any question, and I can improvise on something I’ve read in the newspaper or seen on television or that simply pops into my mind while I am talking. I have invented hundreds of examples, and I have a reservoir of dozens of stories and anecdotes to clarify and simplify the subject matter. To give myself credibility I have done work in my teaching areas. I have done economic consulting for attorneys; I have been a labor arbitrator; I have helped to organize unions; I have been a negotiator; and I have written widely on topics related to what I teach.

For years, I got the biggest kick out of teaching. It seemed an ideal job, one in which I had about as much control as this economic system can tolerate. I enjoyed putting the lectures together and dramatizing them every day in front of the classes. I felt that I was performing a useful and necessary social task, educating young people about the reality of our society and hopefully giving them a more critical outlook than they had ever had. They could take what I taught them and go out and do good deeds and make the world a better place.

Over the years, however, my love affair with teaching faded and finally ended. I do not give my post-test pep talk very often, and the skilled work of preparing the lectures seems wasted effort. The theatricality of the actual teaching has become rote, something I do because I need a paycheck. I still do it better than most, but then, in my experience, most professors are pretty inept. I have tried to figure out why I have lost interest in my job. The students are a big part of it. Their past “mis-education” and total absorption in consumer culture have made most of them incapable of critical thinking. They want instant gratification and cannot be bothered with the work of learning. College, like high school, is just another hoop they have to jump through to get a job that will pay enough money to keep them in cars, houses, VCRs, cell phones, and all the trappings of middle-class living.

Most of my students are products of the suburban life and do not know enough or care to learn enough to be interesting to me. I still get some kids who yearn for knowledge and some poor adults who know now that it is important to educate oneself. But these few stand out like sore thumbs, and the other students look at them when they ask and answer questions as if they were creatures from outer space. I do what I can for them, but this does not give me the satisfaction it once did.

Whenever I get despondent about work, my wife tells me that I have had an impact on hundreds of students. If I am particularly irritable, I say, “I doubt it.” Then I’ll get an e-mail from someone thanking me for classes I taught long ago. I published a book in 1994 in which I acknowledged Father Roderick. I knew he would never see it, so one afternoon we drove to my old college to visit him. I didn’t know if he was still alive, but someone in the library said that he was retired and living in the monastery. We found him in his spare monk’s room. He didn’t remember me, but he was happy that I had remembered him. He said that he had been happiest when he was out of the college and serving as a parish priest. He missed driving a car. Not long after, he died. Maybe, in inadvertent honor of his memory, a few of my students have learned the method I learned when he was my teacher.

Michael D. Yates

Michael D. Yates is a retired professor of economics and labor relations at the University of Pittsburgh at Johnstown. This essay is an excerpt from his book, In and Out of the Working Class (Arbeiter Ring Publishing).

Return of the Mark of Zotero

When psychiatrists bring out the next edition of their standard reference, the Diagnostic and Statistical Manual, perhaps it should list a new disorder, Social Networking Fatigue Syndrome, since this is bound to reach epidemic proportions by spring 2010. By then, an enterprising digital entrepreneur is sure to have launched something that will be known, at first, as “the next Twitter/Facebook/YouTube.” (If that very turn of phrase fills you with paralyzing ennui, there is a pretty good chance you have SNFS.)

Man is, of course, the tool-making animal -- but can’t we maybe give it a rest for a while? Evidently not. At this point we need digital tools to manage all the digital tools we have on hand. One day all of our devices will be able to communicate among themselves (“friending” each other while we aren’t looking), which I’m pretty sure leads to an apocalyptic scenario in which human beings end up living in caves.

And yet, damn it, some of the tools are useful. A couple of years ago, this column pointed out an application that seemed a genuinely useful and non-time-wasting addition to the intellectual workbench. This was Zotero, a plug-in for the Firefox browser that allows you to gather, organize, and annotate material while doing research online.

With Zotero, you can build up a collection of digital documents, cataloging and sorting it as you go. You can gloss the material so harvested, attaching notes to individual items. Zotero is particularly useful for gathering bibliographical data, and allows you to export it in a wide range of standard scholarly citation formats.
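To give a rough sense of what exporting the same record in different citation formats involves, here is a toy sketch in Python — emphatically not Zotero’s own code or API, just an illustration, using approximate APA- and MLA-style templates and a record drawn from an article cited earlier in these pages:

# Toy illustration (not Zotero's code): one bibliographic record
# rendered with two rough citation-style templates.
record = {
    "author": "Pulich, Marcia Ann",
    "year": 1983,
    "title": "Student Grade Appeals Can Be Reduced",
    "journal": "Improving College and University Teaching",
}

def apa_like(r):
    return f'{r["author"]} ({r["year"]}). {r["title"]}. {r["journal"]}.'

def mla_like(r):
    return f'{r["author"]}. "{r["title"]}." {r["journal"]}, {r["year"]}.'

print(apa_like(record))
print(mla_like(record))

The convenience of a tool like Zotero is that you capture the structured record once and let the software handle the formatting on export.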

Produced by the Center for History and New Media at George Mason University, Zotero was (and remains) free. When I wrote about it in ’07, enthusiasts were looking forward to Zotero 2.0 -- and not patiently.

Various upgrades became available, but the substantially reworked Zotero was only released six weeks ago, in mid-May. At the time, as luck would have it, I was in a clinic being treated for exposure to more than 400 blog feeds per day. The twitchiness having now abated, I’ve been briefed on the latest model of Zotero by an “information-research specialist,” which is what librarians call themselves these days.

The distinctive thing about Zotero 2.0, now in its beta version, is that it will allow you to store your collection (i.e., digital document archive, plus notes, plus bibliographical data) on a server, rather than on your hard drive. This has at least two important consequences.

The first is that you can add to your Zotero files – or retrieve them – from any computer with web access. The old version stored the data on whatever machine you happened to be using at the time. I have a laptop somewhere in my study, for example, that contains records gathered a year ago, but not available to me at the moment because I am not exactly sure where that laptop is. Once I find it, however, it will be possible to ship this data off into “the cloud.” That means I can synchronize my old laptop, our household desktop computer, and the netbook I do most of my writing on now, so that the same Zotero files are always available on all of them. This was possible with the earlier version, but you had to make a point of transferring the files, which evidently I never got around to doing.

The other major development is that Zotero 2.0 allows users to create groups that can share data. Members of a class or a research group are able to transfer files into a common pool. (So far, it is possible to do this with bibliographical references but not with documents, though the Zotero people are working on finding a way to store the latter.)

You also have the option of creating a sort of haute Facebook presence. Dan Cohen, the director of the Center, explains: “Zotero users get a personal page with a short biography and the ability to list their discipline and interests, create an online CV (simple to export to other sites), and grant access to their libraries.”

Thanks to such profiles, it should be easier to find other researchers who share your particular interests, and so engage in the cooperative exchange of references and ideas -- at least, assuming your notion of the life of the mind is not that of a zero-sum game, or indeed of bellum omnium contra omnes. It will be interesting to see how that shakes out, discipline by discipline, sub-field by sub-field.

Efforts to forecast the long-term effect of scholarly social networking (not only on research but on the institutions and mechanisms of academic publishing) tend to produce either science fiction or hyperventilation. I’ll try to stop short of either and just move along quietly to another, slightly tangential development that came to my attention in the wake of Zotero 2.0.

It is a digital tool called Webnotes. At first blush it seems a little like Zotero except that it isn’t free. Rather, you can get a bare-bones version for no charge, but the fully equipped model runs at least $10 per month. The company gave me trial access to Webnotes, which was then reviewed with the expert assistance of my in-house information-research specialist. (Reader, I married her.)

As with Zotero, Webnotes enables you to download digital documents and to create files to organize them. But it lacks Zotero’s really impressive capacity to recognize, absorb, and flexibly rearrange bibliographical references.

It makes up for this, to some degree, by allowing you to highlight and make notes on a wider range of digital documents than you can through Zotero alone – in particular, PDFs – and to do so with great ease. The user is able to attach the equivalent of a post-it note to a given word or passage in a document. These annotations are then listed in the index for your collection, with a link that allows you to go directly to the passage and note in question. The design also makes it easy to e-mail the glossed document.

All of this would be particularly helpful when making (or requesting) revisions in a manuscript, for example. Its value as a tool for researchers is rather more limited, though it might be helpful to people who deal with relatively small numbers of documents that come without metadata, or who don’t need to prepare bibliographies.

Another noteworthy aspect of Webnotes is that, unlike Zotero, you can use it in browsers other than Firefox. I tried it out in Internet Explorer, which may have been the first time I've touched IE in years. (The folks at Zotero have indicated that they are working on ways to make their product work with other browsers -- but the April 1st dateline of this announcement bears noting.)

The ability to highlight and annotate passages in PDFs is very welcome. It is possible to do this without Webnotes provided you have the “pro” version of Adobe Acrobat, but I don’t, and found this feature appealing. As an acid test, I used Webnotes with a document that had been scanned from a fifty-year-old mimeographed text and found that it worked just fine.

Obviously this is a feature that would be valuable to see incorporated into future incarnations of Zotero, but for now I’m glad to have the use of Webnotes as well.

Scott McLemee

Occupational Hazard

“The students said they don’t do paragraphs anymore. They insisted that I let them do PowerPoint,” the professor regaled his colleagues. The room of instructors stopped chewing lunch to chuckle.

“So the students can’t write,” he went on. “I guess we have to live with that, and I told them they could use bulleted slides. I assigned a chapter with five short answer questions. And you know what?” He threw up his manicured hands. “They can’t read either.”

The room rumbled with laughter. My gut grumbled as I waited for enough chicken salad to disappear before I could start my talk. When faculty members discuss teaching, stories about ill-prepared, unmotivated, and ungrateful students bubble up like swamp gas. Always this talk ends in gallows humor. Everyone laughs and walks off with the same unspoken phrase – “How are we supposed to teach these creatures?”

Who hasn’t heard that message? When my patience thinned as failures multiplied, I shared my student stories, enjoyed the laughs, and walked away feeling like I’d been playing in the dirt. These moments happen. Students resist learning. They are sometimes cunning, incompetent, threatening and privileged. The worst scenarios get passed around like bawdy postcards.

The practice corrodes our craft. You can’t be a competent and successful teacher until you throw it out. Most good teachers can tell you when they gave it up. But truthfully it takes time to kick the cynicism monkey.

On an early morning walk many years ago with my neighbor, a journeyman carpenter, I expounded funny instructor bile for maybe a mile. My companion began shaking his head even as he laughed at my riffs on preposterous excuses and dumbfounding laziness. Finally, he said, “You know I could never be a professor. I don’t know how you do it.”

I waved off the acknowledgment of our heroic task. “You get tough and you learn to laugh.” His head shook. “No, no. I’ve listened to your jokes and complaints about students for a long time. I feel sorry for you.”

I slowed the pace. “Every day,” he went on, “I build houses. The studs are never quite straight; the nails are imperfect and the plans mistaken. Contractors screw up schedules, suppliers deliver late, clients change their plans -- I could complain about these blunders every day but I’d never build anything.”

I flushed as I saw myself through his eyes – a crabby professor, always with a funny student story flavored with blame. The jokes hid a deeper problem. I saw students in terms of their deficits, not mine. They couldn’t construct or evaluate arguments; fathom an author’s conceptual framework; read for connections and patterns; write engaging and vibrant prose; and most of all bring knowledge of culture or history to their learning. They were impossible.

Seen that way, there was no way to teach them. I thought about my neighbor’s example. His materials and conditions weren’t perfect but he continued to learn to be a better carpenter.

Blame-the-student stories stopped on our walks that day. My students weren’t perfect but they were all the materials I had. I couldn’t do my job without them. In my head there was a new rule – the students are the stuff with which you work. You can’t blame them. If they don’t learn, you haven’t taught well enough. To follow that rule was hard.

My colleagues thought I had become an “idealist” – a polite way of saying “patsy.” They knew that so many outside things caused student failure – high schools, the media, computer games, and all the other flotsam of ignorance – that it was beyond their control. By taking responsibility for all those failures, wouldn’t I doom myself to self-flagellation?

At first, it wasn’t so bad. I organized past data on grade distributions by topics, assignments and schedules. Performance always went down in the seventh and eighth weeks of the semester no matter what I did. Investigating assignments, I found learning failures caused by my mistaken assumptions.

Unease developed as I got deeper into the details. Students dropped more often; hostility grew in the classroom. It came to a head one day when Tony bristled into my office. He was just the kind of student – always curious and fermenting with ideas and questions – that delighted me.

Tony said, “I can see what you are trying to do. We need skills and practice.” Then, staring at the floor to hide the embers in his eyes, he added, “But does it have to be so awful? Can’t we ever feel good? Must we always hear about mistakes?”

My ready answers – learning is hard, the early stages confuse, and you have to practice even when you hate it – stuck in my throat. A fine student was miserable and something was wrong. As we argued, I began to see my mistake.

Learning for me was about disciplined practice and correcting mistakes. But Tony saw learning as curiosity, questions, and triumphant answers. My version had no emotions to mush things up. Tony thought it was drab and joyless, based on fear and shame. Tony’s version could keep him working for hours. My version made it hard to even get started.

Mistakes are mistakes, I told myself, but that didn’t alleviate the gloom of failure. I remembered my infant son determined to walk and falling down, getting up and falling down; sometimes crying and sometimes smiling as he swayed upright. What could drive that relentless resolve to learn but desire? All learning, I began to see, was ignited by emotions. Without them, classrooms were barren.

That insight forced me to see students not as deficits, but as knowing people with potentials that I could not imagine. As a result, my courses did not get easier for me or the students; but they ignited with energy and occasional bursts of joy.

The great teacher of basketball, John Wooden, once said you aren’t a loser until you blame others. I thought that was a moral judgment. It wasn’t. Wooden meant that if you blame others you can’t learn. And I would add if you can’t learn you cannot teach.

Larry D. Spence

Larry D. Spence is a learning innovation consultant at the Smeal College of Business of Pennsylvania State University.

Against Anonymity

The history of English literature is replete with authors who hid their names from their audience. Jonathan Swift, for example, published all of his satirical writings without revealing his identity to his audience. In the case of his late masterwork, Gulliver’s Travels, a work — as he wrote in a famous letter to Alexander Pope on September 29, 1725 — designed “to vex the world rather than divert it,” Swift went so far as to have an intermediary deliver a sample part of the manuscript to his publisher. Should the 21st-century academy follow in this tradition? Should contemporary academics follow Swift’s lead and publish all of their critical writings anonymously? Should they even put their name on critical assessments of their colleagues? What role, if any, should anonymity play in the contemporary academy?

While Swift may be a more elaborate case of an author wishing to deceive the public — and his publisher — as to his true identity, he was far from the only famous English writer to mask his or her identity. Spenser, Donne, Marvell, Defoe, Fanny Burney, Jane Austen, Byron, Thackeray, Lewis Carroll, Tennyson, George Eliot, Sylvia Plath, and Doris Lessing are some of the writers who published works that did not reveal their true identity to their audience.

Given that these authors and their works are some of the cornerstones of the English literary canon, one might wonder why these authors did not put their names to some of their greatest works. The simplest — and probably best — answer is fear: fear of persecution or imprisonment. This link between fear and anonymity is now carried on in the academy. Though fear of persecution or imprisonment might not persuade academics today to hide their identity from their audience, fear of retaliation surely does. Some academics fear that negative — or even positive — comments about their colleagues will lead to retaliation from them.

This, in turn, leads many contemporary academics to voice their comments from behind the veil of anonymity. Or, worse yet, convinces them to alter their comments because they are not anonymous. The practice of anonymous critical assessment is relatively common and widely accepted in the academy. So too is the understanding that some non-anonymous comments by academics may not be reflective of their true opinions.

Here is an example of how the logic of academic fear works with respect to non-anonymous comments: Professor Jones is asked to write a reader’s report on the worthiness for publication of Professor Hill’s manuscript. Jones reads Hill’s manuscript and believes that it is a “weak” contribution to the field that should not be published. However, the potential publisher of Hill’s manuscript tells Jones that his name and his comments will be passed along to Hill. Jones fears that if he writes a negative report of Hill’s work, Hill will in turn retaliate against Jones by writing a negative report of Jones’ work should the opportunity arise. Therefore, Jones decides to write a “less negative” or “more positive” report of Hill’s manuscript.

A variation of this example involves the use of anonymous comments, rather than non-anonymous ones: To avoid a “dishonest” report by Jones, the publisher assures him that his comments will be kept anonymous. Therefore, Jones, knowing that Hill will not discover his identity, decides that he will provide a more honest assessment of the manuscript.

There are of course many variations of these scenarios, including a totally anonymous interchange: an anonymous manuscript stripped of all indicators of the identity of its author is read by a reviewer whose identity is in turn not revealed to the author. Moreover, the logic of academic fear does not end with peer review of manuscripts. It is not uncommon, for example, for the identity of students to be hidden in their assessments of their professors. It is also not uncommon for the identity of faculty members to be hidden from the administrators they assess.

Should the identity of individuals within the academy be hidden so that they may speak freely (and honestly) about each other? Or is the practice of anonymity within the academy contrary to the aim of the academy, namely, the free and honest exchange of ideas and opinions in pursuit of knowledge? Is increasing the frequency and range of anonymous assessment in the academy in its best interest? Or does encouraging anonymous assessment within the academy promote a less collegial and more thin-skinned academy? These and similar questions raise significant issues that have not received much attention.

The prevailing wisdom seems to be that anonymity is an accepted and acceptable practice with positive implications for the academy. Many seem to believe that, as in the history of English literature, where anonymity provided writers with the opportunity to express their ideas and opinions without fear of retaliation, the history of the contemporary academy will show that anonymity has been a positive, liberating practice. However, such a history of the academy would be nothing more than a fairy tale.

I believe that anonymity is the enemy of dialogue, and, as such, should have a fairly limited role in the academy.

Dialogue in Academe

Dialogue in academe involves the free exchange of ideas and opinions. It requires that the interlocutors respond to each other and know to whom they are speaking. It is difficult, if not impossible, to have a discussion or debate when one of the discussants or debaters does not participate. Furthermore, it is not possible to have a discussion with no one — some minimal level of identity is required for those who participate in dialogue.

Differing ideas and differences of opinion make the academy a vibrant, living, organic entity. Within the academy, individuals are expected to defend their ideas and opinions to the best of their ability. The question-and-answer rhythm of dialogue, tempered by rational reflection, assures that no idea or opinion passes through the academy without proper critical assessment. Knowledge of the identity of the interlocutors allows for proper and relevant questions to be asked — it also allows questioners and answerers to be held accountable.

The aim of dialogue in the academy is not merely to state or assert one's opinion. Simply asserting opinions at one another has more in common with two dogs barking at each other than with two people engaging in dialogue. The aim of dialogue is rather to defend or argue for one’s opinion against any and all objections. The ideal here is the type of dialogues fictionalized in the early writings of Plato. An opinion that cannot be defended against objections has no place in the contemporary academy, just as it had no place in the ancient Greek academy.

Once a view or position is stated or expressed, the expectation in the academy is that through dialogue between the source of the opinion and other members of the academy, particularly students and faculty, stronger ideas will continue to be discussed and debated, and weaker ideas will gradually fade from conversation — and be dismissed. Without the free exchange of ideas and opinions in pursuit of knowledge, the academy perishes — or at least loses part of its distinctive character.

The free exchange of ideas and opinions in the academy assumes that the aim of this free exchange is the dissemination and production of knowledge — not opinion. Discussion of all opinions and ideas in an environment that is conducive to separating the defensible opinions from the indefensible ones is essential. This means that those who share their opinions with others should not fear negative consequences or reprisal for doing so. This does not mean, however, that individuals who promote bad ideas or opinions should not be identified with them. On the contrary, if one promotes an idea or opinion which is eventually dismissed through academic dialogue, this (bad) opinion or idea becomes part of one’s academic identity.

While the continued identification of an individual with his or her bad idea or opinion within the academy may seem like a harsh statement, it is ultimately a fair one. Just as individuals who promote sound ideas and opinions continue to be identified with them, so too should individuals who promote unsound ideas and opinions be identified with them. Take, for example, the philosopher/scientist Albert Einstein.

Early in Einstein’s career, his notions about relativity were met with disapproval by his colleagues and members of the academic community. However, this negative reaction from his colleagues only encouraged him to find stronger grounds upon which to convince them that his ideas about relativity were valid and sound. Ultimately, he prevailed in the critical dialogue about his ideas and was awarded not only the Nobel Prize but also the respect and ear of the academic world. If Einstein had not prevailed, his identity as a person of intelligence and compassion would not have been secured — and his subsequent ideas and opinions would have had far less visibility. Later in his life, the world listened intently to Einstein’s comments about things other than physics because they were his ideas. Epistemological affiliations which result from critical dialogue are powerful.

Just as one tends to benefit positively from the successful defense of an idea, so too should one benefit negatively from the unsuccessful defense of an idea. Trust and power in the academy are earned, not randomly bestowed upon individuals. This does not mean that one examines the ideas of Einstein any less vigorously than those of someone who has never successfully defended an idea — let alone a revolutionary one. Rather, it means that those whom we value as voices of wisdom in the academy have earned that status through past argumentative success, not through dialogical failure. Continued failure to successfully defend one’s opinions and ideas should not be rewarded by the academy; rather, it should be met with increasing skepticism as to the academic abilities of the individual — and possibly even dismissal of the individual from the academy, particularly if there is no record of academic (or dialogical) success.

Anonymity Is Anti-Dialogue

The academy has an obligation to protect itself from anything that enervates critical dialogue and to encourage anything that energizes it. While repeated failure to defend one’s ideas and opinions is not admirable, it is also not reprehensible or unexpected. Publication, tenure and promotion, merit pay, and respect are some of the major ways in which the academy rewards dialogical success (and the denial of these are some of the ways it “punishes” failure). How then does anonymity figure into this picture? In particular, how does it contribute to the promotion of dialogue in the academy?

A common response is that anonymity is of benefit to the university because it protects individuals against reprisals from other members of the academy. In particular, anonymity is utilized to assure the free exchange of ideas and opinions in the academy. If one’s identity is not revealed in the stating of an opinion, then one cannot be the recipient of any negative (or positive) consequences from said opinion. In this climate, anonymity functions like the mythical Ring of Gyges: academics can say and do whatever they wish without fear of direct negative consequences for their actions. But does the normal operation of the academy require a Ring of Gyges? I don’t think so. In fact, I would go one step further and maintain that anonymity is also ultimately not in the best interest of the academy.

Anonymity affords one the opportunity to speak without having to be accountable for the consequences of one’s speech. Anonymous propositions do not draw their power from the reputation of the speaker. Rather, they draw their power from the fact that they could be from anyone — from their disaffiliation. They speak for and from everyone and no one. Once the identity of the source of an anonymous proposition is revealed, it becomes part of the world of dialogue, affiliation, and situated discourse; if the identity of its source is never revealed, however, it has a decidedly halting effect on discourse.

Anonymous propositions are fundamentally monological, not dialogical. Whereas a proposition that is attributed to or affiliated with someone always contains the possibility of being questioned or interrogated, anonymous statements cannot be questioned because there is no one to ask. Because anonymous propositions are not subject to the process of question and answer, they cannot speak back or respond to requests to explain themselves (because their source is unknown). These epistemological limitations make them a weak source of knowledge. Furthermore, anonymous propositions discourage human interaction and inquiry.

Students commonly evaluate their professor’s performance in the classroom anonymously. The alleged reason is that students will be more honest in their evaluations if they know that the professor will never be able to track the comment back to them. The fear is that if a student says something negative about their professor, then the professor will retaliate by either giving the student a lower grade in the course under evaluation or the next course that the student takes with the professor. Though this may be a valid point — some professors are vindictive — the solution to this problem is not anonymity, but rather professional reprimand.

Furthermore, comments from students under the guise of anonymity are generally not very instructive. “Professor Jones is the best teacher ever,” writes the student. Jones is then credited with superior teaching performance on the basis of this comment. But consider some possible reasons that the student might have written this comment: 1) The student is a “C” student, but because Jones is an easy grader, he or she received an “A” in his course; 2) The professor conducted class by passing out cookies and Kool-Aid each day, and told stories about his college days; 3) Professor Jones canceled the final exam; 4) Professor Jones challenged his students to master the material, and an otherwise “A” student earned a “C” (and still gave Jones kudos); or 5) Professor Jones is the best teacher ever (outshining both Einstein and Socrates). These reasons may be multiplied many times but lead to the same conclusion: an anonymous student comment is not very informative. Furthermore, it begs for the question and answer process that dialogue provides to get at the reality of Professor Jones’ classroom performance. In other words, one needs to know a great deal more about the identity of the student in order to determine the validity and soundness of the comment. Dialogue would provide this; anonymity denies it.

The case of anonymous peer evaluation is not much better in terms of dialogic effect. In her article, “Seven Faces of Anonymity in Academe,” Lynn Bloom speaks to this issue directly: “I never sign my reviews. This is not a dialogue. I do not want to know your rebuttal, though I am willing to read thoroughgoing, thoughtful revision. I do not want to know the impact of my review on your life, professional or personal, or on your feelings. I am providing a Service to the Profession, even if it means keeping you out.”

Bloom is of course right. Anonymity in manuscript review allows reviewers to disengage from dialogue. It of necessity keeps the author of the manuscript outside of the dialogic process. She is also right in that reviews impact lives: they are not singular events for the manuscript’s author even though they may be for the anonymous manuscript reviewer.

To consider Bloom’s comments fairly, one must distinguish between anonymous reviewers reviewing anonymous manuscripts, and anonymous reviewers reviewing manuscripts whose author’s identity is revealed. Because the former short-circuits the potential for dialogue by including no identity markers at all in the review process, one can maintain that it is less problematic than a process wherein the reviewer knows the identity of the author but the author does not know the identity of the reviewer. The former may be called the “totally anonymous review process,” and the latter the “partially anonymous review process.”

With regard to the partially anonymous review process, it makes little difference whether it is the reviewer’s identity or the manuscript author’s that is concealed. The same problem holds for each situation — a problem that parallels the case of a student who knows the identity of the professor while the professor does not know the identity of the student (or vice versa). Just as student evaluation under this condition is problematic, so too is the partially anonymous review process.

When the identity of only one of the two parties in the review process is revealed, the possibility of dialogue is raised but never realized. The partially anonymous review process is by definition failed dialogue. It is akin to an unidentified person yelling a comment from behind a curtain at a person who is standing before a crowd. The comment is “Professor Jones is a poor scholar!” Rightfully so, Jones and the audience will wonder about the identity of the person making the comment and want to ask more in-depth questions of him or her. It makes a difference whether the person yelling the comment is a respected scholar in Jones’ field of expertise or a young man who has not even graduated from high school. The partially anonymous review process leaves lingering questions of identity that the totally anonymous review process avoids. While one can ask why an anonymous reviewer failed to see the power of an anonymous manuscript, the doubt raised by an anonymous reviewer of a non-anonymous manuscript prompts an entirely different set of questions: Does the reviewer have a personal or professional problem with the author? And so on.

All of the scenarios of anonymous assessment mentioned above share the same effect, namely that of short-circuiting dialogue. If one believes, as I do, that dialogue is an essential feature of academia, then one must conclude that anonymous assessment is antithetical to the very idea of the academy. While one might be able to live with a less-than-ideal academy that extensively utilizes anonymity but is otherwise collegial, one could not live in an academy that is uncollegial in part as an effect of anonymity. Let’s now briefly turn to the effect of anonymity on collegiality before drawing a conclusion about the role anonymity should play in the academy.

Anonymity, Collegiality, and Critical Judgment

The common rationale for academic anonymity is quite clear: if one were required to accompany one's assessment with one’s true identity, one would not speak the truth.

While I think a case may be made that total anonymity is fairer than partial anonymity, the fact remains that anonymity by definition halts dialogue. Imagine, if you will, an academy where all were required to reveal their identity when they voiced their opinions or ideas. Further imagine that in this ideal academy everyone was expected to think about and respond to the ideas and opinions of others: where critical dialogue was the norm, not the exception. Would this not be a desirable situation?

Part of the problem with academia today is a fear and avoidance of critical judgment. Some even believe that criticizing the views of another is a fundamentally uncollegial (if not also unethical) act, and that uncritically supporting a colleague is a collegial act. Academia has created a culture and ethics of uncritical consent and has hidden it behind the cloak of collegiality. Uncritical kindness in response to any and all of our colleagues’ ideas and opinions is not only fundamentally uncollegial; it is also an abdication of academic responsibility.

The only reason that anonymity is so prevalent in academia today is that as academics we have forgotten that critically thinking about the opinions and ideas of others is what we do — or even what we do best. It is the task of the academic to be critical. Anonymity breaks down the critical dialogue that brings us together into a unified profession in search of answers to our questions—and questions to our answers. It atomizes the profession in a way that sets one individual and their ideas against the other in personal (not epistemological) competition for superiority. According to the popular logic of anonymity, academia is “nasty, brutish, and hard — a war of all against all” for position and power.

Collegiality requires dialogue. If we set up practices in the academy that prohibit or prevent dialogue, then we are fundamentally enabling uncollegial behavior. Partial anonymity, at least, is always already uncollegial behavior; and total anonymity, while less pernicious than partial anonymity, still discourages dialogue, and therefore is also fundamentally uncollegial.

Part of the reason that many people have difficulty sharing critical judgment with others in the academy is that we have created a culture where critical judgments must be protected from fear of retribution. If one were forced to reveal his or her identity in all critical judgment situations, individuals would probably take more time to develop their arguments, knowing that they could themselves become the object of critical attention. While it goes without saying that it is wrong to be the subject of retaliation for one’s ideas and opinions, it also goes without saying that the culture of retaliation is encouraged (rather than discouraged) by the widespread use of anonymity.

It is acceptable within the public sphere to anonymously share an opinion and not have to answer questions or concerns raised about it. However, within the academic sphere, this is simply not desired behavior. If one, for example, posts on his or her blog a statement concerning one’s belief in gremlins, one is not obligated to respond to persons who disagree with this statement. However, in the academy, students, faculty, and administration are expected to answer questions about their opinions. When sharing an idea or opinion with one’s colleagues, one should expect — and hope — that they respond to it. This is what being a colleague means.

Conclusion: Against Anonymity

The widespread use of anonymity in academia should be a cause for concern, not celebration. If dialogue is the warp and woof of the academy, and collegiality demands dialogue, then the use of anonymity in the academy should be discouraged. Though individual cases might be made where anonymity is justified in academia, these are far from the norm. The norm in academia should be against the use of anonymity, rather than for it. Increasing the role and scope of anonymity in the academy serves to make it less collegial and more fractious by discouraging dialogue as the major source of critical assessment. The academy must work toward the elimination of all forms of anonymous critical assessment — lest it lose the characteristic which most distinguishes it from other institutions: the free exchange of ideas and opinions in pursuit of knowledge.

Jeffrey R. Di Leo

Jeffrey R. Di Leo is dean of the School of Arts and Sciences, and associate professor of English and philosophy at the University of Houston-Victoria. He is also founder of the journal symploke, where a longer version of this essay will be part of an issue devoted to the topic of anonymity.

Too Much Information

Too much information.

Those words cross my mind whenever students feel compelled to explain in too-intimate detail reasons for absences, for late or substandard work. I have certainly not heard it all, even after 30 years of college teaching, but what I have heard often makes me queasy. Just how much should I know about unfaithful boyfriends, sadistic football coaches, insane roommates? And should I really be entrusted with accounts of parental divorce, unexpected pregnancy, arrest, drug or alcohol abuse?

The weasel’s answer is “no and yes.” Freshmen often share a great deal, either on the page or in office conversations, and those of us who teach them, especially in composition classes, probably need annual orientations and debriefings from campus counselors. Some even argue against the common writing-course practice of requiring students to keep journals and to share personal narratives.

Patrick Allitt, an Emory University professor, goes further by declaring in I’m the Teacher, You’re the Student that any utterance of the word “boyfriend” or “girlfriend” during office hours provokes the following response from him: “I can’t talk about your personal relationships with you, and I can’t permit you to mention them. If you are too upset to discuss history you must leave now and come back when you are able to do so.”

Yes, I try to wrap myself in a pedagogical raincoat as protection against students’ often messy lives. I issue explanations about the classroom’s public nature and warnings against spilling dark secrets. Still they confide, and in the face of muted cries for help, seldom provoked by anything in the course requirements, my own professorial reticence must yield. Not wanting to seem like a character from a Monty Python skit, I don’t find much help in Professor Allitt’s approach. Hearing students out and empathizing, though, is as far as I like to go; I view my job as sending -- even escorting -- them to the right places and people, to the trained professionals.

What happens, though, when the tables are turned? How much should a faculty member -- in fact, this faculty member -- share with students about his own personal crises? How much do they need to know? When does forthrightness slide into exhibitionism?

For me these questions became more than academic when I was diagnosed with prostate cancer.

I wasn’t shocked by the November pathology report, having undergone biopsies the year before, and plunging into oceans of information on prostate cancer pushed fear mostly beneath the surface. My first instinct was to hold off on surgery until the academic year was over. I could then use the summer for recovery. I even relished the thought of my contribution to the interminable How-I-Spent-My-Summer-Vacation agenda item at our opening department “retreat.” Alas, that sardonic delight carried little weight with my urologist and none with my wife.

Just three days into the term I met with a University of Michigan surgeon -- who looked as young as most of my students -- and decided upon a robotic prostatectomy. My surgeon seemed to have grown up with touch pads and joysticks, the perfect physician, I thought, to operate on me from a console across the room. Everything was soon set, at least on the medical side.

The million-dollar da Vinci robot, under his expert guidance, would remove my cancerous prostate on the Friday before spring break.

All along I’d hoped to start the term fresh, with as clean a slate as possible. Not only was I coming off a productive sabbatical, but also several years of faculty senate leadership during which I taught just a single course per term. Now, scanning the class lists, I recognized almost none of the names. Just when would I tell these strangers about my forthcoming absences? To make a beginning-of-the-term announcement, I thought, could provoke in them much anxiety and uncertainty.

Because my teaching credo has always centered on students embracing the course’s subject, not the professor who delivers it, I decided that these classes simply could not begin with a piercing spotlight aimed my way. No dramatic medical confessions delivered while they gripped still-warm syllabi, no “I’m Professor Franciosi and I have cancer.” Even the likely success of my treatment made me uneasy. While teaching a new course on Holocaust memoirs, did I want to be labeled a “survivor”?

My plan, then, was to make sure the courses were well-launched for a month or so before telling students about my forthcoming absence. What I offered in a too-reticent and, I now see, over-mysterious fashion was an announcement that a scheduled surgery would cause me to miss two or three weeks after spring break. I explained that my wife, Jo Ellyn, would take over for me, and I shared her academic credentials. During the week before my procedure she even attended the classes.

My vagueness, of course, left the dirty work to her. While I was home convalescing, she addressed matters head-on. No doubt they’d wondered about the unspoken cause for my absence, about this nameless “surgery” and its seriousness. She explained that I had had a prostatectomy to treat early-stage cancer, adding that all my pathology reports were excellent and I would certainly be returning in the near future.

Besides some cards and general e-mailed good wishes from students, I heard from one whose father had been treated for prostate cancer. In preserving my privacy, I realized, I had deprived her of the kind of generous counsel that I’ve tried to model as a teacher/mentor. The loss was more than just my own. Even so, I knew my emotive limits, the boundaries in the student-faculty relationship that I simply would not, perhaps could not, cross.

I was home recuperating when author Sandi Wisenberg visited our campus. Co-director of the MFA program in creative writing at Northwestern, she had caught my attention a few years ago with an essay collection, Holocaust Girls. Less than a week after my surgery I did an e-mail interview with a student reporter for a piece on Wisenberg’s appearance, commenting in detail on why the Holocaust still draws so much attention.

None of my comments were used; instead, the reporter focused on Wisenberg’s latest book, The Adventures of Cancer Bitch, which I soon learned was based on her Cancer Bitch blog. The heading for that ongoing Web diary grabbed me by the sweatshirt collar: “One Feminist's Report on Her Breast Cancer, Beginning with Semi-Diagnosis and Continuing Beyond Chemo. You don't have to be Jewish to love Levy's rye bread, and you don't have to have cancer to read Cancer Bitch.”

In fact, that blurb doesn’t do justice to the wide-ranging, provocative, and courageous blog. Consider just one feature: a set of photographs runs along the page’s right side with images of breast-shaped food from Wisenberg’s “Farewell to My Left Breast Party.” They are interspersed with others in which her head is being shaved, an inevitable chemotherapy ritual, and then decorated with peace signs and the words “U.S. out of Iraq.”

Reading the intriguing blog’s revelations and information, though, scared me straight. I could never share so much with my students. Yet Wisenberg’s sense of humor, aptly compared to that of Fran Lebowitz, spoke to me as my return to teaching drew closer. I began imagining fantastic ways of pulling it off. One involved my wife pushing me into the classroom in a wheelchair. I’d be wrapped in a blanket and weary-looking, only to spring up from my seat, clad in a tuxedo and shouting like Gene Kelly in Singin’ in the Rain, “Gotta dance! Gotta dance!”

When the time finally came, my return to the classroom happened, as T.S. Eliot once wrote, not with a bang but a whimper. I eased back on a Friday, when the Holocaust memoirs course was my only class and Jo Ellyn and I could share the teaching duties. Actually, I covered only the last 15 minutes. At the hour’s end I thanked the students for their thoughts and, most importantly, their willingness to carry on without me. Then I asked them to join me in thanking my more than able substitute.

After the applause died down, they shuffled out into the afternoon light of an early spring Friday. Jo Ellyn and I drove the 30 minutes back to Grand Rapids and lunched at a restaurant a few blocks from home. I had built up the drama of my return far more than necessary, just as I had over-thought what to share with my students, and when. Two minutes after walking through the front door at home, I was ready for a brief nap. They hadn’t laid an emotional glove on me, I thought.

Then I slept for almost four hours.

Rob Franciosi

Rob Franciosi is professor of English at Grand Valley State University.

The Death Guy

I want to stop thinking about death. I want to stop writing about death. I want to stop being interviewed about death. But people keep dying, and the dead keep returning, and I’m not getting any younger, and everyone is trying to figure out how best to live with death, especially us aging baby boomers whose religious appetites are no longer fully satiated by traditional religions.

I have written two books on the history of death, The Sacred Remains: American Attitudes Toward Death, 1799-1883, and Rest in Peace: A Cultural History of Death and the Funeral Home in Twentieth-Century America, and my new book, Sacred Matters: Celebrity Worship, Sexual Ecstasies, the Living Dead, and Other Signs of Religious Life in the United States, ends with a chapter entitled “Death” (which closes with a discussion of two familiar ghost gods, Elvis and Tupac).

I used to joke with people after each of the first two books that “Now I’m really done with death,” and they would laugh and I would laugh and we would move off the uncomfortable topic quickly. But somewhere deep down I knew the superficial levity could not squash the serious truth: I won’t be done with death until Death has done me in.

For the last decade or so I have taught a death and dying course at Emory University. When I first offered the course, there were eight or so students; last year when I taught it, 60 enrolled. As a teacher it is supremely gratifying to know when you’ve truly nailed a class, and this class is a blast for me and for the students, who come in blissfully ignorant of the varied, confusing, strange range of experiences of death through time and across cultures. We cover Hindus and Buddhists, Muslims and Jews, and really dig into death in Christian history (the cult of saints is a winner every time), before finally turning to the American context. The connections between death and religion in popular music and films, histories of medicine and race, funeral homes and virtual space, as well as in other social experiences, blow their minds and expand their intellectual horizons. They also teach me a thing or two each year.

It works every year and yet for some misguided reason I thought it would be good to try something new for a change, and so I decided to teach religion and sexuality next term. I imagined it would be an easy shift from the one topic to the other. Now that I’m preparing the class, I’m already longing for the death lectures, site visits (Emory Hospital morgue; local funeral home; Oakland Cemetery; you get the picture), and lugubrious images. Ironically, I’m not too sure how to spice up the sex class.

Interviews with journalists and public writings on the topic have been a staple of my professional life for the last several years, even though I initially resisted getting dragged into media frames as a young assistant professor when Princess Diana died in the summer of 1997. I foolishly assumed it would be disrespectful to comment on such a tragedy in the white heat of the moment and, quite frankly, that it would lessen my chances for promotion to tenure, since many faculty in the old days questioned the value of public scholarship. But to this day I regret not talking, since the media missed so much of the religious valences and values of the public’s response. When John F. Kennedy, Jr. died a few years later and the media inquiries began, I did not hesitate to talk about the public mourning and widespread sorrow, and even ended up appearing on NBC News, which made Emory and my parents ecstatic.

Through the years there have been scandals (don’t get me started about the Tri-State Crematory horror in North Georgia a few years ago) as well as a surprisingly consistent media interest in the changing shape, contours, and content of funeral practices in America. In the last few weeks, Michael Jackson’s death and the Burr Oak Cemetery scandal set off a frenzy of media calls, blog posts, and radio interviews. The media preoccupation both reflects and fuels the public’s fascination with these deathly matters. So, when death is in the news, who are you going to call?

Still I have to confess I feel a cringe (tinged with nausea) when I hear someone say “you’re the go-to expert on death rituals.” Not exactly a badge of honor, but it could be worse, right? I could be an expert in, say, Protestant theology, or Reform Judaism. While it’s easy to complain about my lot in life, there is no doubt that death has been very good to me and brought me an unexpectedly rich livelihood.

Unfortunately, the older I get the less comfortable I am with mortality in general. It is difficult to engage with death “in theory” when more and more of my friends are struggling with parents who are dying, when people my age are dying off from cancer and other ailments, when just about every day another obituary for a public figure shows up in the news to remind all of us that death is indeed the Great Equalizer for the famous and the fameless, wealthy and poor, fortunate and unlucky.

I do take solace in knowing I’m not the only one with these morbid obsessions. A brief spin through the channels on the television confirms it; a careful perusal of movies at the video store proves it; even the music we listen to speaks to it. The first line of the last song on the new Wilco CD, Wilco (The Album) sums it up beautifully and brutally: “Everything alive must die.”

So how do we live with death? A question as old as human life, of course, and a question whose urgency changes over the course of a lifetime. The older I get the more people I know die, a realization as disturbing as it is enlightening. But also a reality that in some strange way can bring solace and wisdom and despair and clarity and confusion. In other words, a mixed though challenging assortment of possibilities that would make a great topic for an op-ed.

Gary Laderman

Gary Laderman is chair of religion and professor of American religious history and cultures at Emory University. He is executive editor of Religion Dispatches.

Furlough Friday

My partner Georg had the first of his 14 “Furlough Days” yesterday. Georg is a Cal State faculty member, and his furlough days are an indication of things to come for me. I teach Chicano studies, and I am required to take two furlough days per month in the coming 2009-10 academic year. So what did we do? We shopped, ate lunch, and went to a movie. As Northridge's president said, “Let me simply … add that a furlough means that you don't work. That you're not supposed to work. That we don't expect you to work.” So we didn’t. We saw Julie & Julia, a story about a woman (about to turn 30, a milestone that causes her to reflect on the unfinished things in her life) who blogs her way through cooking Julia Child’s classic recipes. While the movie was so-so, the idea of “writing one’s way through” difficulties (even writing in order to find a reason for being) appealed to my inner writer and my outer writing teacher. Given our difficult moment, and the fact that we both will be affected (eventually) by the furloughs, writing about this seems apropos. So here goes….

Although the University Corp (a separate money-making entity of the university) is not really affected by the state budget cuts, in a symbolic demonstration of campus solidarity (to share the pain) the Corp offices are closed for the campuswide furlough days, Friday and Monday; in an interesting use of language, these are now called “campus closure days.” This move, however, requires that Corp employees take vacation days to cover the down time at work (this is the pain).

One should be allowed to work, especially when one can, right? But we are in a weird time. One of my colleagues, a recently appointed administrator, admits that what is happening at Cal State is unprecedented. (Haven’t we heard that word used before in the past 10 years?) No one in the chancellor's office is giving much information, although there are “frequently asked questions [presumably answered?]” on the Web sites of both my campus and my faculty union. In a time when leadership matters, however, the chancellor is silent. This is, of course, aside from the pathetic video message (complete with crumpled white shirt) he sent to faculty, in which he implored us to “tell friends and family what the CSU does.” (No wonder over 70 percent of the faculty used the vote to also voice their lack of confidence in the chancellor: “Of those voting, only 4 percent said they had confidence in the chancellor’s leadership. 79 percent said they voted ‘no confidence’ and 17 percent responded ‘don’t know.’”)

One of my right-wing neighbors found this an appropriate time to let me know her indignation with public education by saying, “Well, Renee, you ARE always on vacation.” Her husband followed up with a question: “So why is public education free?” Telling friends and family what the CSU does in this moment would be tantamount to giving history lessons (about John Dewey and Jane Addams) and reading off a list: “Well, during those ‘vacations’ I am often working on grants, writing my own research, mentoring students, attending classes, reading books, preparing for classes in the fall…” and the list goes on and on.

At the end of Georg’s furlough Friday, I finished reading a manuscript I am reviewing. But for most working-class people, this sort of “work” (if they even call it that) is nothing compared to what they do when they go to work -- or so the thinking goes. Indeed, my right-wing neighbor has also been telling me about the terrible things that are happening at her job, about the vulnerability of older workers who are being forced into retirement, sometimes with good settlements and sometimes not, sometimes with their health insurance paid for and sometimes not. But management is fine. One of her bosses, after axing several employees, was promoted to vice president, along with a very nice bonus and salary.

In these unprecedented times, no one really knows what to do. Cal State faculty in a close tally voted to accept the furlough in lieu of systemwide layoffs and reductions in staff. On our department’s listserv, people are going back and forth voicing their concerns about serving students while reducing “work time” by 9 or 10 percent. One colleague said that he felt it was immoral to under-serve students (who are under-served by education anyway). And what, he asked, are we to do anyway (in situations where faculty teach online): reduce our class content by 10 percent? Ask students to read only 90 percent of a book? Then we are supposed to take days off during the term. I do not know if my chair was joking when he said, “Just think…. You could go to a conference!” Wait a second, I nearly screamed. When I am at conferences, I am working, shoring up my skills so that I am a better professor and a better researcher.

At the end of the day, faculty will have a pay cut. We won’t work less. We will do amazing things with the resources we have and stretch those resources to the limit in order to bring students a quality education. Our amazing chair of some years ago put it succinctly when she said, “Like good Mexicans, we do a lot with very little.” In our department, I know, we will continue to be collegial, to watch out for each other, to do a lot with what we have. But from what I read in the newspapers, this is only the beginning of a very bad time. The real blood in the streets is supposed to flow next year, when there are no furloughs, when we will face layoffs of faculty and staff.

Renee Moreno

Renee Moreno is associate professor of Chicano Studies at California State University at Northridge.

The Accidental Celebrity

“There are two modes of establishing our reputation: to be praised by honest men, and to be abused by rogues. It is best, however, to secure the former, because it will invariably be accompanied by the latter.”

-- Charles Caleb Colton, Anglican clergyman (1780-1832)

One deleted e-mail marked the beginning of my ordeal. It was finals week, just before Christmas break, when I received a strange message asking me to comment on some kind of online political essay that I had supposedly written. Since I’m not a blogger and make it a point to avoid the many rancorous political forums on the Internet, I immediately dismissed it as spam and hit delete.

But the notes kept coming, increasing in their fervor and frequency, until I could no longer deny it: I was receiving “fan mail.” Some writers called me courageous. Others hailed me as a visionary. A few suggested that I was predestined to play a pivotal role in the apocalyptic events foretold in the Book of Revelation. (Seriously.) Now, over the past 12 years I have published a scholarly book and eight journal articles on various historical topics, but I have to admit that through it all I never even attracted one groupie. So with my curiosity very much piqued, I began an online quest in search of the mysterious article.

I suppose it was inevitable that I was not going to like what I found. There, prominently displayed on a rather extreme Web site, was an essay (information about it can be found here) that likened President Obama to ... Adolf Hitler. Underneath the title was the inscription “by Tim Wood.”

To say I was not pleased would be a colossal understatement. However, even though my parents always told me I was special, a quick Internet search will reveal that I am not, in fact, the world’s only Tim Wood. So I ignored the article -- at least until one of the versions of the essay being forwarded via e-mail mutated into a form which included the rather unambiguous phrase “Professor of History, Southwest Baptist University.” The writer of this message also helpfully appended my office phone number and e-mail address.

Stunned, I struggled to regain my bearings and tried to grasp the full implications of this professional identity theft. Beyond the fact that the comparison is utterly ridiculous (anyone who believes that truly has no understanding of the depths of evil plumbed by the Nazi regime), it was now personal. Who had the right to speak for me like that? How dare they hide behind my name! What if my colleagues -- or my friends and family -- read this and believed it?

But the most pressing question seemed to be what kind of damage control would be necessary in order to prevent this from irreparably damaging my career. And that, in turn, led me to begin reflecting on how scholars will need to safeguard their professional reputations in the 21st century. Although I would never wish this kind of ordeal on anybody, the realist inside me fears that I will not be the last professor to fall victim to digital dishonesty. As academics, we must be aware that our professional reputations are transmitted through the technology of a bygone era, and even then are typically shrouded in secrecy or obscurity. Mentors, colleagues, and administrators exchange sealed and confidential references printed out on university letterhead. Editors, referees, and reviewers validate our scholarly work by allowing us access to or giving us coverage in their publications, but the results of that process all too often lie buried in library stacks and academic databases. In the meantime, the malicious or misinformed denizens of the Web have had time to hit the “forward” button about a million times.

So what lessons have I learned through this ordeal? First of all, be proactive. Once these rumors hit a certain critical mass, ignoring them will not make them go away. Indeed, a situation like this becomes the ultimate test of one’s personal credibility in the workplace. Immediately after I discovered that my specific identity had become attached to that particular article, I treated myself to a tour of the university’s administration building. Everybody from my department chair, to my dean, to the provost, to the directors of human resources, information technology, and university relations heard my side of the story within 48 hours. In my case, I was fortunate enough to have retained the confidence and support of my administration. There is no substitute for goodwill.

Secondly, I tried to remain positive and to find the teaching moment hidden within all of this. I posted an item on the university’s faculty Web page that served both as a public disclaimer and as an opportunity to emphasize to students (and anybody else who might read it) why faculty constantly warn against an uncritical acceptance of materials found on the Internet. I reminded my readers that in history, scholars are trained to constantly analyze their sources. Historians must always be aware that the documents they are working with may contain errors, lies, omissions, or distortions, or may even turn out to be wholesale forgeries. To navigate those potential pitfalls, scholars check facts and look for other documents that confirm (or contradict) the information found in our sources. We seek to identify the author and understand his or her motives for writing. We try to understand the larger historical and cultural context surrounding a document. By doing our homework, we are better able to judge when people deserve to be “taken at their word.”

This episode has also taught me a tough lesson in maintaining a professional demeanor, even in the face of outrageous provocations. Although the majority of people who wrote to inquire about the article were gracious, and many even apologized for the mistake, enough of my correspondents were belligerent and rude to make me dread opening my inbox every morning. Even after learning I was not the author, many readers clearly still expected me to lend my professional credibility to the essay, vouching for its accuracy and validating its interpretations. After reading my denial (where I explicitly refused to endorse the article’s contents), many supporters of the piece became abusive, writing back to attack the depth of my patriotism, the sincerity of my religious faith, and the integrity of the academic community in the United States in general.

Critics of the essay were not above lashing out either -- even in the absence of evidence. One disgruntled detractor wrote to inform me that my brand of “voodoo” and “fear-mongering” would soon be vanishing into irrelevancy, heralding the advent of a new Age of Reason. (Hopefully that individual’s definition of reason will eventually grow to include a commitment to basic research and fact-checking and an unwillingness to take forwarded e-mails at face value.) In the meantime, along with the angry rants, there came from the fever swamps of political paranoia long-discredited conspiracy theories, urging me to consider that the course of history was being determined by Jewish bankers, or the Jesuits, or the Illuminati, or even flesh-eating space aliens. Frequently at those junctures, I felt the temptation to fire back with a “spirited” and “colorful” rebuttal. However, I resisted for many reasons: because I am ultimately a firm believer in civility in public debate, because I did not want to embarrass the colleagues and administrators who had stood by me through this, and because arguing with people who have already made up their minds and have come to demonize those who disagree is almost always an exercise in futility.

Moreover, this incident has led me to reconsider my somewhat adversarial relationship with technology. (I’m the guy who still refuses to buy a cell phone.) But one of the greatest difficulties I encountered in all of this was finding a platform from which to launch a rebuttal. Although I did write personal replies to many of the people who wrote me inquiring about the article, it seemed clear that such a strategy alone was like battling a plague of locusts with a flyswatter. Instead, Internet rumors are best refuted by channeling people toward some definitive, universally available, online point-of-reference (a Web address, for instance) that exposes the lie. In my case, the university was kind enough to grant me access to a page on its Web site, and I quickly began disseminating the link to my posting. However, that solution may not be available to everyone who falls victim to this kind of a hoax, and I am beginning to believe this issue is far too important for faculty to leave to others anyway. A year ago, I would have considered the creation of an “official Tim Wood Web site” to be pretentious in the extreme. Today, I’m not so sure. Like it or not, faculty are public figures, and if we do not take the initiative to define ourselves in ways that are accessible and relevant to those outside the academy, we risk being defined by others in ways that suit their agenda, not ours.

Finally, confronting this situation has led me to take a fresh look at the qualities that make a good historian. In 1964 Richard Hofstadter, an influential scholar of American politics, wrote an article for Harper’s Magazine entitled “The Paranoid Style in American Politics.” In the following passage, he describes a paranoia all too familiar in today’s political discourse:

As a member of the avant-garde who is capable of perceiving the conspiracy before it is fully obvious to an as yet unaroused public, the paranoid is a militant leader. He does not see social conflict as something to be mediated and compromised.... Since what is at stake is always a conflict between absolute good and absolute evil, what is necessary is not compromise but the willingness to fight things out to a finish. Since the enemy is thought of as being totally evil and totally unappeasable, he must be totally eliminated -- if not from the world, at least from the theatre of operations to which the paranoid directs his attention.

As author Dick Meyer pointed out in a 2005 CBS News article, this mentality has come to transcend political labels:

The great dynamic is that so many people .... are convinced that a malevolent opponent wants to destroy their very way of life and has the power to do so. Evangelical Christians may believe that gay marriage, abortion rights, promiscuous and violent popular culture, and gun control are all part of a plot to destroy their community of values. Urban, secular liberals may believe that presidential God-talk, anti-abortion legislators and judges, intrusive Homeland Security programs, and imperialist wars are part of a sinister cabal to quash their very way of life.

Historians often find themselves compared to storytellers, and are lauded for their ability to present compelling interpretations of the past and to craft powerful narratives. But perhaps equally as important is our role as listeners. In an increasingly divided society, consensus will never be achieved by shouting (or e-mailing) until we drown out all competing voices. Instead, the first steps toward reconciliation come by those who seek to understand all aspects of the question and try to remain mindful of the needs of others.

In any case, my battle continues. Monday I will go to work, try to sort through all the chaos, and do my best to help folks figure out the truth. (Which is probably pretty close to what I did before my identity was stolen, come to think of it....) And I will continue to contemplate the ways in which this experience will change the way I present myself as a professor and a historian. In the meantime, if any of you encounter any online rantings and ravings that claim to be by me, do not necessarily believe them. Things are not always what they seem.

Timothy L. Wood

Timothy L. Wood is an assistant professor of history at Southwest Baptist University in Bolivar, Missouri. He is the author of Agents of Wrath, Sowers of Discord: Authority and Dissent in Puritan Massachusetts, 1630-1655 (Routledge).

