Essay on how much a professor learned about teaching from his mentor

In the summer of 1996, I spent two weeks driving around Greece with my girlfriend and my undergraduate adviser. We argued all the time: me and my girlfriend; me and my adviser; my girlfriend and my adviser. One stop was particularly memorable for its unenjoyableness. We spent a day and a night at Monemvasia, a fortified Crusader town on a massive rock off the coast. The whole time, my adviser badgered me to learn more about the extensive history of the place and turned his nose up at my girlfriend, who wanted to find a nightclub on the island.

To be fair, my adviser was not actually on the trip. He was in my head, or rather, I had internalized him. I couldn’t have a conversation without hearing him remark on the substance (or lack thereof) of my comments. He haunted my relationships and my thoughts. I carried him everywhere, like Anchises on my shoulders.

As my adviser would have pointed out, that was Sartre's metaphor for the superego, which he (Sartre, not my adviser) claimed not to have, his father having died when he was two. And perhaps that’s all my adviser was, a pumped-up academic superego, driving me to know more, to be less dumb, to write better.

He -- and he had a name, Antoine Raybaud, and a face: sea-blue eyes that burned when he stared, a beaked nose, a broad smile and unruly gray curls -- would have given growth hormones to anyone's superego. His lectures were like Stéphane Mallarmé's salons, two hours of noteless improvisations on poetry and artistic creation. His seminars were fearsome: like a cat toying with its prey, he would hide the answers to his questions in ambiguous phrases, leaving us dangling in confusion. When the inevitable wrong answer was proffered, he would bat us away with a “Non, non, c’est pas ça du tout.” (“No, no, that’s not it at all.”) And then a long, oppressive silence would ensue, until another foolhardy student would offer up a sacrificial comment.

He taught in French, because this took place at the University of Geneva, where I was a student. Raybaud himself was French, a graduate of Normale Sup’, the elite French university for future academics. He had come to Geneva to replace the legendary Jean Starobinski, one of the greatest literary critics of the twentieth century. I had known none of this when choosing French as my main subject at the university. But Raybaud was well aware of his place in institutional history: perhaps he could hear Staro’s voice in his own head, belittling his lectures.

The atmosphere in Raybaud’s seminars was so tense that every detail of that room is seared into my memory. The tables, arranged in a long rectangle, with a no man’s land in the middle; the door to the hallway, at the center of the room, always slightly ajar; a mobile whiteboard in front of one window; and then, beyond, the tantalizing views of the Salève mountain and the chestnut trees in the Parc des Bastions -- their beauty all the more wrenching when students were driven to tears by Raybaud’s caustic remarks on their presentations.

I didn’t have to take his classes. Still, a tiny group of us kept on coming back. Despite the hardships, Raybaud’s classes were mesmerizing. He interpreted texts like a magician, making meaning appear where we could only see words. The seminars became less painful, as Raybaud slowly warmed to us. But he never relented in his expectations. Every single paper I submitted to him, from my first essay to my final thesis, he made me rewrite. Once, on my way to his office, I bumped into him in the hallway; he glanced at the first few paragraphs of my assignment, then handed it back, saying, “Allez, refaites-moi ça.” (“Do it over.”) I went home and spent hours trying to figure out what I had done wrong. Eventually I rewrote the entire paper; even I could tell that it turned out much better.

Natacha, Bernard and I were his last students; he retired the year we graduated. His last seminars were luxurious: we spent six months, just the four of us, reading “Un Coup de Dés.” During that last seminar, it became clear we were initiates. We had come close to being broken, but had broken through.

I often wonder whether Raybaud’s tough love wasn’t the best pedagogy I could have received. I don’t dare repeat his method on my own students. But I fear I may be failing them by being too friendly, by not pushing them to their limits, not giving them a chance to surpass themselves. This is not a teaching style for all students, to be sure. But I know that without his punishing comments, I would be a lesser scholar today.

While Raybaud did not do much to enhance my vacations or relationships, having that voice in my head, for so many years, was not always a bad thing. By forcing me to rewrite every paper I handed in, he turned me into my toughest critic. I needed to internalize him, if I wanted to make any progress. Had Raybaud merely told me what was wrong with my arguments, I would never have learned the most important lesson of all: how to spot my own weaknesses.

After six years -- which was how long most of us spent in our studies, since we paid next to nothing -- the demon who had hounded me across Greece had become a friend. We were close, if not intimate: I could never bring myself to call him “tu,” even though he encouraged me to call him Antoine. He shared his own disappointments, or what he called his “insuccès”: not failure, but something almost worse, a lack of recognition.

But his vulnerability taught me a final lesson. Anchises is not really someone else, only your own voice in disguise. Raybaud was the name I gave to a part of my mind I’ve since recognized as my own. He no longer disrupts my vacations or family life, but without his rigor, there is a side of myself I may never have discovered. Antoine died in 2012, and I never had the chance to reveal just how much I owed him. But a part of him will always be a part of me, reminding me that, no matter how painful or tiring, I should really rewrite that paper one more time.

Dan Edelstein is professor of French and, by courtesy, history, and the W. Warren Shelden University Fellow in Undergraduate Education at Stanford University.

Essay on diversity issues and midcareer faculty members

Midcareer minority faculty members face particular challenges, write Joya Misra and Jennifer Lundquist.

A Toni Morrison novel makes it into a curriculum known for its classical roots

Columbia's core literature course, long criticized for its lack of diversity, adds a novel by Toni Morrison -- the first work on the list by a living author, and the first by a nonwhite author.

Commentary on American mass shootings

Only satire can look certain horrible realities in the eye, as The Onion did with its article from last year about a lone-wolf mass shooting of random strangers. Its headline cut to the quick: “‘No Way To Prevent This,’ Says Only Nation Where This Regularly Happens.”

It’s the real American exceptionalism. Rampage shootings do take place in other countries (the 1996 Dunblane school massacre in Scotland, for example), but rarely. They remain distinct events in the public memory, rather than blurring together. In the United States the trauma is repetitive and frequent; only the location and the number of victims seem to change.

With Charleston we have the additional grotesquerie of a presidential candidate calling Dylann Roof’s extremely deliberate act an “accident” while the director of the Federal Bureau of Investigation made a point of denying that it was terrorism. (The shooter was an avowed white supremacist who attacked an African-American church and took care to leave one survivor to tell the tale. By no amount of semantic weaselry can it be described as anything but “[an] act of violence done or threatened in order to try to influence a public body or citizenry,” to quote the director's own definition of terrorism.) But American rampage shootings do not always express an ideological agenda, or even a motive intelligible to anyone but the gunman. The meaninglessness of the violence, combined with its regularity, is numbing. So with time our scars grow callused, at least until the next spree rips them open again.

A few years ago Christopher Phelps, an intellectual historian who happens to be my closest friend, moved with his family to England, where he is now a senior lecturer (i.e., an associate professor) in American and Canadian studies at the University of Nottingham. At some point the British media began turning to him for commentary on life in these United States. “I tend to be asked on radio shows when there's a need for American expertise -- and implicitly an American accent, which adds an air of authenticity,” he wrote in an email when I asked him about it.

Among the topics he’s been asked about are “the re-election of Obama, the anniversary of JFK's death, and even what comprises the East Wing of the White House, since one only ever hears about the West Wing.” Of late, America’s everyday mayhem keeps coming up. In 2013 he discussed the Trayvon Martin case. Last August, it was the girl whose training in automatic weapons on an Arizona firing range ended when she lost control and sprayed her instructor with bullets. Phelps appeared on a nationally broadcast talk show hosted by Nick Ferrari, which seems like the perfect name for a bigger-than-life radio personality.

Ferrari wasted no time: “What is it with Americans and guns?” he asked. A fair question, though exceedingly blunt.

“I should have anticipated that, I suppose,” Phelps says now, “but I froze like the proverbial deer in the headlights, stuttering away.” Since then, unfortunately, he has gained experience in answering variations of the question. “The producers need people to do it,” he explains, “the university media team work hard to set up the gigs, and you feel as an American you should step in a bit to help modulate the conversation, but it sweeps away my life for a day or two when I have other plans and some psychopath shoots up America.” (He was interviewed for a BBC program following the Charleston shootings.)

“It is still depressing,” Phelps continues, “in fact draining, to be put in the position of explaining my people through this kind of event, but reflection has prompted some better ways of answering.”

A one-sentence question about the American pattern of free-range violence takes many more to address at all concretely. Phelps's assessment bears quoting at length:

“While I'm as drawn to generalities as anyone -- I've always thought there was something to H. Rap Brown's declaration that ‘violence is as American as cherry pie’ -- it’s important to realize that most American households do not possess guns; only a third do. So gun owners do not comprise all Americans but a particular demographic, one more white, male and conservative than the general population.

“The shooters in mass killings, likewise, tend to be white men. So we need to explain this sociologically. My shorthand is that white men have lost a supreme status of power and privilege, given a post-’60s culture claiming gender and racial equality as ideals, yet are still socialized in ways that encourage aggressiveness.

“Of course, that mix wouldn't be so dangerous if it weren't easy to amass an arsenal of submachine guns, in effect, to mow people down. Why do restrictions that polls say Americans consider reasonable always get blocked politically, if gun-owning households are a minority? For one thing, the gun manufacturing corporations subsidize a powerful lobby that doubles as a populist association of gun owners. That, combined with a fragmented federalist system of government, a strongly individualist culture and the centrality of a Constitution that seems to inscribe ‘the right to bear arms’ as a sacred right, makes reform very difficult in the United States compared to similarly positioned societies. This suggests the problem is less cultural than political.”

Following the massacre of 26 people, most of them children, at Sandy Hook Elementary School in Connecticut in 2012, National Rifle Association executive vice president Wayne LaPierre waited several days before issuing a statement. Whether he meant to let a decent interval pass or just needed time to work up the nerve, his response was to blame our culture of violence on… our culture of violence.

He condemned the American entertainment industry for its constant output of “an ever more toxic mix of reckless behavior and criminal cruelty” in the form of video games, slasher movies and so forth. The American child is exposed to “16,000 murders and 200,000 acts of violence by the time he or she reaches the ripe old age of 18” -- encouraging, if not spontaneously generating, LaPierre said, a veritable army of criminals and insane people, just waiting for unarmed victims to cross their paths. “The only way to stop a monster from killing our kids,” he said, “is to be personally involved and invested in a plan of absolute protection.”

The speech was a marvel of chutzpah and incoherence. But to give him credit, LaPierre’s call for “a plan of absolute protection” had a sort of deluded brilliance to it -- revealing a strain of magical thinking worthy of… well, when you get right down to it, a violent video game. In a society presumably full of people eager to act out their favorite scenes from Natural Born Killers and American Psycho, having enough firepower will give you “absolute protection.”

On many points, Firmin DeBrabander’s book Do Guns Make Us Free? Democracy and the Armed Society (Yale University Press) converges with the analysis quoted earlier from my discussion with Christopher Phelps. But DeBrabander, an associate professor of philosophy at Maryland Institute College of Art, places special emphasis on the corrupting effect of treating the Second Amendment as the basis for “absolute protection” of civil liberties.

The vision of democracy as something that grows out of the barrel of a gun (or better yet, a stockpile of guns, backed up with a ready supply of armor-piercing bullets) involves an incredibly impoverished understanding of freedom. And it is fed by a paranoid susceptibility to “unmanageable levels of fear,” DeBrabander writes, and “irrationalities that ripple through society.”

He turns to the ancient Stoic philosophers for a more robust and mature understanding of freedom. It is, he writes, “a state of mental resolve, not armed resolve. Coexisting with pervasive threats, Seneca would say, is the human condition. The person who lives with no proximate dangers is the exception. And it’s no sign of freedom to live always at the ready, worried and trigger-happy, against potential threats; this is the opposite of freedom.” It is, on the contrary, “a form of servitude,” and can only encourage tyranny by demagogues.

“Freedom,” DeBrabander goes on to say, “resides in the ability to live and thrive in spite of the dangers that attend our necessarily tenuous social and political existence -- dangers that are less fearsome and debilitating to the extent that we understand and acknowledge them.” It is only one of many good points the author makes. (See also his recent essay “Campus Carry vs. Faculty Rights” for Inside Higher Ed.) And the certainty that another mass shooting will take place somewhere in the United States before much longer means we need all the stoicism we can get.

Ohio's largest community college receives unprecedented gift for humanities program

In an era when the humanities are overlooked or derided by politicians, one Ohio community college landed a $10 million grant to boost the liberal arts.

Colleges award tenure

The following individuals have recently been awarded tenure by their colleges and universities:

Iona College

  • Marcus Aldredge, sociology
  • Manuel Gomez, foreign languages
  • Andrew Griffith, accounting
  • Jaeyoung Kang, management
  • Joshua Klein, criminal justice
  • John Theodore, psychology

Middlebury College

  • Joyce Mao, history
  • Sarah Stroup, political science
  • Jessica Teets, political science

University of Kansas

Essay on being accused of being an anti-Israel professor

Let me tell you how I ended up on Jihad Watch. This is a tale of the new red scare wending its way across college campuses. More than an account of my own travails, this is an anatomy of how critical thought about Islam and Judaism, the Arab-Israeli conflict, anti-Semitism and anti-Muslim racism is today monitored in the academy with the goal of chilling reflection.

In March, at the University of Rochester, I gave a lecture entitled “Judeophobia and Islamophobia” in which I sought to consider the links between Muslims and Jews in contemporary European and American discourse and to put them into historical perspective. In attendance was an appointed watchdog for Campus Watch, A. J. Caschetta, a lecturer in English at the Rochester Institute of Technology.

In May, he published his “report” on my talk on the website of the Middle East Forum. It was a pastiche of falsehoods, innuendos and out-of-context quotes, entirely obfuscating what I actually said. I was accused of maintaining that Islamophobia has replaced Judeophobia, an indefensible position given the rising tide of anti-Semitism globally. It was also alleged that I deny Islamic Judeophobia, both past and present. These charges stem from the fact that I sought to consider the two forms of hatred in tandem. While it is demoralizing to suffer through this kind of defamation, the real harm is the way anti-anti-Semitic hit men like Caschetta feed hate speech.

I had a sense something had happened in the blogosphere when I began to receive anti-Islamic hate mail in my inbox, and requests for the lecture from as far away as Sydney, Australia. This happened because Campus Watch flies its flag under the auspices of the Middle East Forum, a well-financed initiative under the leadership of Daniel Pipes that monitors Middle East studies in the academy.

Campus Watch is part of a network of networks, including StandWithUs, AMCHAInitiative, the David Horowitz Freedom Center and most recently Canary Mission, linked to groups like Jihad Watch. Jihad Watch and these other fora send daily blasts to all those who sign up to receive them on their websites and use email and social media to share their message. Within this self-referential set of bubbles, each consumes the propaganda of their fellow warriors in what they describe as a war for hearts and minds. College campuses are thus key strategic territory in the battle since this is where young minds are shaped.

In the final chapter of The Origins of Totalitarianism, “Ideology and Terror: A Novel Form of Government,” Hannah Arendt suggested that what linked Stalinism and Nazism was the reduction of history to ironclad laws, whether of race or of class. What the two shared was the claim to possess the truth about the movement of history. Today the “clash of civilizations” has hardened into this new truth.

What I sought to accomplish in my lecture was a form of ideology critique. I did so by reflecting on a series of narratives that emerged in the wake of the Charlie Hebdo and Hyper Cacher murders -- narratives that split off the ideology of the Kouachi brothers and Coulibaly from the sociology of their marginalized experience as Muslims in France.

I insisted that such a split, whether by the right or the left, is untenable if we seek to understand such events. My example was Lassana Bathily, the Hyper Cacher worker of Malian Muslim background who saved Jews by hiding them in the freezer of the kosher market. His story short-circuits the narratives about an essentially radical Islam, as well as the story about how oppression leads people to terrorism as the weapon of the weak.

I went on to discuss the history of the concepts of anti-Semitism (which was coined in the 1870s, racializing the much longer history of anti-Jewish prejudice) and Islamophobia (which was birthed as a term only in this generation but whose history goes back to the Middle Ages).

Then I addressed the vexing question of whether anti-Semitism should be hyphenated. The minutiae of the hyphen actually have major consequences for how we think about the relationship between Muslims and Jews over time, and about how it has changed in the last century. Those scholars who refuse to hyphenate anti-Semitism insist that “antisemitism” applies only to Jews and has always applied only to Jews. They also tend to insist that antisemitism is a unique form of racism, wholly different from anti-black or anti-Muslim discrimination.

But in the 19th century, when the term “Semite” was defined in opposition to “Aryan,” this was carried out in scientific, literary and artistic works that not only racialized much earlier tropes of Jews, but also images of Arabs, Saracens, Turks and Muslims. The two groups were unified by their shared Semitic language family.

I referenced a set of historical examples of this long history: the Crusades, which gave rise to the first mass killings of Jews en route to liberating holy sites in Jerusalem held by Saracens; the Fourth Lateran Council (1215), which mandated marking not only Jewish but also Muslim clothing; the Spanish Inquisition, which targeted not only Jews but Moors; post-expulsion Europe, when 90 percent of Jews lived under the crescent of Islam; and the depiction of Jews as Turks in European painting, as in many works by Rembrandt. I also cited writers like Johann Gottfried von Herder, who called Jews the “Asiatics of Europe,” and Benjamin Disraeli, who said the Jews were an “Arabian tribe” and the Arabs “only Jews upon horseback.”

I then explained that the sometimes overlapping images of Jews and Muslims were definitively decoupled around the time that the construct “Judeo-Christian” made its historical appearance in the 1930s. “Judeo-Christian” was originally a formula used to appeal to Christians to aid Jews who were targeted for annihilation in Europe by the Nazis. It was effective because it stressed a shared lineage.

Following the Holocaust and with the creation of the state of Israel, Jews stressed their Judeo-Christian commonality, which over time was interpreted as the foundation of Western civilization, and later of American democracy and human rights. What made the decoupling definitive was that this was precisely the period when large swaths of the Islamic world began to demonize Jews in unprecedented ways, drawing upon the iconography of the European anti-Semitic arsenal, spurred by the Arab-Israeli conflict.

This quick history is certainly not the whole story of either Judeophobia or Islamophobia -- their linkages and disconnects -- and only the briefest outline of what I addressed in my public lecture. But in talking about rising Judeophobia globally since 2000, I ended by explicitly critiquing the position that was then used as the title of Caschetta’s article: “Are Muslims the new Jews?” The entire point of what I discussed was to problematize such one-sided views.

The ear attuned only to ideology, as Arendt defined it, is tone-deaf to such deconstruction. The real jihadists don’t want to think critically and contextually. The narrative of the “clash of civilizations” explains everything to them. This is as true of those warriors of the faith who seek to oppose the “Zionist-Crusader conspiracy” and restore the Caliphate as it is of those crusaders who pull Judeophobic passages from the Quran and insist they meant the same thing in the eighth century as they have come to mean in the new millennium, as Caschetta did during the Q&A session.

Ideology, as Arendt suggested, is underpinned by an ahistorical belief in the truth of one’s understanding of the motor of history. Ideology critique is what some corners of the academy, at their best, have to offer. This is precisely why the new McCarthyism monitors lecture halls with watchdogs. The Campus Watchers don’t want students to reevaluate and reframe the latest well-worn clichés. But not doing so stokes hate speech, and that can feed violence.

So what do I tell the members of my synagogue, fellow parents at the Jewish day school my kids attend, my colleagues in Jewish studies associations in America and Europe about why I ended up on Jihad Watch? I tell them the new McCarthyism has arrived.

Jonathan Judaken is the Spence L. Wilson Chair in the Humanities and professor of history at Rhodes College.

Essay about a professor who learns his son has discovered RateMyProfessors

There are professors who find student comments on their end-of-semester evaluations so upsetting that they cry after reading them. While my course evaluations have tended to be pretty good, I can still relate to how those faculty members feel, thanks in part to RateMyProfessors. Side by side on the site stand evaluations from students who gave me high marks and others who gave me low marks -- in the exact same areas. I can see a host of negative and mediocre rankings for classes I taught very differently some seven years ago -- and one that I never taught at all. I’ve peeked there from time to time and have tried to learn what I could from things students said there, but I have not spent a great deal of time on the site.

I’ve tended to view what one finds there as akin to YouTube comments -- most are gushing praise or insulting jibes, with very little middle ground. And most people will never comment, leaving the soapbox to those who feel strongly, having had a positive experience or, more likely, a negative one.

And so you can imagine my dismay when my son, who is in high school, told me that he had looked me up on RateMyProfessors. I'm sure a worried look crossed my face, but I tried my best to retain my composure. Apparently the site had been mentioned on Reddit recently, and anything that is featured prominently on Reddit, my son spots -- almost, but not quite, always before I do.

As it turned out, my apprehension was unnecessary. As the conversation progressed, I found myself really impressed by my son’s thoughts regarding the comments about my classes that he saw there. One student on the site had written, "I would advise not taking his class because he can't keep the class discussion going." Another complained, "He wasn't good at stimulating conversation." My son, despite still being several years away from university, was astonished by such comments. How, he asked, can students have the audacity to blame the professor for something that is the responsibility of the students themselves?

A conversation that began with fear and trepidation on my part ended with a sense of satisfaction. I had always taken comments on RateMyProfessors with a grain of salt. But it was reassuring to realize that a young person, still a student, without my prompting, could draw the same conclusion, based simply on what he knew about online reviews and things that he learned on Reddit. We often despair for humanity reading online comments, whether they are on YouTube, Reddit or RateMyProfessors.

Usually, when it comes to course evaluations, the fact that students are required/compelled/pressured to complete them means that one has a wider range of useful data to work with. If most students have filled in evaluations, and all the comments and ratings are similar, then you know that you really are doing a good/terrible job -- the results are statistically significant. If one only had course evaluations from students who hated or loved the class enough to fill them in, one would probably have a distorted perception of what one has accomplished, whether that perception errs on the side of being too negative or too positive. As with the complaints, the glowing praise of the course and its instructor that some students offer may have as much to do with their own work habits and motivation as with anything the professor did.

Comments (whether on official evaluations or RateMyProfessors) become less discouraging as one’s career progresses, because one becomes aware of what one is and isn’t able to control. I’ve taught two sections of the exact same class, with the exact same syllabus, back to back on the same days of the week in the same semester. The students from one of those classes gave me some of the highest ratings on the course evaluations that I’ve ever gotten; the students from the other gave me some of the lowest.

I learned some really important lessons from that experience. One was to avoid teaching two sections of the same course in the same semester if I can. But another was that you can do essentially the same things in a classroom, and they are guaranteed neither to succeed nor to flop.

Unless you are still approaching classes in the traditional lecture mode, with students expected to write down and reproduce what you say, your role is probably more like that of a coach. The same coach can work with two different groups of students on the same team at the same university, and they will not necessarily have comparable successes and failures. However much we hold coaches responsible, ultimately it is up to the players on the team to put the training they are given into practice, to translate it into effective playing in games. In the same way, the same course materials may work really well with one group of students and less well with another. That doesn’t mean the students were necessarily less hardworking. Sometimes it is about their prior knowledge or personality types rather than their motivation or diligence.

I’ve heard lots of faculty complain about “kids these days.” I think such complaints are misguided -- and not just because my son’s reaction to RateMyProfessors gives me great hope for “kids these days.” I think we as faculty members are prone to forget that, in many cases, we were not typical students as undergraduates. Those who go on to pursue Ph.D.s and become professors are often those who enjoy learning for its own sake. Don’t you remember there being others in your classes who didn’t participate in discussions, didn’t read beyond the bare minimum, if that, and were content just to drift through classes?

None of the things discussed here are due to new technology, either. Even before there was RateMyProfessors, students were spreading the word about professors. And students didn’t require electronic devices to be distracted or to tune you out and then rate you negatively for it. I remember early in my teaching career having my department chair tell me that he had heard from another faculty member, who had heard from a student, that I tend to drone on and on in an uninteresting manner in class. The chair sat in on my class soon after that. The discussion was lively, and he was thoroughly happy with it. But there was one student who sat flipping through a magazine or catalog the entire time -- with my department chair sitting right next to them! I can’t help but suspect that that student was the one who felt the class was boring.

It reminds me of this exchange in the Friends episode “The One With Joey’s Fridge”:

Monica: What’s the charity?

Rachel: I don’t know, something either trees or disease -- Ralph mumbles a lot.

Monica: Does Ralph mumble when you’re not paying attention?

Rachel: Yeah! It’s weird…

While there are exceptions, most students who are motivated and diligent do not find even a truly boring professor who mumbles a lot to be a hindrance to learning.

This is not to say that I haven’t undertaken efforts to improve. I have done so, even on the basis of comments on RateMyProfessors. One thing I never learned earlier in my career was how to use my voice properly. And so I took singing lessons -- in part because of musical interests I happen to have, but also because I suspected that doing so would improve the clarity of my communication.

I recorded my lectures using Panopto, partly because I wanted to try out the “flipped classroom” approach, but also in part because I wanted to take advantage of the opportunity this technology afforded to listen to what I sound like in class. I made conscious efforts to deal with “ums” and other verbal habits. And I think that these efforts have done more to give students a better experience in my classes than any changes I’ve made with respect to a syllabus or a textbook.

And so what’s the takeaway message of this experience? A number of things come to mind. One is the fact that some things will always depend on the student. Some students will take comments on RateMyProfessors seriously, and as a result may never take your class, or may take it and then be disappointed that you did not seem as stellar as that review on the website led them to believe you would be. You can do the same things and they may work well for some students and not for others. But that doesn’t mean that there aren’t things you can do to improve. And in the process of developing your teaching ability even into the middle of your career and beyond, you can model lifelong learning to your students in ways that may help them take responsibility for their own learning.

But ultimately, I think the biggest takeaway message is that there are students out there who have the understanding to perceive what a site like RateMyProfessors does and doesn’t tell you. And those students will, I suspect, be the very ones who will have the understanding to perceive what your course is about and engage with it in ways that are conducive to their own learning.

My son had a favorite comment about me from RateMyProfessors: one that described me as a “jolly leprechaun.”

Having been reminded about this comment, I am seriously tempted to make it my Facebook banner. I can live with being described as a jolly leprechaun -- especially by someone who appreciates meaningful discussions about the spirituality and philosophy of science fiction, and who is capable of spelling "leprechaun" correctly, to boot!

I hope this article will provide some encouragement to faculty who feel beaten down and discouraged as a result of comments on RateMyProfessors. But if it doesn’t make you feel better, then you can always try venting about your students in return. I don’t think that will convey any more useful information about what students are like than RateMyProfessors ratings do about professors. But ultimately, venting is about catharsis, and when we recognize that both students and professors do it at times, we will be better poised to learn what we can from the feedback we receive and, having done so, to then move on.

James F. McGrath is Clarence L. Goodwin Chair in New Testament Language and Literature at Butler University. He is the author of John’s Apologetic Christology and The Only True God and editor of Religion and Science Fiction (with Andrew Crome) and Time and Relative Dimensions in Faith: Religion and Doctor Who. He blogs at Exploring Our Matrix, where a precursor to the present essay first appeared.

Review of Edna Greene Medford, "Lincoln and Emancipation"

Reading the Emancipation Proclamation for the first time is an unforgettable experience. Nothing prepares you for how dull it turns out to be. Ranking only behind the Declaration of Independence and the Constitution in its consequences for U.S. history, the document contains not one sentence that has passed into popular memory. It was the work, not of Lincoln the wordsmith and orator, but of Lincoln the attorney. In fact, it sounds like something drafted by a group of lawyers, with Lincoln himself just signing off on it.

Destroying an institution of systematic brutalization -- one in such contradiction to the republic’s professed founding principles that Jefferson’s phrase “all men are created equal” initially drew protests from slave owners -- would seem to require a word or two about justice. But the proclamation is strictly a procedural document. The main thrust comes from an executive order issued in late September 1862, “containing, among other things, the following, to wit: ‘That on the first day of January, in the year of our Lord one thousand eight hundred and sixty-three, all persons held as slaves within any State or designated part of a State, the people whereof shall then be in rebellion against the United States, shall be then, thenceforward, and forever free….’”

Then -- as if to contain the revolutionary implications of that last phrase -- the text doubles down on the lawyerese. The proclamation itself was issued on the aforesaid date, in accord with the stipulations of the party of the first part, including the provision recognizing “the fact that any State, or the people thereof, shall on that day be, in good faith, represented in the Congress of the United States by members chosen thereto at elections wherein a majority of the qualified voters of such State shall have participated, shall, in the absence of strong countervailing testimony, be deemed conclusive evidence that such State, and the people thereof, are not then in rebellion against the United States.”

In other words: “If you are a state, or part of a state, that recognizes the union enough to send representatives to Congress, don’t worry about your slaves being freed right away and without compensation. We’ll work something out.”

Richard Hofstadter got it exactly right in The American Political Tradition (1948) when he wrote that the Emancipation Proclamation had “all the moral grandeur of a bill of lading.” It is difficult to believe the same author could pen the great memorial speech delivered at Gettysburg a few months later -- much less the Second Inaugural Address.

But to revisit the proclamation after reading Edna Greene Medford’s Lincoln and Emancipation (Southern Illinois University Press) is also a remarkable experience -- a revelation of how deliberate, even strategic, its lawyerly ineloquence really was.

Medford, a professor of history at Howard University, was one of the contributors to The Emancipation Proclamation: Three Views (Louisiana State University Press, 2006). Her new book is part of SIUP’s Concise Lincoln Library, now up to 17 volumes. Medford’s subject overlaps with topics covered by earlier titles in the series (especially the ones on race, Reconstruction and the Thirteenth Amendment) as well as with works such as Eric Foner’s The Fiery Trial: Abraham Lincoln and American Slavery (Norton, 2010).

Even so, Medford establishes her own approach by focusing not only on Lincoln’s ambivalent and changing sense of what he could and ought to do about slavery (a complex enough topic in its own right) but also on the attitudes and activities of a heterogeneous and dispersed African-American public with its own priorities.

For Lincoln, abolishing the institutionalized evils of slavery was a worthy goal but not, as such, an urgent one. As of 1860, his primary concern was that it not spread to the new states. After 1861, it was to defeat the slaveholders’ secession -- but without making any claim to the power to end slavery itself. He did support efforts to phase it out by compensating slave owners for manumission. (Property rights must be respected, after all, went the thinking of the day.) His proposed long-term solution for racial conflict was to send the emancipated slaves to Haiti, Liberia, or someplace in Central America to be determined.

Thanks in part to newspapers such as The Weekly Anglo-African, we know how free black citizens in the North responded to Lincoln, and it is clear that some were less than impressed with his antislavery credentials. “We want Nat Turner -- not speeches,” wrote one editorialist; “Denmark Vesey -- not resolutions; John Brown -- not meetings.” Especially galling, it seems, were Lincoln’s plans to reimburse former slave owners for their trouble while uprooting ex-slaves from land they had worked for decades. African-American commentators argued that Lincoln was getting it backward. They suggested that the ex-slaves be compensated and their former masters shipped off instead.

To boil Medford’s succinct but rich narrative down into something much more schematic, I’ll just say that Lincoln’s cautious regard for the rights of property backfired. Frederick Douglass wrote that the slaves “[gave] Mr. Lincoln credit for having intentions towards them far more benevolent and just than any he is known to cherish…. His pledges to protect and uphold slavery in the States have not reached them, while certain dim, undefined, but large and exaggerated notions of his emancipating purpose have taken firm hold of them, and have grown larger and firmer with every look, nod, and undertone of their oppressors.” African-American Northerners and self-emancipating slaves alike joined the Union army, despite all the risks and the obstacles.

The advantage this gave the North, and the disruption it created in the South, changed abolition from a moral or political concern to a concrete factor in the balance of forces -- and the Emancipation Proclamation, for all its uninspired and uninspiring language, was Lincoln’s concession to that reality. He claimed the authority to free the slaves of the Confederacy “by virtue of the power in me vested as Commander-in-Chief, of the Army and Navy of the United States in time of actual armed rebellion against the authority and government of the United States, and as a fit and necessary war measure for suppressing said rebellion.”

Despite its fundamentally practical motivation and its avoidance of overt questions about justice, the proclamation was a challenge to the American social and political order that had come before. And it seems to have taken another two years before the president himself could spell out its implications in full, in his speech at the Second Inaugural. The depth of the challenge is reflected in each week’s headlines, though to understand it better you might want to read Medford’s little dynamite stick of a book first.

Essay on how to talk about assessment in faculty job interviews

At many interviews for faculty jobs these days, you'll be asked about assessment. Melissa Dennihy offers some ideas on how to answer.
