For All to See
To disclose or not to disclose. That’s the question some institutions have faced in recent years regarding students' evaluations of professors.
A proposal at Northwestern University to make all evaluations available may go before the Faculty Senate soon, and it is attracting both praise and criticism. The debate comes amid the growing popularity of professorial reviews on Web sites that have no ties to universities -- and, critics charge, no quality control.
Nearly 30 years ago, Northwestern faculty members first agreed to have evaluations, provided that professors could keep them under lock and key if they so chose. But times have changed.
“[Evaluations] have become commonplace on campus,” said Stephen Fisher, associate provost for undergraduate education at Northwestern. “Students look to them, and so do members of the administration and faculty members.”
Fisher estimated that only about 5 percent of professors currently ask that their evaluations not be posted online.
Northwestern’s General Faculty Committee will vote on February 1 to decide whether the proposal to require that evaluations be available should go before the Faculty Senate later this year, at which point every faculty member would vote. Fisher said that students have been pushing for the change for some time, and it was even part of a student government campaign last year. So far, faculty members have been weighing the pros and cons of exposing their grades.
Fisher said that there has been more concern over publishing comments than over an average score compiled from evaluations. “One communications faculty member said there are studies that say negative comments linger longer than positive comments, and that a few bad comments could color the perception of that teacher disproportionately,” Fisher said. “If it’s numerical, [a small number of scores] would be dissipated.”
In the interest of generating better data, Northwestern last year began requiring students who want to view evaluations (of professors who have not asked to have them withheld) to first fill out evaluations for the courses they have completed. Fisher said the completion rate roughly doubled, to nearly 80 percent.
Thomas Bauman, professor of musicology and chair of the General Faculty Committee, said that faculty members are discussing whether there should be exceptions to any mandate, such as for adjuncts, or first-time teachers “who might feel they need a buffer of some kind."
Faculty reactions to public evaluations, beyond Northwestern, are as varied as the critiques they themselves get. The most public of all evaluation systems is RateMyProfessors.com, which lists over 700,000 professors and allows anyone with an e-mail address to post to their heart’s content. As students have become more demanding consumers of their education, various methods of assessing professors without ever meeting them have emerged -- some created by students, and others by colleges.
Some of the more controversial versions of course guides are founded and maintained by students. A group of students at Williams College started Factrak several years ago. Factrak is run by students and does not draw data from Williams’ official evaluations. Rather, students can post at their leisure, with both individual scores and comments, and only students can see the evaluations. “Originally, there were accusations that only extreme views were posted,” said Evan Miller, a Williams student who helped develop Factrak. In response to the criticism, Factrak instituted a system where students can “agree” with comments, “so you can see if a view is extreme if nobody agrees, or if a professor is universally hated.” All of the Williams students interviewed said they prefer the in-house system to RateMyProfessors.com, primarily because it verifies that users are students. Factrak has made some faculty blood boil, however.
Alan White, a philosophy professor at Williams College, railed against Factrak in an article he wrote in The Williams Record. In an interview, White said that Factrak “can’t possibly work well. There are either too few entries, or so many that nobody has time to look through them. Either way, you don’t get a representative sample.” White added that systems that give average scores generally do nothing to establish statistical significance, so the numbers should not pass peer review. “It’s bad information that looks good,” he said. “That’s the worst kind of information.” White said he would prefer a system that gives students contact information for other students who have taken a particular professor. “Then I could talk to someone I knew and trusted, a kindred spirit ... or at least I could get a feel for the person,” he said.
Even some of the student-run evaluation databases don’t verify that commenters are students. CULPA, the Columbia [University] Underground Listing of Professor Ability, was actually founded before RateMyProfessors.com, and apparently preceded it in allowing anyone, not just verified students, to post.
Washington University in St. Louis drives the Rolls-Royce of public evaluation systems. Beginning six years ago, Henry Biggs, director of undergraduate research, led a push to create a more informative evaluation process. “The questions being asked were not getting the required information,” Biggs said. “We had no way to know if a professor was struggling.” A faculty member from the anthropology department helped develop evaluation templates for different types of classes (lectures, seminars, labs, etc.), and then an online system was tested for over a year.
Professors have to petition if they want to withhold the evaluations from students, who can log in and see them. The student comments are not shown, but students can see a wealth of numerical data, including the number of respondents out of the total in a particular section, and scores relative to the department and to all other professors in the system. Biggs said that about three-quarters of students fill out the evaluations, which went fully online in fall 2004. Professors can also tailor several questions specifically to their course. “They can ask about a particular text,” Biggs said. He said that he was recently interviewed by a student asking why the Wash U. system is any better than RateMyProfessors.com. “I’ve had several students say they go there to choose a course, which is disastrous,” Biggs said. “We worked on ours for four or five years [before going public]. We didn’t just throw up a site that could do evaluations.”
When evaluations are credible and informative, some of RateMyProfessors.com’s harshest critics are all for them. “The Professor,” an anonymous professor from a southern college who started the Rate Your Students blog as a comeback to RateMyProfessors.com, said in an e-mail that “One part of me, however, still wants to invoke the professorial defense: ‘I know what I'm doing; students don't -- yet.’” But he added that “perhaps publicizing professors who weren't meeting student expectations would encourage professors to try and address their weaknesses.”