
Students at the University of Minnesota have for years called for access to the course evaluations they fill out at the end of each course, saying they have a right to know what peers have thought of the classes they’re considering. Now they might get their wish – at least part of it. The University Senate is considering a proposal to make student feedback about courses public. But student responses to questions specifically about professors would remain private, in accordance with state privacy laws for employees.

The university hopes the data will help students make more informed class selections, and offer more comprehensive and relevant information than that which is currently available on student-driven feedback sites, such as RateMyProfessors.com (the Minnesota evaluations don’t have a question on instructor “hotness,” for example).

"We think this is an excellent step forward in providing students quality information,” said Robert McMaster, vice provost and dean of undergraduate education. “I’m not going to say anything negative about RateMyProfessors.com, but that’s a much different kind of thing than this rigorous, standardized approach.”

And while McMaster politely avoided the question of RateMyProfessors.com, faculty members and academic leaders at many campuses hate the site as much as some students love it. (Not all students love it, either: Minnesota's own student government, in a recent position statement asking for more transparency in official student feedback, said third-party sites contain "polarized" and "unverifiable" data.) But at many campuses, the professors' and administrators' dislike for the site hasn't translated into giving students something more educationally meaningful to consider when evaluating potential courses.

In an email, Carlo DiMarco, senior vice president of strategic partnerships for MTV, which owns RateMyProfessors.com, defended the site's worth.

"What we love about RateMyProfessors is that it is 100 percent driven by college students," DiMarco said. "Each year, millions of students use the site to help plan their class schedules, making it a uniquely valuable resource for them. All of the praise and critiques that professors receive on the site come directly from students, which means our site does what students have been doing forever: checking in with each other — their friends, their brothers, their sisters — to figure out who’s a great professor." (DiMarco also noted that the website's "easiness" and "hotness" ratings do not factor into the overall quality rating, and that the site employs a third party to vet comments and ratings to ensure reliability.)
 
Under the Minnesota proposal, student evaluations would have 11 questions. The first half would relate to the professor specifically, such as whether he or she was clear or prepared for class. Those would remain a private part of the instructor’s personnel file, in line with the Minnesota Data Practices Act, which prohibits the release of information about specific public employees. But the second half of the questions would relate to the course itself -- Was my interest in the subject matter stimulated by this course? Would I recommend this course to other students? -- and go live in a university database starting this fall. Although the vast majority of courses are offered by one professor, McMaster said those with multiple sections would be coded by instructor.
 
One common criticism of student-driven feedback sites is that response rates tend to be low and that only students with something to complain about are driven to comment. Some professors have the same fear about their own institutions' evaluations when those are offered only online. McMaster said the new system can't guarantee a high response rate, but professors may distribute the evaluation forms on paper in class or offer them online. Valkyrie Jensen, an officer in the Minnesota Student Association and co-author of the recent position statement, said everyone typically fills out student evaluations in her classes. "It's a chance to have your voice heard, regardless of a positive or negative impression, and most students appreciate the opportunity," she said.
 
The university previously created a way for professors to elect to make some evaluation data public to students (the Dartmouth College faculty is currently considering a similar "opt-in" release option), but fewer than 10 percent did – in part simply because it was an extra hurdle for them to clear, McMaster said. So Minnesota thought about changing the default status of evaluations to public, giving professors the opportunity to opt out instead of in. But that didn’t pass muster with the university’s legal department, given state restrictions on releasing information about employee performance.
 
The proposal, which is up for a vote in the universitywide senate next month, is a kind of “compromise” between what students want and what state law allows, and faculty members appear generally to be on board, said Joseph A. Konstan, chair of the department of computer science and engineering as well as of the senate’s Faculty Affairs Committee.

“I think it’s a good compromise given the constraints that we’re under here,” he said. “I haven’t heard much in the way of complaints about it, much in the way of advance concerns, as with other proposals in the past.”

Konstan said he didn’t have a problem with evaluations that rate faculty members by name, but, like McMaster, he said that the new process would still be a more reliable alternative to third-party feedback sites.

He added that “anything can happen when you get a large group of faculty together,” but that he believes the motion will pass. Konstan acknowledged general faculty fears about student evaluators – that students might equate an easy course with a good course, for example – but said that aggregate evaluative data, as would be included in the new student resource, tends to paint a reliable picture of a course.

Excluding inappropriate or discriminatory comments, Konstan said, “My experience has been that when students say things consistently, there’s usually something behind them.” (Interestingly, a 2011 study from the University of Wisconsin at Eau Claire suggested something similar in relation to RateMyProfessors.com. The study found that 10 reviews showed about the same consensus about a professor as did 50 or more reviews, a much larger sample. The study also found that the site's users are "likely providing each other useful information about quality of instruction.")

More than that, he said, professors “can’t consistently [get] high scores by being easy. Students know when they’re not learning much.”

McMaster said the same about a public question about how many hours per week students spent preparing for the course. Rather than opting for the smallest workload, he said he believed students would use the information to fill out their schedules. A course that required 10 hours of work per week might not be right for a semester schedule that also included heavy-hitters such as chemistry and economics, for example, he said.

Students support the plan, although they’ve advocated for the Minnesota state legislature to amend the law so that even more information can be made public.

A recent editorial in the student newspaper, the Minnesota Daily, endorsed a motion by the Minnesota Student Association to make the student course evaluation process more transparent, and advocated changing the law to allow for the release of instructor-specific data without an instructor’s explicit permission.

“We would like to see lawmakers and university leadership change this law,” the editorial says. “Doing so would allow for more transparency without making access to teacher evaluations easily available to those without a serious reason to look at them.”

It continues: “Teacher evaluations are vital to the University’s ability to remain a quality and competitive institution.”

Making course evaluation data available to students isn’t unique to the University of Minnesota. Yale University, for example, has made data about courses and instruction available to students for years through its website -- although two students recently accused it of forcing them to shut down a shadow site that they said made it easier to compare courses by ratings. The American Association of University Professors picked up on the story, writing about it on its blog, “Academe.” A university spokesman said that Yale acted because the students were "scraping data" from a Yale site and modifying course evaluation information in a way that was objectionable to the faculty, and that earlier attempts to reach a resolution with the students were met with delay.

John K. Wilson, co-editor of the American Association of University Professors’ "Academe" blog and author of the Yale post, said he’s personally in favor of offering students more information, not less.

“A classroom is not a secret space that must be kept in the shadows,” he said. “Students already have informal networks for sharing this information. Why not give them better data?”
