Validation for RateMyProfessors

Study finds high correlation between reviews on the Web site professors love to hate and a formal evaluation system used by 275 colleges.
April 25, 2008

You've heard the reasons why professors don't trust RateMyProfessors, the Web site to which students flock. Students who don't do the work have equal say with those who do. The best way to get good ratings is to be relatively easy on grades, good looking, or both. And so forth.

But what if the much derided Web site's rankings have a high correlation with markers that are more widely accepted as measures of faculty performance? Last year, a scholarly study found a high correlation between RateMyProfessors and a university's own system of student evaluations. Now, a new study is finding a high correlation between RateMyProfessors and a student evaluation system used nationally.

The new study, about to appear in the journal Assessment & Evaluation in Higher Education, argues that the rankings on RateMyProfessors are similar to those produced by IDEA, a student evaluation system used at about 275 colleges nationally and run by a nonprofit group affiliated with Kansas State University.

What is notable is that while RateMyProfessors gives power to students, IDEA gives a lot of control over the process to faculty members. Professors identify the teaching objectives that are important to the class, and those are the measures that count the most. In addition, weighting is used so that adjustments are made for factors beyond professors' control, such as class size, student work habits and so forth -- all variables that RateMyProfessors doesn't really account for (or try to account for).

The study looked at the rankings of 126 professors at Lander University, in South Carolina, and compared the two ratings systems. The findings:

  • Student rankings on the ease of courses were consistent in both systems and correlated with grades.
  • Professors' rankings for "clarity" and "helpfulness" on RateMyProfessors correlated with overall rankings for course excellence on IDEA.
  • The similarities were such that, the journal article says, they offer "preliminary support for the validity of the evaluations on RateMyProfessors."
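The study's central statistic is this kind of correlation between two sets of per-professor scores. As a minimal sketch, the snippet below computes a Pearson correlation coefficient on hypothetical rating averages (the variable names and numbers are illustrative assumptions, not data from the Lander study):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance term and the two standard-deviation terms (unnormalized).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical per-professor averages: a RateMyProfessors "clarity" score
# (1-5) and an IDEA overall course-excellence score (1-5) for six professors.
rmp_clarity = [4.5, 3.2, 2.8, 4.9, 3.7, 4.1]
idea_overall = [4.3, 3.0, 3.1, 4.7, 3.5, 4.4]

print(round(pearson_r(rmp_clarity, idea_overall), 3))
```

A value near 1.0 would indicate the strong agreement between the two systems that the study reports; a value near 0 would indicate no relationship.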

The study was conducted by Michael E. Sonntag, who formerly taught at Lander and who is now vice president for academic affairs at the University of Maine at Presque Isle, and by two psychology professors at Lander, Jonathan F. Bassett and Timothy Snyder.

Sonntag said that there are two ways to read the results: one is to say that RateMyProfessors is as good as an educationally devised system; the other is to say that the latter is as poor as the former. But either way, he suggested, it should give pause to critics to know that the students' Web site "does correlate with a respected tool."

William H. Pallett, president of IDEA, said he was "surprised a bit" by the correlation between his organization's rankings and those of RateMyProfessors. That's because much of the criticism he has heard of the student-oriented site is that its rankings aren't representative, while much of the effort at IDEA is based on assuring representative samples.

"I am surprised, given that we do attend to issues of reliability and validity and they acknowledge that they don't," he said.

Pallett cautioned, however, that IDEA is not intended to be a sole basis for evaluating a course or professor. He said that he would always advise departments to have professors evaluate one another, and to use student evaluations as just one part of that review.

Sonntag said that his current institution uses a home-grown student evaluation system, and that he has no plans to seek a change to IDEA or RateMyProfessors -- and that the evaluation system is covered by a collective bargaining contract anyway. But he said that he hoped the study might prompt some to think about the online rankings in new ways.

For his part, Sonntag acknowledged that some reviews are "so mean-spirited" that they aren't worth anyone's time. But he said that if you cast those aside, there are valuable lessons to be learned. He said that he does check what the site says about his teaching -- and has found reinforcement for some innovations and reason to question whether some of his tests were too difficult.

"I've been an instructor for 10 years. I look at it," he said, adding that he has found insights "that weren't on my teaching evaluations and I have thought: 'Wow. I believe what the student has said is valid and perhaps I can change the way I teach.'"

