In my last blog, I noted my high regard for the ratings and, more importantly, the objectivity of Consumer Reports. But there are so many goods and services not ranked by Consumer Reports or any other objective judge that in many cases we are left to our own improvised rankings or, even worse, to questionable third-party judgments.
I have for many years used my own rating system for hotels. Within a particular star category, I judge a hotel by the orange juice available at breakfast. If the juice tastes like it comes from watered-down concentrate, I immediately downgrade the hotel; if the orange juice tastes fresh-squeezed, the hotel rises in my opinion. Bathrooms are also often good proxies for the quality of a hotel. On a recent trip, the bathroom provided was so small that even my 15-pound dog would have found the accommodations tight. Chocolate on the pillow, on the other hand, has turned out not to be a good proxy for hotel quality, though I do believe that providing chocolate mints is a good indication that the provider doesn’t understand or appreciate the richness and quality of chocolate.
In higher education we have many ratings of programs. Some use a variety of factors to make their judgments; others use very few. In a number of cases, the rankings are based solely on the opinions of peers, administrators as well as faculty. There are, without question, many knowledgeable administrators and faculty, but the system is still inherently and seriously flawed. In my years as dean, I could speak about a significant number of graduate and undergraduate programs from a solid knowledge base. What percentage of the total number of programs in the field does “significant” constitute? Realistically, 5-10%. For some deans, the percentage would be higher; for others, their knowledge base is even less reliable. As a long-serving provost who has done and continues to do accreditation visits, my knowledge base is even more limited given the much larger universe of schools. Without question, I have tidbits of information about many schools; rankings should not be based on tidbits.
Many of us already indicate the equivalent of “don’t know” for those schools or programs where our knowledge base is sketchy or worse. Others valiantly go through the entire list provided and check off a ranking for virtually every school listed. Please stop ranking any school or program that you don’t have detailed knowledge about. If the rankings are to have meaning, they must be grounded in fact, reality, and real knowledge. Absent that information, we should not respond. If all of us took the pledge to evaluate only when we have the facts, there would be far fewer responses to the ratings questionnaires but far more accurate and reliable information.