There’s a lot of talk these days inside the D.C. Beltway about “fake news.” While the term itself is not new (Merriam-Webster says it has been around for more than 100 years), fake news has become enough of a fixture in the lexicon that last month Dictionary.com announced that it will include a definition in its next update.
The list of media outlets accused by the Trump administration of being purveyors of fake news is an impressive club, including The New York Times, CNN, The Washington Post and NBC. But, interestingly enough, one media giant headquartered in the nation’s capital has avoided scrutiny.
Until now, that is. It’s time to ask whether the U.S. News & World Report college rankings should be considered fake news.
What you consider fake news depends on your definition. For certain individuals, unflattering news reports from reputable news organizations constitute fake news, even when true. A different kind of fake news comes from internet sites devoted to posting false, sensational news stories such as Pizzagate, in which Hillary Clinton was alleged to be running a child-sex ring out of a D.C. pizza parlor. Shows like Entertainment Tonight thrive on another kind of fake news, a menu of gossip and entertainment industry self-promotion under the guise of news.
The U.S. News college rankings don’t fit any of those categories. The argument for the rankings being fake is that U.S. News is not reporting news but rather creating it. On the day I flew to Boston for the National Association for College Admission Counseling conference, my local newspaper ran its annual story about the appearance of the rankings, assigning significance to a two- or three-spot improvement or decline in some institution’s ranking. But is a small change in ranking actually significant? And if the significance is fake, doesn’t that make the news fake as well?
I am not among those in our profession who see the U.S. News rankings as ground zero for all that is wrong with college admission. At the same time, I wouldn’t characterize them as a positive or even neutral influence. Much of the admissions gamesmanship practiced by colleges is ultimately gamesmanship of the rankings.
Last month Politico published an interesting, in-depth article accusing the U.S. News college rankings of promoting economic inequality on college campuses. The article drew on a recent report from the Equality of Opportunity Project showing that many of the nation’s top colleges admit more students from the top 1 percent of earners than from the bottom 60 percent, and the list of those colleges closely matches the colleges that U.S. News ranks highest.
The Politico article argues that many of U.S. News’s measures of educational quality are actually measures of affluence, both among the institutions themselves and the students they enroll. That is not a new insight, but Politico does a great job of connecting the dots and showing how some of the criteria used by U.S. News may lead colleges motivated by rankings to develop admission policies that disadvantage students who are already disadvantaged socioeconomically.
Here are some examples:
- Student selectivity, as measured by standardized test scores, class rank and admit rate. We know there is a strong correlation between family income and test scores, with students in the top income bracket scoring much better, on average, than those in the bottom income bracket. The ranking emphasis on test scores is an incentive for colleges to admit students from privileged backgrounds or to use merit aid to buy students with strong scores. Colleges have learned how to lower admit rates by increased use of early decision, and that hurts students who can’t apply early because they need to compare financial aid offers.
- Faculty and student resources. In the Politico article, F. King Alexander, the president of the Louisiana State University system, states that the key to improving an institution’s ranking is higher faculty salaries and per-student spending. He points out that this drives up tuition, which may discourage low-income students from applying. Carol Christ, chancellor of the University of California, Berkeley, adds that the easiest way to find resources for faculty and student spending is to spend less on financial aid by admitting affluent students.
- Alumni giving rate. U.S. News pitches this as a proxy for student satisfaction, arguing that alumni who felt they received a good education are likely to give generously. That may or may not be true. The easiest way to encourage alumni giving, though, is admission preference for children of alumni.
Politico contrasts the ranking experience of two institutions, Southern Methodist University and Georgia State University. Like a number of other institutions, SMU has built a strategic plan around improving its ranking and raised $1 billion toward goals such as higher faculty salaries and higher average SAT scores; its ranking has climbed 11 places since 2008.
At the same time, Georgia State has been a model for encouraging socioeconomic diversity, enrolling three times more Pell Grant recipients than all eight Ivies combined. It made a conscious decision to focus admission decisions on high school grades rather than test scores. Its average SAT score has dropped 33 points, and its ranking has dropped 30 places in five years. Most of us would agree that higher education has a responsibility to address economic inequality, and yet the U.S. News formula penalizes colleges for living up to that responsibility.
My issues with U.S. News are more philosophical in nature, having to do with the assumptions it makes and the messages it sends about college admission.
I think there’s actually a lot of valuable information to be gleaned from the data that make up the rankings. I wish that U.S. News would present each category (alumni giving, average class size, per-student spending, etc.) as discrete data, with information about the average and 25th/75th percentile figures, plus an explanation of what each category means when it comes to evaluating a college or university. Where U.S. News errs is in combining and weighting various factors to produce a comprehensive ranking, suggesting a precision that is unjustified, or even fake. I recognize that rankings sell, but does fake precision equal fake news?
The rankings also assume that the value of college consists in the name brand on the diploma rather than the college experience. That is because the rankings don’t, and can’t, measure the quality of the educational experience a student has. That experience is subjective, personal and life-changing, and it is not measurable by the kinds of input factors relied on by U.S. News.
Attempting to rank colleges without measuring educational experience is like ranking America’s best churches without taking into account spirituality. If that’s not fake news, I don’t know what is.