Is the College Scorecard Misleading Students?

By failing to account for where graduates live, its earnings information is flawed, argues Doug Williams.

April 12, 2018

President Barack Obama, in his 2013 State of the Union Address, boldly claimed that the College Scorecard -- the now three-year-old U.S. Department of Education internet portal -- would help parents and students “compare schools based on a simple criteria -- where you can get the most for your additional educational buck.” Fundamentally, what you get for your educational dollar depends on what price you pay and the earnings you receive.

The Scorecard’s net price information (i.e., the price net of aid) is a significant step forward in helping families understand what they have to pay. In contrast, the Scorecard earnings information -- median graduate earnings 10 years after enrolling -- is seriously flawed. Many of those shortcomings have been discussed at length elsewhere, and I won’t repeat them here. Instead, I will focus on a crucial deficiency that has received scant attention: the Scorecard earnings measure fails to account for where graduates live.

Cost-of-living indices show a wide variation in living expenses across cities and, hence, a wide discrepancy in the purchasing power of a dollar. Consider a college graduate in Nashville, Tenn., who earns $100,000. To achieve an equivalent standard of living in New York, the same worker needs almost $250,000. In San Francisco, $199,000. In Boston, $153,000. In Cleveland, $105,000. Generally, the cost of living is higher and the purchasing power of a dollar is lower on both coasts compared to the middle of the country.

To compensate for a higher cost of living, employers in more expensive cities pay higher salaries. Simple tabulations from 2016 Census data show that workers with a bachelor’s degree earn more in each of the cities above than in Nashville -- $43,000 in Nashville, for example, versus $61,000 in Boston.

Since the College Scorecard reports the earnings of graduates but makes no adjustment for where those graduates live, it ends up providing misleading information about how successful the graduates of different colleges are in achieving a good standard of living. To illustrate why, consider hypothetical Colleges X and Y. The graduates of both colleges have, in the cities they settle in, identical earnings on average -- say, an average of $100,000 in Nashville and $150,000 in Boston. But suppose that 90 percent of College X graduates locate in Nashville and 10 percent locate in Boston, while 90 percent of College Y graduates locate in Boston and 10 percent in Nashville. The Scorecard average earnings will be $105,000 for College X and $145,000 for College Y. College Y appears to be a much better investment, but the reported difference is illusory.
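The arithmetic behind the College X and College Y example can be sketched in a few lines. All figures are the hypothetical ones from the text; the function name and structure are my own illustration, not the Scorecard’s actual methodology.

```python
# Hypothetical figures from the College X / College Y example:
# graduates of both colleges earn the same amount in each city.
city_avg = {"Nashville": 100_000, "Boston": 150_000}

def scorecard_average(shares):
    """Unadjusted average earnings, weighted only by where graduates settle.

    shares: mapping of city -> fraction of the college's graduates there.
    """
    return sum(frac * city_avg[city] for city, frac in shares.items())

college_x = {"Nashville": 0.9, "Boston": 0.1}  # mostly Nashville
college_y = {"Nashville": 0.1, "Boston": 0.9}  # mostly Boston

print(round(scorecard_average(college_x)))  # 105000
print(round(scorecard_average(college_y)))  # 145000
```

The $40,000 gap comes entirely from where graduates live, not from any difference in what the colleges deliver, since the city-level earnings are identical for both.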

My example is extreme, but the distortion is very real. Graduates of some colleges are far more likely to settle in expensive, usually coastal, cities than graduates of others -- not because of differences in college quality but simply because of geographic location. We can see this with LinkedIn’s Alumni Tool, which I used to tabulate the percentage of graduates living in the 10 most expensive cities for several highly selective colleges and universities: 80 percent for Fordham University, 77 percent for the University of San Diego, 68 percent for Trinity College, 51 percent for Dickinson College, 21 percent for the University of the South and 14 percent for Centre College. It should be obvious that taking cost-of-living differences into account would radically change the rankings and the earnings differences between institutions. That is true not only for institutions with similar selectivity but also for those across selectivity categories.

The bias in the Scorecard earnings data has spilled over into a number of rankings by reputable publications that factor in Scorecard earnings. Few, if any, of them adjust earnings for purchasing power in a meaningful way. As a result, colleges sending a high fraction of graduates to either coast get a substantial, unmerited boost upward in the rankings.

There are, of course, many complaints about rankings methodology. But to emphasize how misleading these particular rankings are, consider a hypothetical, analogous ranking of “Best Cities to Work and Live” that factored in earnings but not the cost of living. No one would take such a ranking seriously, because it would be highly deceptive as a guide to choosing where to live. That is why city rankings usually take the cost of living into account.

The good news is that we have a ready solution at hand. The Department of Education should use the residence data available on graduate tax returns and adjust the Scorecard earnings for cost-of-living differences using standard methods. Those adjusted earnings would provide much better information for prospective students and their families about “where you can get the most for your educational buck.”
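The adjustment itself is mechanically simple. A minimal sketch, assuming graduates’ cities are known from tax-return residence data: deflate each graduate’s nominal earnings by a cost-of-living index before summarizing. The index values below are illustrative, not official figures, and I use a mean rather than the Scorecard’s median purely to keep the sketch short.

```python
# Illustrative cost-of-living indices, normalized so 1.0 is the
# national average. These numbers are made up for the sketch.
col_index = {"Nashville": 0.95, "New York": 1.35, "San Francisco": 1.30}

def adjusted_mean_earnings(graduates):
    """Average earnings after deflating by each graduate's local index.

    graduates: list of (city, nominal_earnings) pairs, e.g. from tax records.
    """
    deflated = [earnings / col_index[city] for city, earnings in graduates]
    return sum(deflated) / len(deflated)

grads = [("Nashville", 95_000), ("New York", 135_000)]
print(round(adjusted_mean_earnings(grads)))  # 100000
```

Under this adjustment, a $95,000 salary in Nashville and a $135,000 salary in New York count as the same standard of living, which is exactly the comparison the unadjusted Scorecard figure obscures.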


Doug Williams is the vice president for finance and the Frank W. Wilson Professor of Economics at the University of the South.

