College rankings, such as those published annually by U.S. News & World Report, are typically thought to factor into a consumer's perception of an institution. Plenty has been written about how high school students, parents and college counselors respond to the list. But what about state legislatures?
In the first draft of a working paper, two economists argue that for public institutions, a correlation exists between inclusion in the U.S. News rankings and a rise in state funding per student. "The Power of Information: How Do U.S. News Rankings Affect the Financial Resources of Public Colleges?," available through the National Bureau of Economic Research, is intended to offer a snapshot of how news media coverage of higher education influences public policy.
Using information primarily provided by the Integrated Postsecondary Education Data System (IPEDS), and looking at data from 1987 to 1995, the researchers considered 433 public four-year colleges and whether each was included in or excluded from the U.S. News rankings in each year. In 1990, the publication extended coverage of its "America's Best Colleges" issue to include national and regional colleges that hadn't made previous lists.
The report classifies colleges into one of three categories: those that were in the rankings before 1990 ("previous-ins"), those that were first added in 1990 ("added-ins") and those that had never appeared before 1995 ("never-ins"), when the rankings expanded again.
Because U.S. News provided separate rankings for national and regional colleges ("national" is defined by Carnegie Classification), "previous-ins" include both national top-25 and top regional colleges. By comparison, "added-ins" are colleges that were deemed to be of national quality but that did not reach the top 25 before 1990.
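To make the grouping concrete, here is a minimal Python sketch of how such a classification might be coded. The function name, inputs and cutoffs are illustrative assumptions; the article does not describe the paper's exact rules.

```python
# Illustrative only: assumes each college comes with the list of years in
# which it appeared in the U.S. News rankings.
def classify(ranked_years, expansion_year=1990, next_expansion=1995):
    """Assign a college to one of the paper's three groups."""
    ranked_before = any(year < expansion_year for year in ranked_years)
    ranked_after = any(expansion_year <= year < next_expansion for year in ranked_years)
    if ranked_before:
        return "previous-in"   # already ranked before the 1990 expansion
    if ranked_after:
        return "added-in"      # first appeared in the expanded 1990 lists
    return "never-in"          # never ranked before the 1995 expansion

print(classify([1988, 1990]))  # previous-in
print(classify([1990, 1992]))  # added-in
print(classify([]))            # never-in
```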
In 1987, state funding per student was on average higher for the "added-ins" than for the other two categories. That trend line continued in 1995. The researchers found that state funding increased an average of 58 percent from 1987 to 1995 for colleges that first appeared in the rankings in 1990. By comparison, state funding increased 49 percent for colleges that were never ranked and 48 percent for those already on the list.
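The growth comparison itself is simple arithmetic, as the sketch below shows. The dollar figures are made up purely for illustration and chosen so the computed percentages match those reported; the article does not give the underlying amounts.

```python
# Hypothetical per-student state funding levels (dollars); only the resulting
# growth percentages (58%, 49%, 48%) come from the paper.
funding = {
    "added-in":    {1987: 5_000, 1995: 7_900},
    "never-in":    {1987: 4_000, 1995: 5_960},
    "previous-in": {1987: 4_500, 1995: 6_660},
}

for group, levels in funding.items():
    growth = (levels[1995] - levels[1987]) / levels[1987] * 100
    print(f"{group}: state funding per student grew {growth:.0f}% from 1987 to 1995")
```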
The report says that the increase in state expenditures attributable to the U.S. News exposure amounted to 6.5 percent per student, while the exposure did not appear to affect the institutions' tuition rates. States with the largest pre-college-age populations, highest voter turnout and highest U.S. News newsstand sales saw the greatest rises in state appropriations, according to the paper.
"The idea is that you have this information out there that is read by the whole public," said one of the paper's co-authors, Ginger Zhe Jin, an assistant professor of economics at the University of Maryland at College Park. "What we see in the data is that states where people are more actively looking at the rankings are the ones where the funding increase follows." (Alex Whalley, an assistant professor of economics at the University of California at Merced, is the other co-author.)
The authors found that colleges first added to the rankings in 1990 directed the additional state funding toward scholarships and instruction, and less toward research. Jin said one explanation is that the algorithm used by U.S. News has inputs that emphasize the quality of incoming students, which would motivate legislators to use scholarships to woo the best students. It also takes into account faculty-to-student ratios, which would explain the increase in instructional spending, Jin said.
For the "added-in" group, private funding attributable to the U.S. News exposure decreased by an average of 11 percent from 1987 to 1995. The report says that when a college appears in the ranking, donors might figure that state expenditures will increase and thus not give. Conversely, colleges might reduce their marketing efforts to potential donors, according to the report.
Jin said she is confident in the cause-and-effect relationship identified in the paper because the research compares the "added-in" group against the control groups (those characterized as "previous-ins" and "never-ins").
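One way to read that comparison is as a back-of-envelope gap between the treated group's funding growth and the average growth of the two control groups, as in the sketch below. This framing is an assumption, not the paper's actual estimation method, and the raw gap it yields differs from the regression-based 6.5 percent figure reported above.

```python
# Growth rates taken from the figures reported earlier in the article.
growth = {"added-in": 0.58, "previous-in": 0.48, "never-in": 0.49}

control_avg = (growth["previous-in"] + growth["never-in"]) / 2
gap = growth["added-in"] - control_avg
print(f"Added-in growth minus control-group average: {gap:+.1%}")  # about +9.5 points
```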
Jin said that because the paper looks at only one time period, she isn't able to conclude that a college that moves up in the rankings these days should expect to see more state funding. "I can see arguments going both ways," she said.