U.S. News & World Report will issue this year's rankings of graduate departments today, based in most cases entirely on a reputational survey, and various universities will promptly boast about being No. 1 in this discipline or that. But a new rankings system for Ph.D. programs also went live this week -- and it is a system in which no department can claim to be tops in anything.
The new system -- offered by PhDs.org -- takes a different approach: It provides free access to information about more than 5,000 programs at more than 400 universities. But the potential applicant has to decide how to weight the information. Are you more concerned about enrolling in a program with many minority or female students than you are in a program with low tuition? Are you more interested in the average time to finish a doctorate or the prestige of the faculty? Do you care more about the proportion of students who receive fellowships or the percentage who find a job after they earn degrees? Once an applicant rates the relative importance of these and other factors, the database produces a customized ranking of departments, indicating both a total ranking and how departments placed on each of the selected criteria.
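The weighted, user-customized ranking described above can be sketched in a few lines of code. This is a minimal illustration, not the actual PhDs.org implementation: the program names, metric values, and weights below are hypothetical, and the normalization scheme (scaling each metric to 0-1, inverting metrics where lower is better, then taking a weighted sum) is one plausible way such a system could work.

```python
# Hypothetical data: three programs, three metrics a user might care about.
programs = {
    "Program A": {"time_to_degree": 5.2, "placement_rate": 0.90, "pct_funded": 0.75},
    "Program B": {"time_to_degree": 6.8, "placement_rate": 0.95, "pct_funded": 0.60},
    "Program C": {"time_to_degree": 5.9, "placement_rate": 0.80, "pct_funded": 0.85},
}

# User-chosen importance weights; metrics where a smaller value is better
# are listed in lower_is_better so their normalized score is inverted.
weights = {"time_to_degree": 3, "placement_rate": 5, "pct_funded": 2}
lower_is_better = {"time_to_degree"}

def score(programs, weights):
    """Return programs sorted by weighted sum of min-max normalized metrics."""
    ranked = []
    for name, metrics in programs.items():
        total = 0.0
        for key, w in weights.items():
            values = [p[key] for p in programs.values()]
            lo, hi = min(values), max(values)
            # Scale this metric to [0, 1] across all programs.
            norm = (metrics[key] - lo) / (hi - lo) if hi > lo else 1.0
            if key in lower_is_better:
                norm = 1.0 - norm
            total += w * norm
        ranked.append((name, round(total, 2)))
    return sorted(ranked, key=lambda t: t[1], reverse=True)

for name, s in score(programs, weights):
    print(name, s)
```

With these particular weights, the program with the fastest time to degree and strong funding comes out on top; shift the weights and a different program leads, which is the point of the customized approach.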
The new ranking system was unveiled this week without much fanfare, and several experts on graduate admissions said that they were only now learning about it. But the data sources come from information universities report to the U.S. Education Department, to the annual Survey of Earned Doctorates, and to the National Research Council. Geoff Davis, a Ph.D. in mathematics who taught at Dartmouth College before becoming a software and education consultant, created the database, with advice from a team of experts on graduate education and support from some big name entities such as the Alfred P. Sloan Foundation and the Burroughs Wellcome Fund.
Some of the limitations of the project are obvious. The NRC rankings have plenty of critics and are only updated every decade or so -- a process going on now -- meaning that they are old and their methodology is about to change. In addition, data on degree completion and job placement are only available for groups of at least five, so the information isn't as complete for smaller departments.
On the other hand -- and this is relevant given the current furor over how U.S. News may handle situations in which colleges can't provide certain information used in its rankings -- these new rankings give users the choice of how to respond when some information isn't available for a department. The user can decide to use an average figure for the field being examined, to skip that data point and to rank without it, or to simply list separately those programs that don't have complete data -- but to still report data that are available.
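The three user-selectable responses to missing data described above -- substitute the field average, skip the metric entirely, or rank complete programs while listing incomplete ones separately -- can be sketched as follows. This is a hypothetical illustration of the idea, not the site's actual code; the program names and values are invented, and `None` stands in for an unreported figure.

```python
def handle_missing(programs, key, strategy):
    """Apply one of three strategies to a metric with missing values.

    Returns (usable_values, listed_separately): a dict of program -> value
    to rank on, and a list of programs reported separately without a rank.
    """
    known = {n: m[key] for n, m in programs.items() if m[key] is not None}
    missing = [n for n, m in programs.items() if m[key] is None]

    if strategy == "average":
        # Substitute the field-wide average for any missing value.
        avg = sum(known.values()) / len(known)
        return {**known, **{n: avg for n in missing}}, []
    if strategy == "skip":
        # Drop this metric from the ranking altogether.
        return {}, []
    if strategy == "separate":
        # Rank only complete programs; list the rest with their known data.
        return known, missing
    raise ValueError(f"unknown strategy: {strategy}")

# Hypothetical example: Program B did not report a placement rate.
programs = {
    "Program A": {"placement_rate": 0.90},
    "Program B": {"placement_rate": None},
    "Program C": {"placement_rate": 0.70},
}
```

For instance, `handle_missing(programs, "placement_rate", "average")` would assign Program B the field average of 0.80, while the `"separate"` strategy would rank only Programs A and C and report Program B alongside them without a rank.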
In an interview Thursday, Davis sounded a little bit like Margaret Spellings in describing why he wanted to create the new system. When he was looking at graduate schools, he said, he had primarily reputation to go on, but no "outcomes data" on what actually happened after graduate school. People need "realistic information about the kinds of careers that they will have," he said.
At the same time, Davis was a bit unlike the education secretary in that he insisted that it wasn't appropriate for anyone to declare that some measures should be used in evaluating programs. The whole point of his approach is that "people should go in with their own expectations of what they want to get out" of a graduate program. Factors that are important to some are irrelevant to others, he said.
"It doesn't make sense to call any department No. 1. They all have strengths and weaknesses," he said.
Davis did acknowledge that rankings can have an impact on the behavior of departments and institutions. But he said that, if his rankings take off, such influence would only be a good thing. Some of the factors in his database -- time to degree or job placement -- aren't widely known. If departments start to pay more attention to these factors because of concern over how they will look, that will only help those graduate students who enroll, Davis said.
He also acknowledged that rankings -- even customized ones like his -- can only go so far at the Ph.D. level, where a graduate student's experience may hinge to a large degree on one or two professors in a given specialty. "Nobody should take these rankings, or any rankings, as a be-all and end-all," he said. His approach -- by making it impossible for any program to claim the top spot -- is designed to reinforce that. "These should help people figure out which programs they need to really investigate, which programs they should visit," he said.
Joan F. Lorden, provost of the University of North Carolina at Charlotte, has written extensively on the limitations of the NRC rankings, and served on an advisory committee for the new system Davis developed. Lorden said that there are all kinds of problems with past systems, which tend to rely on reputation. "Those judgments can be based on 'halo effects' or misinformation or limited information," she said.
The new approach, she said, "is a really interesting tool" and one that is "empowering to students" by letting them decide what is most important for themselves. Lorden said she likes the way this system encourages students to consider and include a variety of factors in making a choice about which graduate programs to consider. "Anything that is driven totally by one variable just isn't going to be helpful," she said.
Robert J. Morse, director of data research for U.S. News, said that some people might find the new approach useful, but he questioned its value. Given the time lag in NRC rankings, he said, some of the statistics involved are likely to be 15 years old -- something his magazine would never tolerate.
Morse also noted that there are many departments for which the new system doesn't have information on job placement and other qualities that potential students would want. "I think it's going to be disappointing to people" who think they will have more data than actually exist.
As for the criticism that alternatives are needed to the reputation-based approach of U.S. News, Morse made no apologies for a focus on reputation. "The coin of the realm in academia is reputation and standing," he said. "So to act like that means nothing -- people are being blind to their own world."