The results are in for U.S. News & World Report’s first attempt to rank online programs at colleges and universities, but the jury is still out on how well the publication managed to assess them.
The inaugural online rankings assessed bachelor’s degree programs, as well as master’s programs in business, engineering, nursing, computer and information technology, and education. The survey suffered from data collection woes that left much to be desired -- and much to criticize. Nevertheless, U.S. News has published its first attempt to render the nebulous cloud of online higher education as a series of hierarchies. Let the chest-thumping and methodological takedowns begin!
U.S. News scored the bachelor’s degree programs based on three categories: faculty credentials and training, student engagement and assessment practices, and student services and technology. Each category contained several weighted metrics. For example, the faculty category gave the greatest weight to programs whose instructors held terminal degrees and had two or more years of online teaching experience. (Institutionally financed training and peer-review mechanisms were good for some points, but not as many.)
The graduate programs were scored using similar categories, plus another aimed at measuring the rigor of the admissions process. (The undergraduate rankings did not factor in selectivity.) Mean undergraduate grade point averages and standardized entrance exam scores also counted, as did online graduate programs’ standing with relevant accrediting agencies. For example, 20 percent of a business program’s score for the “student engagement” section turned on whether it had passed muster with AACSB International: the Association to Advance Collegiate Schools of Business.
Still, for all that accounting, the publication struggled with some predictable yet dismaying obstacles. U.S. News, which has ranked brick-and-mortar institutions and their graduate programs for decades, found it challenging to collect data on fully online programs.
U.S. News cast a wide net when it set out to collect data for the rankings last July. It e-mailed surveys to thousands of institutions and got hundreds of responses, but the number of institutions that returned usable surveys was relatively small for each category: 184 for bachelor’s programs, 164 for business programs, 161 for education programs, 83 for nursing programs, 59 for engineering programs and 26 for computer and information technology programs.
The company’s scattershot approach might have hurt the response rate. Several institutions that did work through the survey said it was long and time-consuming.
Other data collection problems had to do with the “idiosyncrasies of online programs,” said Robert Morse, the director of data research for U.S. News.
The company used the Sloan Consortium’s definition of “fully online”: 80 percent or more of the educational experience happens on the Web. The problem is that many graduate programs have some fully online courses and some blended courses, and did not have separate data for the “fully online” pieces of the curriculum, said Eric Brooks, lead analyst for the rankings.
As a result, the U.S. News rubric for scoring and ranking the programs was built based on the data the publication's researchers managed to get, not what it would have liked to use in the best of circumstances, Morse said. Crucial metrics such as retention rates, graduation rates, learning outcomes, debt incursion and repayment, and success in the job market were left out of the equation because not enough institutions were willing or able to provide that information.
Vesting authority in data that institutions were most able and willing to supply is hardly a sound methodology, said Russell Poulin, deputy director for research and analysis at the WICHE Cooperative for Educational Technologies. Institutions suffering from “survey fatigue” might have been unfairly penalized, Poulin said.
Still, among the admittedly small pool of respondents, “We do think these rankings are proxies for program quality,” said Brooks. While metrics such as instructor qualifications and availability, class size, assessment mechanisms, anti-cheating vigilance, accreditation and selectivity might not indicate program quality as strongly as completion, learning outcomes and career success do, it is still plausible to think they are correlated, he said. And those “strong indicators,” while preferable for the purposes of ranking, have been difficult for anybody to pin down in any broadly comparable way.
Kevin Carey, the policy director for Education Sector, noted that the small number of institutions that managed to qualify for distinction nearly defeats the purpose of ranking them.
However, he still thinks the survey sends an important signal by asking the right questions about a segment of higher education that receives little scrutiny on academic quality. “I’m glad that U.S. News is doing this,” said Carey.
If nothing else, U.S. News might give some profile, on the basis of something resembling merit, to institutions that do not have millions of dollars to pour into marketing and advertising.
“It’s a little hard to get above the fray, because so many people are marketing their online programs,” said Mary Bonhomme, dean of online learning at the Florida Institute of Technology, which earned a place on the U.S. News “honor roll” for online bachelor’s programs.
Paul Fain contributed reporting.
For the latest technology news and opinion from Inside Higher Ed, follow @IHEtech on Twitter.