Assessing Learning Outcomes
“There is inadequate transparency and accountability for measuring institutional performance, which is more and more necessary to maintaining public trust in higher education.”

“Too many decisions about higher education — from those made by policymakers to those made by students and families — rely heavily on reputation and rankings derived to a large extent from inputs such as financial resources rather than outcomes.”
Those are the words of the Secretary of Education's Commission on the Future of Higher Education, which on Tuesday handed over its final report to Secretary Margaret Spellings.
Less than a week before Spellings announces her plans to carry out the commission's report, a panel of higher education experts met in Washington on Wednesday to discuss how colleges and universities report their learning outcomes now and the reasons why the public often misses out on this information. On this subject, the panelists' comments fell largely in line with those of the federal commission.
The session, hosted by the Hechinger Institute on Education and the Media at Columbia University's Teachers College, included an assessment of U.S. News & World Report's annual college rankings, which critics say provide too little information about where students learn best.
“The game isn’t about rankings and who’s No. 1,” said W. Robert Connor, president of the Teagle Foundation, a group that has sponsored a series of grants in "value added assessment," intended to measure what students learn in college. Connor said colleges should be graded on a pass/fail basis, based on whether they keep track of learning outcomes and whether they tell the public how they are doing.
“We don’t need a matrix of facets summed up in a single score,” added David Shulenburger, vice president of academic affairs for the National Association of State Universities and Land-Grant Colleges.
What students, parents, college counselors and legislators need is a variety of measuring sticks, panelists said. Still, none of the speakers recommended that colleges refuse to participate in the magazine's rankings, or that the rankings go away.
"It's fine that they are out there," said Richard Ekman, president of the Council of Independent Colleges. "Even if it's flawed, it's one measure."
Ekman said the Collegiate Learning Assessment, which measures educational gains made from a student's freshman to senior year, and the National Survey of Student Engagement, which gauges student satisfaction on particular campuses, are both part of the full story. (Many institutions participate in the student engagement survey, but relatively few of them make their scores public.) Ekman said there's no use in waiting until the "perfect" assessment measure is identified to start using what's already available.
Still, Ekman said he is "wary about making anything mandatory," and doesn't support any government involvement in this area. He added that only a small percentage of his constituents use the CLA. (Some are hesitant because of the price, he said.)
Shulenburger plugged a yet-to-be-completed index of college performance, called the Voluntary System of Accountability, that will compile information including price, living arrangements, graduation rates and curriculums.
Ross Miller of the Association of American Colleges & Universities said he would like to see an organization compile a list of questions that parents and students can ask themselves when searching for a college. He said this would serve consumers better than even the most comprehensive ranking system.
The Spellings commission recommended the creation of an information database and a search engine that would allow students and policymakers to weigh comparative institutional performance.
Miller also said he would like to see more academic departments publish on their Web sites examples of student work so that applicants can gauge the nature and quality of the work they would be doing.