Let the Assessment PR Wars Begin
Among the many reasons critics have cited for opposing the overuse of nationally comparable measurements of student learning outcomes is that they will become yet another oversimplified way of comparing one college's performance against another. Just look, they say, at how quick many colleges are to boast about how they fare in the annual U.S. News & World Report rankings, even as many academic leaders bemoan the rankings' attempt to sum up institutional excellence in a few (flawed, they argue) statistics.
Might not the same thing happen, they wonder, if colleges are pressured to try to summarize their students' academic performance in a single number drawn from a single test?
The answer appears to be Yes.
The University of Nebraska at Omaha issued a news release on Thursday with the provocative headline "UNO First in U.S. for Value-Added Education." University officials based their assertion on their students' performance last year on the Collegiate Learning Assessment, a performance-based examination sponsored by the Council for Aid to Education. The annual report that Nebraska-Omaha officials received from the council informed the university that it "contributes more to the learning gains made by students than 100 percent of the 176 four-year undergraduate institutions participating in the 2007–2008 CLA."
That led to UNO's boast, in the news release, that it "contributes more to the learning gains made by students than any other institution that participated in a recent national examination, including schools such as Duke University, the University of Texas, the University of Michigan, University of North Carolina and Arizona State University."
The CLA, which was championed by the Secretary of Education's Commission on the Future of Higher Education and has been embraced by several major higher education groups as one of a new generation of tools for measuring student learning, is controversial for some of the same reasons that proponents find it attractive. First, it purports to measure certain kinds of learning -- general education skills such as critical thinking and problem solving -- that many traditional standardized tests do not.
Second, the way it is typically used -- administered to small (100-student) groups of freshmen and seniors -- aims to help colleges show the extent to which they have helped their students improve over the years, allowing them to show how much "added value" they have delivered over the course of students' careers at an institution.
Accountability advocates have encouraged the CLA's use because they hope it will become a nationally accepted way of helping students and families compare the success of different institutions in educating their students. Although the test's sponsors have argued that the CLA is designed primarily to inform colleges and universities about how they can improve their internal teaching and learning practices, CLA advocates have not aggressively opposed the examination's use as a nationally comparable accountability tool.
But some experts on assessment, and some institutional officials who have used the CLA and similar tools in recent years, have cautioned about the limitations of "value added" measurements, noting among other things that such tests tend to closely track the incoming credentials (SAT scores, etc.) of their students and that colleges may have perverse incentives to have their freshmen score poorly in order to make the performance of their seniors look especially good.
"A value-added score, calculated using the same methodology for all higher education institutions in America, would enable an institution with limited resources that admits students with very poor high school records and very low SAT scores but graduates students who have pretty good GRE scores (as an example of an exit exam) to get a 100 percent score because the improvement or value-added is large," John V. Lombardi, president of the Louisiana State University system, wrote in a 2006 essay for Inside Higher Ed.
"Colleges with superb facilities and resources that admit students with very high SAT scores and very fine high school preparation and graduate students with very good GRE scores could get a 50 percent score because the improvement measured by the tests would be modest (from terrific coming in to terrific going out). Then, in the national rankings, the first institution could claim to be a much better institution for improvement than the second one."
Nebraska-Omaha has done just that. The university's interim assistant vice president for academic affairs, Steve Bullock, said in an interview that the 225 randomly selected freshmen who took the CLA last fall performed worse than projected based on their average composite ACT score of 22, finishing in the 9th percentile of all test takers. The 98 seniors who took the CLA last spring, by contrast, outperformed their projection, scoring in the 86th percentile.
As a result, the "value added estimate" in Nebraska-Omaha's CLA report from the Council for Aid to Education placed the university in the 100th percentile of the 176 participating colleges for the gap between its freshmen's and seniors' performance.
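The arithmetic behind that kind of report can be illustrated with a minimal sketch. All institution names, scores, and the simple actual-minus-predicted residual below are hypothetical illustrations of the general idea, not the Council for Aid to Education's actual statistical model:

```python
# Hypothetical sketch of a "value added" ranking: value added is treated
# as how far an institution's seniors outperform the score predicted
# from their entering credentials (ACT/SAT scores, etc.).

def value_added(actual_senior_score, predicted_senior_score):
    """Residual gain: actual performance minus predicted performance."""
    return actual_senior_score - predicted_senior_score

# Hypothetical institutions: (name, predicted score, actual score).
# Echoes Lombardi's example: modest gain at a selective school,
# large gain at an open-access one.
institutions = [
    ("Selective U", 1300, 1310),    # strong intake, small gain
    ("Open-Access U", 1000, 1120),  # weaker intake, large gain
    ("State U", 1150, 1180),
]

residuals = {name: value_added(actual, predicted)
             for name, predicted, actual in institutions}

def percentile(name):
    """Share of the other institutions this one outperforms (0-100)."""
    others = [r for n, r in residuals.items() if n != name]
    return 100 * sum(residuals[name] > r for r in others) / len(others)

for name in residuals:
    print(name, residuals[name], percentile(name))
```

Under this toy model the open-access school lands in the 100th percentile and the selective school at the bottom, even though the selective school's seniors score higher in absolute terms -- exactly the inversion Lombardi describes.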
Bullock said he and his colleagues at UNO know that the CLA results are "not a definitive measurement of how we're doing, and that we see it as a means to spark further inquiry into what we're doing." Officials said they would use the results to explore why the freshmen the university admits appear to be underperforming compared to their academic credentials.
But Nebraska-Omaha officials also decided they could not pass up the opportunity to do a little boasting about scoring so highly on the CLA's value added scale, Bullock said. "Our administrative leadership felt it was important to get this sort of information out," he said. "We haven't done a very good job historically of promoting what we do at the institution, and it's pretty rare that an institution like ours gets to be No. 1. Of course, the CLA is skewed in favor of schools like ours where students come in and have some room for growth, whereas Harvard or Yale, regardless of what you do, have very good students coming in and they're going to be smart when they leave."
Marketing or Misuse?
Nebraska-Omaha's boasting about its CLA scores did not surprise some college leaders and assessment experts, given the sort of self-promotion that many colleges do about their finish in the U.S. News rankings. Richard Ekman, president of the Council of Independent Colleges, said in an e-mail message that his association's Collegiate Learning Assessment Consortium has focused "on getting colleges to use the CLA to improve their teaching and learning effectiveness," and to "trust one another enough to share results and learn from one another about uses of the CLA to diagnose weaknesses and find solutions."
But "we fully expect a college that happens to score very well to boast about it," Ekman said. "That's what UNO is doing, and I don't fault the university for playing up a distinctive characteristic. UNO may be pushing it a little far -- to imply that if you enroll there, you're guaranteed to learn more than at ABC University -- but that's the admissions game these days."
Carol Schneider, president of the Association of American Colleges and Universities, said in an e-mail message that her group's Liberal Education and America's Promise and Valid Assessment of Learning in Undergraduate Education programs have encouraged colleges to use a broad array of learning measures, including electronic portfolios, to "guide all students, not just a tiny sample, to show that they have achieved the most important outcomes of college." The CLA, "like any standardized test, can be part of a comprehensive strategy for tracking outcomes, but it is much too narrow and much too generic to become a single arbiter for educational quality," Schneider said. "I worry that the CLA will become our apple in the garden of assessment -- tempting us to settle for simple metrics when we all know very well that learning is multi-faceted and -- at the most advanced level -- grounded in the students' actual field of study."
Although the Council for Aid to Education may have encouraged Nebraska-Omaha's crowing by informing the university that it finished in the 100th percentile of participating colleges, council officials bristled at UNO's news release.
"We strongly discourage schools from doing things like this," Roger Benjamin, president of the Council for Aid to Education, said in an e-mail message. "Ranking schools is not our goal," added Jim Hundley, the council's executive vice president and chief operating officer. "We're not trying to identify who's the best in the nation. What we encourage schools to do is to use their results as a signal and use it with other data, including testing other than the CLA. But how people use the results is not something we have control over."
Given America's love for rankings and the thirst for readily comparable (and sometimes oversimplified) measures of quality, however, Nebraska-Omaha's use of the CLA could be the next battleground in the college rankings competition.
"The danger would come if a federal official (or a USA Today) were to read the release from UNO and get the bright idea to require all colleges to reveal their scores and to create a single ranked list that was presented as if it were the only measure of quality," said Ekman, of the independent college council. "That's what's so troublesome about the U.S. News rankings.
"So, I'd say to UNO: use what you've got; but recognize that you may be encouraging Big Brother."