The Voluntary Framework of Accountability, a project that aims to create national metrics gauging how well two-year institutions serve their students and fulfill their assorted missions, unveiled stage one of its proposed measures for pilot testing last week.
Formally introduced two years ago, the VFA is managed by the American Association of Community Colleges and funded by the Bill & Melinda Gates Foundation and the Lumina Foundation for Education. The project has attracted the attention of educators who have long been critical of the federal government's main method of judging community colleges: the three-year graduation rate of first-time, full-time students. Still, some educators have been leery of the project, given the wide range of community colleges' missions, demographics and funding formulas across the country.
Measure for Measure
Last week, the VFA announced 40 pilot institutions that will test its custom metrics. The community college testers are located in 29 states and include 37 individual institutions, two statewide systems and one multi-college district. AACC recently published the draft technical manual that these pilot institutions will use to collect data, so that other institutions can calculate their own proposed VFA outcomes and submit critiques of them.
Prong one of the proposed data to be collected looks at an institution’s “college readiness measures,” such as the proportion of students who complete all developmental education; “progress measures,” such as the percentage of students retained from one semester to the next; and “outcomes and success measures.” The last of these will count, separately, students who earn an associate degree, certificate or other credential; those who transfer to a four-year institution without a degree or credential; and those who leave in good or bad academic standing.
Prong two of the proposed data to be collected focuses on an institution’s “workforce, economic and community development measures.” The data points include “career and technical education measures,” such as the median wage growth of students who complete a program; “non-credit courses,” such as the number of state- and industry-recognized credentials awarded; and “adult basic education/GED measures,” such as the proportion of students who complete a GED, enroll in more postsecondary education or gain employment.
The final, and most debated, prong of the proposed data to be collected aims to assess student learning outcomes. Just how the VFA will make this measurement is still being determined; however, the framework does provide a list of “existing national benchmarked measures of student learning outcomes” that pilot institutions may use, including the Community College Survey of Student Engagement, among others. The VFA is also challenging “the field” to develop student learning outcomes assessments that “meet criteria of relevance to community colleges and their unique student populations.”
The VFA, according to its organizers, is a work in progress. And they argue that this pilot testing will help highlight which measurements are most fair and practical in judging community college success.
Kent Phillippe, AACC's senior research associate, stressed that “these are not the final measures by any stretch of the imagination.” He admitted that some of the measurements being required are not commonly used or may not even be calculable yet.
“The conversation we had was, ‘Let’s define things that are important to community colleges, even if we don’t have the data to support those things yet,’ ” Phillippe said. “Still, we want to get a sense of, can they collect it? Then, if they think it’s important, we’ll put placeholder measures out there until we can get that data made available.”
Addressing some of the early criticism the voluntary framework has received, Phillippe said, “We’re not trying to recreate the [Voluntary System of Accountability],” a joint effort of the two main associations of public four-year colleges and universities: the Association of Public and Land-grant Universities and the American Association of State Colleges and Universities.
“For example, we’re looking at career and technical education,” Phillippe argued. “We’re looking at progress measures and not just at outcome measures. We’re focusing on developmental education. We’re very different from the VSA, which is very much about the endpoint. Also, we’re still finding ways to look at student learning outcomes in a constructive way.”
Phillippe also noted that while the VFA is collecting measures that are “consistent across all institutions,” that does not mean it is “always appropriate to compare them.” For instance, he said, a career and technical college may count some students both in prong one, “the student progress and outcomes measures,” and in the second prong, which assesses only its technical programs, yet still decide that the latter data set is the better judge of its success when comparing itself to a peer.
Potential Federal Impact
Officials affiliated with the accountability framework, including Phillippe, acknowledge their hope that the VFA will end up influencing whatever federal reporting measures are recommended by the Committee on Measures of Student Success, a group that was created by the Higher Education Opportunity Act of 2008 and whose 14 members were appointed by Education Secretary Arne Duncan. The panel met for the first time last October.
Some members of the education secretary’s committee are closely following the VFA’s work and hope to learn from it. Among them is Wayne Burton, president of North Shore Community College, in Massachusetts.
“It’s certainly ambitious,” Burton said of the voluntary framework. “I mean, they’ve got an awful lot of data to collect. I think that’s significant. I’ll be interested in how they do. Obviously, that should be fed into our committee. We’re kind of running on parallel tracks.”
Burton also applauded the “diverse” list of institutions that are pilot testing the program, and its attempt to measure work force success. Still, he said that the final report of his committee, which is due this fall, would be unlikely to mirror the VFA exactly.
“This is such a complete list of data,” Burton said of VFA. “It’s unlikely what we produce [in committee] will touch on all of these.... We also have a public mandate to simplify this so that the public can understand it.”
Some participants in the VFA pilot acknowledge that its potential influence over any federal policy is at least part of the reason for their participation.
“We want to be part of that national conversation, and it’s an important one to have,” said Christina Whitfield, director of research and policy analysis at the Kentucky Community & Technical College System. “National benchmarking comes up all the time, so I would hope this would have an impact and synch up with emerging federal requirements. Hopefully we won’t generate lots of different metrics that we have to end up following.”
DeRionne Pollard, president of Montgomery College, echoed that sentiment.
“I’ve been supportive of and have embraced VFA for two specific reasons,” Pollard said. “One, I’m a great proponent of community colleges taking the opportunity to define ourselves and really demonstrate who we are. Community colleges have been traditionally defined by other people and, quite frankly, their definitions are not always adequate. Two, this recognizes the breadth and depth of offerings we have in our house.… Any good policy maker should probably want to know how this sector defines itself.”
Data Collection Overload
Even officials from institutions that are pilot testing the VFA are concerned about data collection.
For example, Whitfield noted that KCTCS would be unable to provide the VFA with information about the state’s adult basic education and GED systems because, even though they are housed in many community colleges, they are run by the state government. She also expressed concern about her system’s ability to get through all of the VFA metrics in the time frame requested, before the spring AACC convention.
Joe May, president of the Louisiana Community & Technical College System, another pilot tester of the VFA, remains concerned about how it will measure work force development programs against one another: some states will have a hard time providing data about the employment and wages of students who have finished technical programs because their data systems are not connected. He also expressed concern about the difficulty of gathering information about noncredit work force development courses, which constitute a large proportion of the offerings at some technical colleges.
Still, May said that the VFA, even in its draft form, is better than nothing at all.
“This is a step in the right direction, and I think it’s being done the right way,” May said. “I see it as an important concept and one that is, in and of itself, a dramatic improvement over what we’ve been doing. It’s just a work in progress right now.”
Despite the rosy outlook of those involved, the VFA has its critics, who argue that its focus on student completion is bad for the community college sector.
“I note VFA’s continued and seemingly primary emphasis on student throughput even as the ubiquitous news today … questions, persuasively, the quality of learning happening at our institutions of higher learning,” wrote Gary Brown, director of the office of assessment and innovation at Washington State University, in an e-mail. “Throughput is not competence. Retaining and graduating more students more quickly is a narrow view of quality learning. It is also problematic that the VFA's proposed outcomes measures continue to target institutional comparisons, though studies confirm there remains more variation within institutions than between them, and there is little evidence that students ... comparison shop based on test scores they or employers value little if at all.”
Brown, who is also the head of the Center for Teaching, Learning & Technology at Washington State, cites the American Association of Colleges & Universities' Liberal Education and America’s Promise initiative and e-portfolios as “more useful ways to address legitimate needs for accountability that engage students and faculty in embedded and authentic projects.” He argued that “their utility and potential” have not been fully grasped by many educational leaders.
“Projects like the VFA and the VSA, though they purport to engage the good will of the educational community by virtue of their voluntary nature, are much more likely to obtund or eclipse the potential of more creative models that have better validity and, more importantly, much greater utility,” Brown wrote.
Susan Twombly, professor and chair of the department of educational leadership and policy studies at the University of Kansas, also expressed a number of concerns about the VFA. Among those she shared via e-mail are:
- “Do ordinary community colleges have the capacity to produce the data required of this system without Lumina or Gates funding?”
- “Who will use the results of VFA and how? This is not entirely clear from the website. If low stakes and for local use, colleges might comply honestly. If the stakes are high, how will colleges react? Will they start taking different students so their data look better? Will they drop remedial education because those numbers make a college look bad?”
- “Assuming a nationally standardized assessment instrument can be developed, will students take it and take it seriously? Will colleges end up teaching to the test? Will the test determine what is taught?”
According to the official VFA schedule, pilot sites are to submit initial feedback on the proposed measures and framework in March. Then, in April, the VFA board plans to vet the measures and further modify the framework.