College leaders and Education Department officials have spent much of the last two years talking past each other on the subject of measuring student learning. Critics have accused the federal government of pushing an overly simplistic, inflexible approach that emphasizes, above all else, the ability to compare one college's performance with another's. Education Department officials have increasingly insisted that they are not pushing a one-size-fits-all model, but they have also regularly said that they believe too few institutions are aggressively pursuing their desired agenda.
Recent weeks have brought some signs that cooperation may be replacing conflict. Last month, the Education Department awarded a $2.4 million grant to three higher education associations to assess existing tests and other tools, and develop new ones, for measuring student outcomes on a wide range of skills. And today, Under Secretary Sara Martinez Tucker, as part of a national tour on college issues, will watch as faculty members and students at Miami Dade College sign a "covenant" in which they pledge to embrace the two-year institution's new "learning outcomes" initiative.
It would be easy to dismiss the Miami Dade event as a public relations gimmick, but several aspects of it are noteworthy. It underscores the fact that, as many college officials have been saying all along, institutions have been wrestling for years to find thoughtful, creative ways to gauge the success of their own students. It shows that faculty members -- who played a central role in developing some of the home-grown tests and tools that Miami Dade is using -- are willing to engage in the hard work of holding themselves accountable. And it suggests, too, that Education Department officials are indeed primarily interested, as they have repeatedly avowed, in seeing colleges embrace the accountability movement, even if they don't use methods that are as transparent and as readily comparable as the agency's leaders might hope.
"You've heard [Education Secretary Margaret Spellings] say that we don't want to do it to anybody," Tucker said by telephone from Miami Thursday. "We want to encourage them to show leadership and do this. Am I endorsing [Miami Dade's] approach? No. What I'm here to do is say I applaud that you're getting the ball rolling."
Miami Dade, which with 160,000 students (half of whom attend for credit) is among the nation's largest institutions, has spent two years producing a system for showing whether its students are ending their time at the college (in the classroom and outside it) not only with knowledge in their chosen fields but with a grounding in the sort of general education skills that employers want and citizens need. (Some of the work was conducted in conjunction with the Association of American Colleges and Universities' Liberal Education and America's Promise program.) Through a process that involved professors, staff members, administrators, employers and alumni, the college developed a list of 10 desired "learning outcomes" for students. It wanted all of them to be able to:
- Communicate effectively using listening, speaking, reading and writing skills.
- Use quantitative analytical skills to evaluate and process numerical data.
- Solve problems using critical and creative thinking and scientific reasoning.
- Formulate strategies to locate, evaluate and apply information.
- Demonstrate knowledge of diverse cultures, including global and historical perspectives.
- Create strategies that can be used to fulfill personal, civic and social responsibilities.
- Demonstrate knowledge of ethical thinking and its application to issues in society.
- Use computer and emerging technologies effectively.
- Demonstrate an appreciation for aesthetics and creative activities.
- Describe how natural systems function and recognize the impact of humans on the environment.
Once college officials agreed on the outcomes, the next task was finding ways to measure how students fare. Miami Dade found some ready-made tools that suited its purposes, like the Community College Survey of Student Engagement and an existing information literacy test. But given that the outcomes they were seeking to instill were specific to Miami Dade, the institution's officials found that they needed home-grown ways to measure them, too.
"We don't think there's any test in the world that would be as specific as what we're trying to measure," said Norma Martin Goonen, provost for academic and student affairs at Miami Dade. "We trust that our faculty is in the best position to say, 'These are the things [we want to measure] and these are the ways to measure them.' "
All told, Miami Dade's system for measuring learning outcomes includes a half-dozen different tools. As is the case at many institutions, Miami Dade professors (often in consultation with their academic departments) will "embed" discipline-specific learning goals for students in each course. Certain majors will use electronic portfolios to show student progress. The college will also use data from the survey it gives to all graduating students each year to round out the picture.
The newest and most central tool in Miami Dade's array of measures is a set of faculty-developed and faculty-graded "authentic assessment" tasks (typically defined as problems or essay prompts that reflect real-life situations) that are designed to gauge students' skills in writing, critical thinking, and problem solving, among other capabilities. Miami Dade gave the tests to a random and representative 18 percent sample of graduating students last year, by having professors use class periods to administer them, a tactic it found effective. "We thought about throwing a pizza party as an enticement," said Goonen, but administrators ultimately decided to mandate the test for a certain group of students rather than seeking volunteers.
Miami Dade officials plan to compare students' results on the assessments from year to year, and because the college is in the process of "mapping" where in the curriculum and in outside activities students are supposed to have picked up the various skills, "if we find that students didn't do well in a particular outcome, we can look back at the curriculum and co-curricular activities and diagnose where the curriculum needs to be strengthened or reinforced," said Goonen.
Goonen said she recognized that using more standardized measures would put Miami Dade's approach to assessment more in line with the Education Department's push for comparability, a cause championed by the Secretary of Education's Commission on the Future of Higher Education. (The college also plans to use its results primarily to guide its own internal work, rather than make them public, as the Spellings Commission has urged.)
But "we don't think any standardized test that exists now will approximate what we're trying to measure, and it certainly won't be developed by our faculty, which we think is key," she said. "It's important to them and to us that they came up with the assessment of these, and that they're responsible for carrying them out. That's really where it's at."
Tucker reiterated that department officials are not counting on having colleges all use the same measures. "When we said comparability, it's not that every campus has to use the same technique," she said. "It's that consumers should be able to compare what colleges do, and they should report things in ways that students can understand. It's going to take time [for colleges to figure out] which are right ones, and in the meantime the key is that [colleges] use measures that give them the information both to improve their offerings to students and to provide something meaningful for the American public."
She added: "That's what Miami Dade is doing, and I applaud them."