A comprehensive evaluation of Achieving the Dream, a nationwide program designed to boost the academic success of community college students, reveals that overall trends in student outcomes at the first 26 institutions to join the project remained relatively unchanged after five years, even though the colleges adopted a wide range of strategies to improve them.
MDRC, a nonprofit education and social policy research organization, and the Community College Research Center at Columbia University's Teachers College released the report Wednesday at an Achieving the Dream meeting in Indianapolis. Though the external evaluation is critical of the project, noting especially the institutions' inability to involve enough students in the reforms, Achieving the Dream officials responded by saying that they are “encouraged” by and “very proud” of the progress the participating institutions have made so far. Still, they admit that “there is much more to be done.”
The report covers a five-year period, beginning in the 2004-5 academic year, and evaluates the work of the first community colleges to join the initiative, from Florida, New Mexico, North Carolina, Texas and Virginia. These 26 institutions are diverse in size, location and student characteristics. The largest is Houston Community College, which had a full-time equivalent 12-month enrollment of more than 32,000 in 2008-9. The smallest is Martin Community College, in North Carolina, which had a full-time equivalent enrollment of 410 students.
The approach of the project was to help the community colleges build a “culture of evidence” by using student data to identify barriers to academic progress. Then, the institutions were expected to develop strategies designed to improve student outcomes, conduct further research on their success, and bring helpful programs to scale. In all, the Lumina Foundation for Education has spent about $76 million on the Achieving the Dream project since its inception in 2004.
The ultimate goal of the project was for participating institutions to “move the needle” on five measures of student success: “completion of developmental courses and progress to credit-bearing courses”; “completion of so-called gatekeeper courses, including introductory college courses in English and math”; “completion of attempted courses with a grade of ‘C’ or better”; “persistence from semester to semester and year to year”; and “attainment of [a] college credential.”
Yet according to the report, most of the measures of student success for the overall populations at the institutions did not change in a statistically significant way after five years, less because the programs themselves were unsuccessful — many individual efforts have been publicly lauded as successful — than because they touched too few students. The report hypothesizes that “colleges often faced a trade-off between the intensity and scale of their interventions.” The small-scale strategies they adopted, it argues, probably reached too few students “to make demonstrable progress on improving student achievement over all.”
Examples of the modest gains the project achieved: the average percentage of developmental students at these institutions who completed their remedial sequence within the first two years grew from 21 percent to 25.2 percent; the average number of completed credits by an individual student at these institutions within the first two years increased only from 22 to 24.5; and the overall average fall-to-fall persistence rate at these institutions rose from 46.2 percent to 49.5 percent.
Though the report reveals that “there was some tension” between those who thought improvements in student outcomes would come quickly and those who thought they would take time, Achieving the Dream officials indicated at the start of the project that participating colleges should see “measurable improvement in success rates” after four years and have “achieved their long-term targets for student success” after eight years. Still, the project set no solid interim benchmarks. And though there were changes in these measures during the five-year span, few of them were considered statistically significant.
Most Achieving the Dream community colleges made progress in developing “more evidence-based systems aimed at improving student success” in their first five years of participating. The report rates 11 colleges as having a “strong culture of evidence,” 10 colleges as having “some culture of evidence” and five as having a “weak culture of evidence.”
Thomas Brock, a co-author of the report and director of the young adults and postsecondary education policy area at MDRC, said he would not reveal the individual ratings for the 26 colleges. He noted that the authors made an agreement with the participants that they “wouldn’t call out colleges for exemplary or poor performance.”
Still, Brock explained that those institutions that had a hard time collecting the required project data had either “weak institutional research capacity” or “fundamental problems with their IT systems.” He added that rural community colleges were more likely to have these problems. Some officials anonymously quoted in the report described Achieving the Dream data collection as “frustrating,” “demeaning” and “burdensome.”
On average, each college implemented seven strategies, with all 26 colleges implementing more than 200 strategies in total. A majority of the strategies were designed to boost “academic and social support systems for students” outside of the classroom, while only about a fourth of them actually “changed the content and delivery of classroom instruction itself.” About half of the efforts targeted remedial students, and about a third focused on students in their first year of college. “Very few” of them targeted students based on their race, ethnicity or economic status.
The impact of the Achieving the Dream efforts at these colleges was extremely limited, though. The report notes that most strategies reached less than 10 percent of their intended target populations. It specifies that curricular reforms or intensive advising were “unlikely to reach large numbers of students,” while student success courses were the “sole high-intensity strategy to reach a large number of students at a majority of colleges.”
Ultimately, the report even questions whether Achieving the Dream institutions “would be better served if they aimed to implement fewer strategies.” To that end, a community college president anonymously quoted in the report mused, “Part of the reason we have not accomplished as much as we could is that from the beginning, we set out to do too many things.”
Suggestions for Improvement
The report offers several ways in which Achieving the Dream may “refine its approach to make a stronger impact on students’ success.” Chiefly, the report suggests that participating institutions seek more involvement from faculty and staff members. “Given the primary role that faculty and staff play in teaching and supporting students’ learning, the initiative should focus more attention on directly engaging these personnel as leaders in the colleges’ reform process,” reads the report, asserting that many participating institutions have not made enough effort to involve adjunct faculty in their efforts. This is especially important, the report argues, since many of these faculty members teach remedial coursework that enrolls high-risk students.
In a similar vein, the report argues that participating institutions should change the way they measure student success.
“The initiative might also seek to incorporate classroom-based measures of learning, which more clearly document students’ attainment of particular skills and practices, in its model for institutional improvement,” the report reads. “Such measures might help bridge the gap in Achieving the Dream’s theory of action, which currently focuses on broad institutional changes in student outcomes that may take many years to manifest.”
Responding to the Critique
Achieving the Dream officials took the good with the bad Wednesday when reviewing the first major evaluation of their work. Ultimately, though, they were optimistic about the future of the initiative, lauding their charter institutions for having the “courage” to take on what was, at the beginning, a unique mission: focusing more on completion than on access.
“There is good news in here,” said Katie Loovis, a spokeswoman for Achieving the Dream. “There is this growing culture of evidence out there. But, it’s like turning a 700-ton ship. It takes a while to capture the minds and hearts of the leadership of the colleges.”
Loovis added that Achieving the Dream’s leaders will consider suggestions from the report.
“As we tell our colleges, the first step is looking at data,” Loovis said. “And sometimes the data is going to tell you some discouraging news — that graduation rates and persistence levels are not what we hoped them to be, especially for low-income students and students of color. But, ultimately, this report is very important for us. We welcome the findings. We wouldn’t be practicing what we preach if we didn’t.”
From the outside looking in, Brock had a similar analysis. But he also spoke of intangibles that often cannot be measured, such as the influence Achieving the Dream has had on the community college sector.
“Don’t discount the progress this has had,” Brock said. “Also, don’t doubt the difficulty of getting something like this off the ground. This is a good foundation. Achieving the Dream has been a catalyst for good things, but the work needs to continue.”