Let’s proclaim .300 as the target national completion rate for the nation’s 1,200 community colleges. A .300 batting average is fine for baseball, a bargaining point for an even higher salary. Why not for community college completion?
Howls about low completion rates are always in the news. No one will ever translate “too low” into a target reflecting the variables -- student preparation, student health, country of origin, campus funding, faculty workload or hours a student must work per week. Twice while working on this column I failed to get a coherent comment from the leaders of the American Association of Community Colleges (AACC). First came a referral to the excellent but aging Voluntary Framework of Accountability. Then came a mumble about the inappropriateness of national comments or goals.
I disagree. Community colleges have no more important source of funding than federal Pell Grants. Acknowledging variances campus to campus, we still need a national target. Little causes me more distress than that it falls to me -- one of the nation’s leading obscure columnists -- to take a stand for these 8 million students, many of whom are on Pell Grants and food stamps. Until the experts offer a credible answer, I’ll say that community-college completion rate targets should align with baseball -- a .300 average is pretty damn good.
Before writing here, I certainly benchmarked this hypothesis in the private sector, with a Yankees fan who’d be loath to certify any theories from Red Sox Nation.
“A measure of success is only meaningful if everyone understands and agrees on the degree of difficulty involved,” the global advertising titan Steve Gardner said from the New York offices of his firm, Gardner Nelson + Partners. “Arguably, the graduation rate at an elite college like Williams should be darn close to 100 percent. Williams is rigorous … but it starts with the crème de la crème of students and protects them in an environment with strong emotional support and financial resources. In contrast, the fact that the graduation rate at a community college may be 25 percent is inherently a meaningless statistic until it is shaped with context. What is fair to expect?”
After a winter and spring of deep analysis, I declare that baseball may be the only human enterprise with as many forbidding variables as teaching in a community college. A .300 batting average is fine for baseball. Why not for community college completion? Baseball is our national pastime. Community colleges bear the burden of educating the core of the nation’s work force. Why not start with batting averages?
Let me be the first to go on record, then, to state that a .300 completion average is not good enough -- for me or for my colleagues. All I seek for community colleges are completion-rate targets reflecting the difficulty of the job. With an awake electorate and national leadership behind me, I will commit to the completion targets below.
Current, We, The People Plan: Pell Grant cuts continue. Veterans flood into community colleges with no additional support for the colleges. No national leadership by or for community colleges. Target: .300 and falling.
Federal free and reduced lunch and breakfast extended to college students on federal Pell Grants.
Requirement that federal Pell Grants must first be applied to achieving AP/college-level work in expository writing and in statistics.
The federal government pays for trained veteran counselors, one for every 50 veterans on a campus. Counselors will help with benefits, career advice, and medical management.
Equal federal subsidies, need-based, for all U.S. college students.
Federal subsidies at community colleges per student equal to subsidies at colleges such as Williams, with indoor golf nets, faculty teaching 2-5 courses a year rather than 5 per semester, and the same Alice Waters-inspired dining-hall food from the Yale Sustainability Project.
I’ve been sitting here for months, beside a baseball and The Science of Hitting, by the Boston Red Sox legend and lifetime .344 hitter Ted Williams (and John Underwood). “Hitting a baseball -- I’ve said it a thousand times -- is the single most difficult thing to do in sport,” is Williams’s opening in the book. After careful analysis over the past five years, I’ve concluded that community college teaching is at least as difficult as hitting a baseball.
Before hammering community colleges for low completion rates, keep in mind that the top lifetime batter was Ty Cobb with .366. Babe Ruth? .342. Lou Gehrig? .340. (From Baseball Almanac.) As I write, the won/lost percentage of the top Major League team, the Yankees, is .614. In academia, that’s a D-.
Why .300? A few weeks before the end of last semester I was projecting how many students would finish the semester at my standard. (Many community college faculty members have such targets, in spite of reports that community colleges are afraid to be accountable.) Mine is a credible essay answering the Advanced Placement exam question on Abraham Lincoln’s Second Inaugural Address, “Write an essay in which you analyze the rhetorical strategies Lincoln uses to achieve his purpose.”
My M.B.A. HP 12-C calculator, still set to three decimal places, sent back a batting average, not a two-digit percentage: .272. That stinks. One more success, I discovered, would take me over .300, a batting average good enough to renew a Major League Baseball contract. I looked at the list of students. No guarantees, but I could see three I might be able to pull through before the end of the semester. Not all three -- maybe one of the three, even if I tried for all three. To .400? .500? 1.000? No way I could see, and, worse, nothing I could wish to have done differently.
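For anyone who wants to check the arithmetic, here is a quick sketch. The class size and completion count are hypothetical -- the column reports only the .272 average -- but a class of 22 with 6 students meeting the standard lands in the right neighborhood, and one more success clears .300:

```python
# Hypothetical numbers: 6 completions in a class of 22 gives roughly
# the .272 the HP 12-C reported. Only the average comes from the column.
completions, students = 6, 22

average = completions / students
print(f"{average:.3f}")                      # prints 0.273

# One more success clears the Major League line.
print(f"{(completions + 1) / students:.3f}") # prints 0.318
```

The same arithmetic shows why .400 was out of reach: with these assumed numbers, that would have taken three more successes, not one.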
I’ll declare, as I have before (my 2010 column “Last Year I Flunked Myself”), that I flunked again. Once again, I can end with no explanation for any percentages and enough heartwarming stories to delude myself that I’m making progress. This semester? A national magazine wants to publish my students’ versions of Walt Whitman’s “I Hear America Singing.” We, the people, can’t educate eight million students via anecdotes.
Why baseball batting? Williams includes a chart showing that a baseball can pass through the strike zone in 77 different places. I have no trouble seeing a Ted Williams chart’s worth of pitches headed at me when I step up before these students each day.
I am not going to trivialize students by naming them “Curve,” “Slider,” “Changeup,” “Forkball,” or “Sinker.” But consider the variety of pitches. Some students are high school graduates and some have GEDs. A Somali student explained at the start of one semester that the challenges impairing her high school experience included dodging snipers on the way to school and frequent raids on the school -- machine guns firing -- by rebels kidnapping future child soldiers.
The first languages that any class might pitch to me include Arabic (Moroccan, Egyptian, Syrian and Lebanese dialects), Armenian, Russian, Portuguese -- via Brazil and Angola -- and Spanish from every South and Central American country. Somali. French. Creole. Swahili.
Hunger is a more frequent pitch. These students may not have eaten that day. Last spring, two students had bosses who thought nothing of scheduling 8 a.m.-4 p.m., 4 p.m.-midnight, and midnight-8 a.m. shifts all in one week. One semester, I had a veteran who vanished (he was later found) after two more buddies from his unit committed suicide. Once, a student was shot to death. Last spring was the first semester in a while in which no one in the class reported that anyone in their family had been shot.
Why won’t others make this proposal? The president, the board chair and the board chair-elect of the American Association of Community Colleges ducked the question twice as I worked on this column. Here in Boston, a foundation in January issued a lightly researched, predictable report with all the usual comments on low completion rates and failure to meet work force needs, and no specific success targets. The distressing news eight months later is that in spite of at least one heroic effort I know of, no one, including the Massachusetts community colleges, has put a plan or a reply on the table.
My grade for last semester remains “F.” My batting average I don’t know yet. Still working in extra innings with two to complete the course.
Wick Sloane writes the Devil's Workshop column for Inside Higher Ed.
The author and philosopher Thomas Merton once said that “the self-fulfilling prophecy perpetuates a reign of error.” The self-fulfilling prophecy that remedial education has failed is now leading us into such a reign of error. The news media, policy makers and various higher education agencies are using flawed interpretations of data about remediation to make unsupported assertions and repeat them frequently, thus leading to erroneous policy decisions.
It began with a 2007 report by Martorell and McFarlin. Supported by the RAND Corporation, these researchers used a regression-discontinuity design to study a large sample of Texas college students whose scores placed them just above and just below the cutoff for placement into remedial courses in mathematics and reading. They found that those who just missed the cut score and placed into remedial courses did no better in college-level classes, graduation rates, transfer rates and earnings than those students who just made the cut score. This finding was used to support the authors’ conclusion that remediation was of questionable value.
There is a major flaw in this conclusion. The authors assume that participation in remedial courses should result in participants performing better than students who did not take them. The purpose of remedial courses, however, is to level the academic playing field for underprepared students, not to enable them to outperform prepared students. Given that, the fact that there is little difference in performance between the two groups would indicate that the purpose of remedial courses had been accomplished.
A similar study using a regression-discontinuity design was conducted by Calcagno and Long (2008), with similar results. Students in this study who scored just below the cut score and participated in remediation did no better in the corresponding college-level course than those who scored slightly higher and bypassed remediation. Calcagno and Long were more tentative about the meaning of their findings than Martorell and McFarlin. They pointed out that “[t]he results of this study suggest that remediation has both benefits and drawbacks as a strategy to address the needs of underprepared students.” They also admit that “the research design we used only allows the identification of the effect of remediation on a subset of students who scored just above and just below the cutoff score. Estimates should not be extrapolated to students with academic skills so weak that they scored significantly below the cutoff point.”
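The regression-discontinuity logic -- and the extrapolation limit Calcagno and Long flag -- can be sketched with synthetic data. Every number below is hypothetical; the point is only that the design compares students in a narrow band on either side of the cut score and says nothing about students far below it:

```python
import random

random.seed(0)
CUT = 60    # hypothetical placement cut score
BAND = 5    # the "just above / just below" window around the cutoff

# Synthetic students: (placement_score, passed_college_course).
# Passing probability rises smoothly with underlying skill, so students
# on either side of the cutoff are nearly identical -- the situation the
# regression-discontinuity design exploits.
students = []
for _ in range(10_000):
    score = random.randint(30, 90)
    passed = random.random() < score / 100   # toy skill-to-outcome link
    students.append((score, passed))

def pass_rate(group):
    return sum(group) / len(group)

below = [p for s, p in students if CUT - BAND <= s < CUT]   # remediated
above = [p for s, p in students if CUT <= s < CUT + BAND]   # bypassed

print(f"just below the cutoff (took remediation): {pass_rate(below):.2f}")
print(f"just above the cutoff (skipped it):       {pass_rate(above):.2f}")
# The two rates come out nearly equal, but nothing here speaks to
# students who scored far below the cutoff.
```

The design is credible precisely because the two bands are so similar -- which is also why its estimates cannot be stretched to cover the whole remedial population.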
Neither study explored the performance of students with lower assessment test scores in later college-level classes. And neither study is generalizable to the entire range of remedial courses and students. Yet these are the major studies used to justify the claim that all remediation has failed. Although none of the authors of these studies makes this claim, their work is used to justify it.
Meanwhile, there are other studies of remediation leading to different conclusions. In a 2006 study using the National Education Longitudinal Study of 1988 database, Attewell, Lavin, Domina and Levey found that “two-year college students who successfully passed remedial courses were more likely to graduate than equivalent students who never took remediation.” They also found that students who took remedial reading or writing were more likely to graduate from community colleges than students who did not take these courses. Bahr later studied the effect of remediation on a large sample of students at 107 California community colleges. He found that students who successfully completed math remediation were just as successful in college-level math courses as those who did not require remediation. He concluded that “these two groups of students experience comparable outcomes, which indicates that remedial math programs are highly effective at resolving skill deficiencies.”
Boatman and Long, in a 2010 study, reviewed a sample of 12,200 students enrolled at public institutions in Tennessee. Although they found many negative effects for remediation, they also found some positive effects depending upon the subject area and the degree of student underpreparedness. Among their conclusions was that postsecondary decision makers should “not treat remediation as a singular policy but instead consider it as an intervention that might vary in its impact according to student needs.”
If we look at all the major studies of remediation, we find conflicting findings and inconclusive results. Given these findings, it is difficult to understand how any credible scholar familiar with the available evidence can decisively conclude that remediation has failed. Nevertheless, there are those who misinterpret or ignore the available evidence to make this claim and are then widely quoted by others. It does not take long for the press and policymakers to echo these quotes. In fact, there is little data to justify this assertion and the studies on which the assertion is based do not support it.
A prime example showing the perpetuation of the reign of error can be found in a recent report from Complete College America. Entitled “Remediation: Higher Education’s Bridge to Nowhere,” the report contends at the outset that the “current remediation system is broken” and that “remediation doesn’t work.” As evidence for this assertion, the report claims “research shows that students who skip their remedial assignments do just as well in gateway courses as those who took remediation first.” This erroneously suggests that all remediation has failed. Because the authors do not bother to cite a reference for this claim, we can only assume that they are using the Martorell and McFarlin, Boatman and Long, and Calcagno and Long studies to justify it. As previously noted, this is not what any of these publications actually says.
The “Bridge to Nowhere” report goes on to project that, according to their data, only 9.5 percent of those who take remedial courses will graduate from a community college within three years while 13.9 percent of those who do not take remedial courses will graduate within three years. The authors cite these figures as further evidence of the failure of remediation. We do not disagree with these figures but we do disagree with the interpretation of them. The authors of “Bridge to Nowhere” appear to be arguing that it is participation in remediation that accounts for this difference in graduation rates. This argument is based on the assumption that correlation implies causality, a well-known fallacy among researchers. Furthermore, as Bettinger and Long in a 2005 study point out, “a simple comparison of students in and out of remediation is likely to suggest negative effects due to important academic differences in the underlying populations.” Students placing into remediation are disproportionately characterized by known risk factors such as being minority, low income, first generation and underprepared. For such students it is likely that these factors account more for low graduation rates than participation in remediation.
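The correlation-versus-causation point is easy to demonstrate with a toy simulation (every number below is hypothetical). If an unobserved factor such as preparation drives both placement into remediation and the odds of graduating, a raw comparison of the two groups shows a gap even when remediation itself has no effect at all:

```python
import random

random.seed(1)

remedial, nonremedial = [], []
for _ in range(50_000):
    prep = random.random()                  # unobserved preparation, 0 to 1
    placed_in_remediation = prep < 0.4      # low preparation -> remediation
    # Graduation depends only on preparation; remediation has zero
    # causal effect in this toy model.
    graduated = random.random() < 0.05 + 0.25 * prep
    (remedial if placed_in_remediation else nonremedial).append(graduated)

def grad_rate(group):
    return 100 * sum(group) / len(group)

print(f"remedial graduation rate:     {grad_rate(remedial):.1f}%")
print(f"non-remedial graduation rate: {grad_rate(nonremedial):.1f}%")
# The gap is produced entirely by preparation -- the underlying-population
# differences Bettinger and Long warn about.
```

The simulated gap mirrors the direction, not the size, of the 9.5 versus 13.9 percent figures; real students differ in far more ways than one, which is exactly why the raw comparison proves nothing about remediation itself.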
We do not argue with the data provided in any of these reports, nor do we question the methodology used in obtaining the data. We also concur with many of the recommendations in “Bridge to Nowhere.” We would further agree that remediation as currently delivered in U.S. community colleges is very much in need of reform. However, we disagree that all remediation has failed everywhere for all students as many policymakers and news reporters seem to believe. There is simply no credible scientific evidence to support this belief. Unfortunately, that does not stop organizations like Complete College America and others from asserting that remediation has failed, thus creating a self-fulfilling prophecy.
A more recent example took place in the Connecticut Legislature this year. Based on the misinterpretations of research and the fallacious arguments discussed here, the Legislature required that remediation be limited to one semester in all its colleges and replaced by embedded support services in gateway courses. This also happens to be one of the major recommendations of Complete College America. Although we agree that this might be a good solution for some students, particularly for those at the top of the remediation score distribution, it is not a good solution for all students. In fact, there is no single solution for all underprepared college students. There are many tools validated by varying amounts of research available to address the needs of underprepared college students through improved remedial courses and a variety of separate or embedded support services.
Neither colleges and universities nor policymakers, then, should conclude that all remediation has failed and engage in knee-jerk attempts to eliminate it. We need to reform remediation and guide our reform efforts with accurate data and sound research. We need to explore various alternatives, including some of those proposed by Complete College America and others. Nonetheless, we disagree that eliminating all remedial courses is a wise course of action. As Bahr points out, remediation “is so fundamental to the activities of the community college that significant alterations to remedial programs would change drastically the educational geography of these institutions. This should give pause to those who advocate the elimination of remediation … as the implication of such changes would be profound indeed.” We believe that arguing for such profound change as the elimination of remediation for all students on the basis of so little evidence is not only ill-advised but will also undermine the goal of improving college completion.
Hunter R. Boylan is the director of the National Center for Developmental Education and a professor of higher education at Appalachian State University. Alexandros Goudas is an English instructor at Delta Community College.