In co-requisite remediation, students who have been assessed as not yet ready for college work receive extra help while they take a college-level course instead of receiving a traditional, prerequisite remedial (developmental) course in mathematics, reading or writing. Evidence of the greater effectiveness of co-requisite remediation, as compared to traditional remediation, has been steadily accumulating.
Yet some people say that evidence does not exist. Here I give some of the statements that I have heard about the inadequacy of evidence supporting co-requisite remediation, describe some of the extensive evidence that we do have and provide some of the reasons that the evidence isn’t better known.
First, some background on co-requisite and traditional remediation. The theory behind traditional remediation is that some students do not have college-level skills, and that taking and passing a remedial course will put such students on an equal footing with other students when they all take college-level courses together. But traditional remediation usually does not have positive results.
Currently, around 68 percent of new college freshmen in public community colleges and 40 percent in public four-year colleges take at least one remedial course in reading, writing or mathematics (somewhat more often in math), but most students assigned to remediation either never take a course or don’t complete it. In fact, traditional mathematics remediation has been called the largest single academic block to college student success. And if students cannot pass remedial courses, people have asked, how could they pass college-level courses, even with extra help? In other words, how could they pass co-requisite remedial courses?
Study after study, however, has shown higher pass rates with co-requisite remediation than with traditional remediation, including for college-level courses in chemistry, mathematics (in multiple studies), reading, sociology and writing.
But, people have said, those results don’t prove that co-requisite remediation is better. They were not controlled experiments. The students in the co-requisite courses and/or the faculty teaching them may not have been the same as in the traditional remedial courses. Students who are somehow better might have been selected, or might have selected themselves, into the co-requisite courses, and better faculty members might have taught the co-requisite courses.
What Our Research Found
So we conducted a randomized controlled trial of co-requisite math remediation. We randomly assigned 907 community college students at the City University of New York, all of whom had been assessed as needing remedial elementary algebra and who did not need college algebra for their intended majors, to one of three course types: traditional remedial elementary algebra; the same course plus a weekly workshop; or introductory college-level statistics with a weekly workshop (co-requisite remediation). The workshops, two hours per week and taught by advanced undergraduates, covered topics that the students found difficult in the college-level course, including, for the statistics students, any algebra that the students needed in order to understand the statistics. Each instructor taught one section of each course type, so that any course pass-rate differences could not be due to differences in the faculty members teaching the different course types.
Let’s focus on comparing the first (traditional remediation) and third (co-requisite remediation) course types. Of students who actually took the courses to which they were assigned, 39 percent passed traditional remediation, a typical result, but 56 percent passed statistics with co-requisite remediation, less than the typical pass rate for statistics but significantly more than the pass rate for traditional remediation. Further, students with relatively low as well as relatively high placement test scores got about the same bump up in probability of passing from being assigned to co-requisite statistics rather than traditional remediation. There was also evidence that the statistics students were motivated to work harder than the other students. For example, the statistics students were more likely to report forming their own study groups.
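For readers who want a sense of how such a pass-rate gap translates into statistical reliability, here is a minimal sketch -- not the authors’ published analysis -- of a pooled two-proportion z-test. It assumes roughly equal arm sizes of about 300 students each (907 students were randomized across three course types); the exact group sizes and the choice of test are illustrative assumptions only.

```python
# Minimal sketch (not the published analysis): pooled two-proportion z-test
# comparing the reported pass rates, under the ASSUMPTION of roughly equal
# arm sizes of about 300 students (907 students split across three course types).
from math import sqrt
from statistics import NormalDist

n_trad, n_stat = 300, 300      # assumed group sizes (illustrative only)
p_trad, p_stat = 0.39, 0.56    # reported pass rates: traditional vs. co-requisite statistics

pooled = (p_trad * n_trad + p_stat * n_stat) / (n_trad + n_stat)
se = sqrt(pooled * (1 - pooled) * (1 / n_trad + 1 / n_stat))
z = (p_stat - p_trad) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"z = {z:.2f}, two-sided p = {p_value:.5f}")  # roughly z = 4.2, p < 0.001
```

With group sizes anywhere near that large, a 17-percentage-point gap is far too big to attribute to chance, which is consistent with the significant difference reported above.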
But, people said, the statistics students passed at a higher rate than the elementary algebra students only because, although the same faculty members taught both course types, they graded the statistics students more leniently. Given that, and given that the statistics students never had all of the elementary algebra that they were supposed to have, the statistics students won’t be able to pass other courses.
As described in the published paper about this research, much evidence indicates that the faculty members graded those statistics students the same way that they always graded statistics students. Further, in the year following the end of the experiment, not including statistics courses, the elementary algebra students accumulated, on average, 16 credits, but the statistics students accumulated 19.
In addition to presenting this research at numerous academic conferences, we first published it online in a refereed journal (Educational Evaluation and Policy Analysis) in July 2016. And in March 2018, it met the rigorous standards necessary to be accepted into the federal What Works Clearinghouse “with no reservations.”
Nevertheless, people said the students who were randomly assigned to statistics with co-requisite support would not continue to do well because they didn’t have the elementary algebra knowledge that is needed to succeed in their natural and social science general education courses.
So, as Daniel Douglas, Mari Watanabe-Rose and I presented at the 2018 CADE conference, we followed the academic progress of all of the students in the experiment for three years after the experiment. During that time, the co-requisite statistics students passed at least as many of their general education courses as did the traditional remediation students. For example, 35 percent of the traditional remediation students satisfied CUNY’s Life and Physical Science general education category, compared to 41 percent of the statistics students. In addition, on average, it took the traditional remediation students a total of 5.2 remedial and college-level quantitative course enrollments to pass their general education Mathematical and Quantitative Reasoning requirement, but it took the statistics students only 2.6 -- a large educational cost savings.
People said, however, that the students who were randomly assigned to statistics would not take and pass the traditional calculus-track math courses that require elementary algebra and college algebra as prerequisites. By assigning students to statistics, they said, we were forever preventing them from deciding to take the algebra-calculus route to STEM and other math-intensive majors, an option that the traditional remediation students would retain.
But 14 students passed their assigned statistics course during the experiment and later passed college algebra without ever having taken elementary algebra. In addition, an examination of all the math courses taken by all the students during the three years since the experiment revealed that the statistics students had taken and passed a greater number of advanced math courses than had the traditional remediation students: 39 total such courses for the statistics group, and 27 total such courses for the traditional remediation group. Finally, the same small number of students (two) from each group took the course to which they were assigned for the experiment and graduated within three years with a math-intensive major.
Nevertheless, people said, changing one course requirement won’t increase graduation rates. Many interventions can increase student success, but the positive effects dissipate over time.
However, in the three years since the experiment, while 17 percent of the traditional remediation students received an associate’s degree from CUNY or another college, 25 percent of the statistics students did so. That is, during the three-year period, close to 50 percent more statistics students than traditional remediation students graduated. Even when we compared just the statistics students who received a D to the traditional remediation students who received any passing grade, the graduation rate for the former was 41 percent, compared to 28 percent for the latter.
We also showed that the course success and graduation rate results do not differ according to students’ race/ethnicity. Given that students from underrepresented groups are more likely to be assigned to remediation, our results mean that our co-requisite model can help close gaps in graduation rates between underrepresented and other students, thus addressing the characterization of traditional remediation as a civil rights issue.
But, as one math department chair said to me, how do the statistics students do after they graduate? Not having taken and passed elementary algebra must be harmful to their successful postgraduation employment.
I confess that we do not yet have postgraduation employment data for the students who were in our experiment. Recent research indicates, however, that algebra is not needed for the great majority of jobs. In contrast, taking statistics in college helps increase women’s salaries.
In sum, a great deal of evidence now shows the advantages of college students taking statistics rather than remedial algebra, and greater student success with co-requisite, as opposed to traditional, remediation. As a result, many states and systems -- for example, the California community college and state university systems -- have been mandating the implementation of co-requisite remediation.
Faculty Resistance
As with many new initiatives, we’ve seen faculty resistance to co-requisite remediation, and at least some of that resistance has been based on faculty claiming that little or no evidence supports co-requisite remediation. How can that be?
First, math and English faculty members are unlikely to read education research journals or look at the federal What Works Clearinghouse. In addition, other sorts of publications, including some that math and English faculty members would read, have not always been clear that evidence supporting co-requisite remediation does, in fact, exist. Let us consider just three publications that all appeared in February of this year -- all published by reputable nonprofit research organizations or in reputable higher education media, including two publications by organizations that have their own randomized controlled trials of co-requisite remediation in process.
The first of the three stated, “Rigorous research evidence on the effectiveness of co-requisites is limited to the studies of the [co-requisite freshman writing course] model … [research has] focused on the impact of co-requisites for students who were close to being college-ready, yet states and institutions have rolled out policies that target co-requisites to students with lower levels of incoming college-readiness. It is unclear whether these new groups of students will benefit … there is little information on other student characteristics associated with success in co-requisites.”
The second publication, writing about the first, said, “Policy makers should not push colleges to put thousands of struggling students through a new ill-defined co-requisite model before we know if it works and, if it does, for which students.”
And the third said, “The field would benefit from more rigorous evidence about the effectiveness of alternative instructional strategies.” Someone could read this ambiguous sentence as stating that no truly rigorous evidence yet supports co-requisite remediation. Rather than speculate about the many possible reasons these particular statements were written, I will simply make the important point: faculty members, as well as administrators and policy makers, do see such publications and are influenced by them.
The evidence clearly converges on co-requisite remediation being more effective than traditional remediation. There are many possible explanations for this, including the incorrect assignment of some students to remediation, the demotivating effect of being assigned to traditional remediation, the extra time and cost to students if they must take traditional remedial courses, the greater number of potential exit points from traditional remediation course sequences, and so on. We can certainly use more information about what specific aspects of co-requisite remediation make it most effective, and for which students. But that is a reason for more research -- not cause to favor something worse instead.
The higher education community has a responsibility to spread this message far and wide: co-requisite remediation increases student success.