Graduation rates

Report on employment of college grads is public relations, not research (essay)

Our problem with the new report The College Advantage: Weathering the Economic Storm, on the employment of university graduates since the start of the Great Recession, begins even before the first word of text. In the first paragraph of the acknowledgments, speaking of those who financed the study, the Lumina and Gates Foundations, the authors -- Anthony Carnevale and associates at Georgetown University -- observe, "We are honored to be partners in the mission of promoting postsecondary access and completion for all Americans."

Thus this report is about promoting a mission, a policy position, not about achieving a dispassionate, objective and complete analysis of the evidence. It is better viewed as a piece of PR, agitprop musings as it were, than as a serious academic study. Certainly, we doubt any reputable peer-reviewed academic journal in economics would touch this study in its current form.

This brings up a bigger problem: isn't there an inherent, huge conflict of interest in university researchers issuing reports favoring positions that are in their own self-interest? Is it not true that Georgetown and other universities gain marketing advantages (and maybe higher tuition fees) by promoting the idea that “it pays to go to college”? The subjective bias is further revealed as the authors at the very beginning decry "attacks" by higher education "cost-cutters," as if trying to improve efficiency in a low-productivity industry were somehow bad.

Getting to the evidence, the Georgetown team is probably correct that, on average, college graduates fared better in labor markets in the Great Recession and its slow recovery than did those with lesser degrees or diplomas. But where are the control variables accounting for the fact that college graduates are, on average, brighter, more disciplined, and more ambitious than those with less education? A typical high school graduate is a less desirable employee than a typical college graduate for reasons independent of the formal postsecondary education acquired.

Moreover, while the members of the Carnevale team agree that those working only part-time jobs are not truly “employed,” they draw no such distinction for those trained for relatively highly skilled work who are now doing menial labor. College biology graduates driving taxi cabs are considered fully employed by the definitions that are used. Yet in a real economic sense, they are underemployed or mal-employed, and their human capital utilization is well below the expectations of both the worker and, arguably, society as a whole.

“Employment” is not in any meaningful economic context a simple binary variable like pregnancy (you are, or you are not), but a continuum reflecting variations in both hours worked and the meaningfulness of the labor performed. Our guess, based on looking at other labor market data, is that “human capital utilization” among college graduates has fallen a fair amount more than “employment” in recent years, as college graduates increasingly take low-paying (reflecting low productivity) jobs. According to the most recent report by a Northeastern University professor for the Associated Press, using Current Population Survey data, roughly 53 percent of recent college graduates are underemployed, compared with the 8 percent reported in the Carnevale report.

All of this suggests that the data are subject to an altogether different interpretation than the one used by the Georgetown team. Consider, for example, the argument advanced by the Georgetown group that "Even in traditionally blue-collar industries, better educated workers fared better." To us, that basically says that overqualified people with college degrees appear to be crowding out others in the market for low-skilled jobs, showing that the "underemployment" problem among college graduates is considerable.

Similarly, the authors are lumping together those with a high school diploma and those with less than a high school diploma in the statistical comparisons -- so the analysis differs from the traditional comparisons of high school- and college-educated individuals.

Those who did not graduate from high school make up 24 percent of the sample for the “High school or less” category and 33 percent of the unemployed, according to the May 2012 Bureau of Labor Statistics data. Those who attained only a high school diploma are much more likely to still have a job (8 percent unemployment) than are those with less than a high school education (13 percent unemployment).

But what is worse, the authors fail to seriously do what even undergraduate economics students writing papers would be expected to do -- relate costs to benefits. Suppose, even after controlling for everything under the sun, college graduates have a clear employment security advantage during turbulent economic times -- a conclusion that we acknowledge is plausible, maybe even expected, if colleges do what they claim to do.

Is the value of that job security advantage big enough to offset the costs of college attendance, where "costs" include not only cash outlays by students, and the income foregone while studying rather than working, but also the total cost to society from the various government subsidies associated with a college education, and the high risks related to the fact that a majority of college students either do not graduate at all, or fail to do so in the advertised (four-year) time needed to complete the degree?

All of that aside, however -- a huge "aside" -- there are hints in the data that the college advantage is becoming frayed. Look, for example, at Figure 2 in the executive summary, which seems to show that the college degree earnings advantage (to us a vastly overused and flawed statistic) peaked around 2005 or so and has declined modestly since. From 2008 to 2010, Census Bureau data show real earnings fell a good deal for full-time male college-educated workers, unlike, for example, those with less than a high school education. On September 7 the Census Bureau will release 2011 data, which will give a further indication of whether the most recent data are the beginning of a longer-term trend.

Our reading of the evidence is that a truly dispassionate examination of the data by those without any vested interest in the conclusions, controlling for other factors involved in determining unemployment and earnings, might well yield a radically different conclusion than that found in this public relations effort of Tony Carnevale and his team at Georgetown. The assumptions of this report -- that those with little education dramatically improve their job security by deciding to go to college -- are certainly not adequately demonstrated.

Richard Vedder directs the Center for College Affordability and Productivity, teaches economics at Ohio University, and is an adjunct scholar at the American Enterprise Institute. Daniel Garrett is an honors undergraduate economics major at Ohio University.

Podcast discussion on value of a college degree

In an interview, Anthony Carnevale and the Lumina Foundation's Jamie Merisotis assess the value of a college degree, the emergence of new credentials, and whether rebounding male enrollments will last.

Departures and a reorganization at federal data division stir fears

Departures and a reorganization at the National Center for Education Statistics stir fears the feds are less focused on higher ed data. Not so, says the center's commissioner.

WGU pushes transfer students to graduate community college first

Western Governors U. pushes graduation even before students enroll by offering financial perks for associate degree holders and, at WGU Texas, through partnerships with community colleges.

For-profits take brunt of new Cal Grant cuts

California's new financial aid cuts are aimed squarely at career colleges, another sign that the battle over for-profit regulation may be moving to the states.

Essay on flawed interpretations of research on remedial education

The sociologist Robert K. Merton once said that “the self-fulfilling prophecy perpetuates a reign of error.” The self-fulfilling prophecy that remedial education has failed now leads us to such a reign of error. The news media, policymakers and various higher education agencies are using flawed interpretations of data about remediation to make unsupported assertions and repeat them frequently, thus leading to erroneous policy decisions.

It began with a 2007 report by Martorell and McFarlin. Supported by the RAND Corporation, these researchers used a regression-discontinuity design to study a large sample of Texas college students whose scores placed them just above and just below the placement cutoff for remedial courses in mathematics and reading. They found that those who just missed the cut score and placed into remedial courses did no better in terms of performance in college-level classes, graduation rates, transfer rates and earnings than those students who just made the cut score. This finding was used to support the authors’ conclusion that remediation was of questionable value.

There is a major flaw in this conclusion. The authors assume that participation in remedial courses should result in participants performing better than students who did not take them. The purpose of remedial courses, however, is to level the academic playing field for underprepared students, not to enable them to outperform prepared students. Given that, the fact that there is little difference in performance between the two groups would indicate that the purpose of remedial courses had been accomplished.    

A similar study using a regression-discontinuity design was conducted by Calcagno and Long (2008), with similar results. Students in this study who scored just below the cut score and participated in remediation did no better in the corresponding college-level course than those who scored slightly higher and bypassed remediation. Calcagno and Long were more tentative about the meaning of their findings than Martorell and McFarlin. They pointed out that “[t]he results of this study suggest that remediation has both benefits and drawbacks as a strategy to address the needs of underprepared students.” They also admitted that “the research design we used only allows the identification of the effect of remediation on a subset of students who scored just above and just below the cutoff score. Estimates should not be extrapolated to students with academic skills so weak that they scored significantly below the cutoff point.”

Neither study explored the performance of students with lower assessment test scores in later college-level classes. And neither study is generalizable to the entire range of remedial courses and students. Yet these are the major studies used to justify the claim that all remediation has failed. Although none of the authors of these studies makes this claim, their work is cited as though it did.

Meanwhile, there are other studies of remediation leading to different conclusions. In a 2006 study using the 1988 National Educational Longitudinal database, Attewell, Lavin, Domina and Levey found that “two-year college students who successfully passed remedial courses were more likely to graduate than equivalent students who never took remediation.” They also found that students who took remedial reading or writing were more likely to graduate from community colleges than students who did not take these courses. Bahr later studied the effect of remediation on a large sample of students at 107 California community colleges. He found that students who successfully completed math remediation were just as successful in college-level math courses as those who did not require remediation. He concluded that “these two groups of students experience comparable outcomes, which indicates that remedial math programs are highly effective at resolving skill deficiencies.”

Boatman and Long, in a 2010 study, reviewed a sample of 12,200 students enrolled at public institutions in Tennessee. Although they found many negative effects for remediation, they also found some positive effects depending upon the subject area and the degree of student underpreparedness. Among their conclusions was a recommendation that postsecondary decision makers “not treat remediation as a singular policy but instead consider it as an intervention that might vary in its impact according to student needs.”

If we look at all the major studies of remediation, we find conflicting findings and inconclusive results. Given this, it is difficult to understand how any credible scholar familiar with the available evidence can decisively conclude that remediation has failed. Nevertheless, there are those who misinterpret or ignore the available evidence to make this claim and are then widely quoted by others. It does not take long for the press and policymakers to echo these quotes. In fact, there is little data to justify this assertion, and the studies on which the assertion is based do not support it.

A prime example of the perpetuation of this reign of error can be found in a recent report from Complete College America. Entitled “Remediation: Higher Education’s Bridge to Nowhere,” the report contends at the outset that the “current remediation system is broken” and that “remediation doesn’t work.” As evidence for this assertion, the report claims “research shows that students who skip their remedial assignments do just as well in gateway courses as those who took remediation first.” This erroneously suggests that all remediation has failed. Because the authors do not bother to cite a reference for this claim, we can only assume that they are using the Martorell and McFarlin, the Boatman and Long, and the Calcagno and Long studies to justify it. As previously noted, this is not what those publications actually say.

The “Bridge to Nowhere” report goes on to project that, according to their data, only 9.5 percent of those who take remedial courses will graduate from a community college within three years, while 13.9 percent of those who do not take remedial courses will graduate within three years. The authors cite these figures as further evidence of the failure of remediation. We do not disagree with these figures, but we do disagree with the interpretation of them. The authors of “Bridge to Nowhere” appear to be arguing that it is participation in remediation that accounts for this difference in graduation rates. This argument is based on the assumption that correlation implies causality, a well-known fallacy among researchers. Furthermore, as Bettinger and Long point out in a 2005 study, “a simple comparison of students in and out of remediation is likely to suggest negative effects due to important academic differences in the underlying populations.” Students placing into remediation are disproportionately characterized by known risk factors such as being minority, low-income, first-generation and underprepared. For such students it is likely that these factors account more for low graduation rates than does participation in remediation.

We do not argue with the data provided in any of these reports, nor do we question the methodology used in obtaining the data. We also concur with many of the recommendations in “Bridge to Nowhere.” We would further agree that remediation as currently delivered in U.S. community colleges is very much in need of reform. However, we disagree that all remediation has failed everywhere for all students, as many policymakers and news reporters seem to believe. There is simply no credible scientific evidence to support this belief. Unfortunately, that does not stop organizations like Complete College America and others from asserting that remediation has failed, thus creating a self-fulfilling prophecy.

A more recent example took place in the Connecticut Legislature this year. Based on the misinterpretations of research and the fallacious arguments discussed here, the Legislature required that remediation be limited to one semester in all its colleges and replaced by embedded support services in gateway courses. This also happens to be one of the major recommendations of Complete College America. Although we agree that this might be a good solution for some students, particularly for those at the top of the remediation score distribution, it is not a good solution for all students. In fact, there is no single solution for all underprepared college students. There are many tools validated by varying amounts of research available to address the needs of underprepared college students through improved remedial courses and a variety of separate or embedded support services.

Neither colleges and universities nor policymakers, then, should conclude that all remediation has failed and engage in knee-jerk attempts to eliminate it. We need to reform remediation and guide our reform efforts with accurate data and sound research. We need to explore various alternatives, including some of those proposed by Complete College America and others. Nonetheless, we disagree that eliminating all remedial courses is a wise course of action. As Bahr points out, remediation “is so fundamental to the activities of the community college that significant alterations to remedial programs would change drastically the educational geography of these institutions. This should give pause to those who advocate the elimination of remediation … as the implication of such changes would be profound indeed.” We believe that arguing for such profound change as the elimination of remediation for all students on the basis of so little evidence is not only ill-advised but also likely to undermine the goal of improving college completion.

Hunter R. Boylan is the director of the National Center for Developmental Education and a professor of higher education at Appalachian State University. Alexandros Goudas is an English instructor at Delta Community College.

Taking stock of the completion agenda's benefits, and limits

Community college leaders say the "completion agenda" has been good for the sector, even when painful, but they worry that the focus could have unintended consequences if it becomes a fixation.

Statewide reverse transfer catches on, could boost graduation rates

Statewide reverse transfer agreements, in which four-year colleges grant associate degrees to students who transfer from community colleges, are spreading. The process isn't easy, but could help students and graduation rates.

Community college association calls for change from within

Major report from community college association sets broad goals for the sector, but is couched in familiar terms.

Education Department changing graduation rate measurements

The Education Department plans to change the widely disparaged federal definition of completion rate to include transfers and nontraditional students.
