
Every summer, the “Are college prices getting out of control?” debate gets a boost as colleges and universities set their tuition and fees for the upcoming academic year. Thanks in large part to Congress running the student loan interest rate debate right up until the eleventh hour, we’ve also been the beneficiaries of a prolonged social media campaign – complete with statistics, graphs and charts – that has soberly reminded us both how expensive, and what a gamble, getting a college education can be today.

Toss in a pinch of questions about how much college students are really learning, add a dash of persistently high unemployment, mix generously with a presidential election cycle, and the result is a near state of panic, with widespread calls for “disruptive” change to one of the largest sectors of the nation’s economy.

Enter the massive open online course (MOOC) craze. Born at two of the nation’s most elite colleges, MOOCs have received an extraordinary amount of news coverage for offering the potential to solve one of the sector's most nagging problems: how to provide a world-class education at practically no cost to the consumer. The courses provided via MITx and by a handful of Stanford professors have generated considerable publicity, but it’s the recent announcement that Coursera (another Stanford spin-off) has lined up around a dozen elite institutions to offer similarly styled courses on its platform that really has folks thinking MOOCs may be the answer to our system’s perceived ills.

Call it Nature’s sense of humor, but evolution has a habit of creating some very odd, yet revolutionary, offshoots in its grand competition to “let the best man win.” While it’s too early in the game to know whether MOOCs are the evolutionary equivalent of modern-day humans or Neanderthals, my hunch is that it’s the latter, because the MOOC in its current form is encoded with a fatal economic flaw that makes it all but certain to fail.

The overwhelming majority of college-goers today don’t enroll in higher education to get an education as much as they seek to earn a credential that they can successfully leverage in a labor market. Sure, the former is supposed to beget the latter, but it’s a hurdle that’s easily, and often, leaped.

Knowledge learned is quite easily forgotten (ask any college graduate to retake one of their midterm or final exams five years on and I’d almost guarantee a near-universal failure rate), and our current structure for assessing knowledge gain is built around point-in-time assessments like exams, not repeated evaluation of comprehension and understanding over time. Indeed, one need look no further than Facebook or Twitter to find countless posts of college students bragging about never cracking open a book or attending class, cramming for an exam, and pulling the minimum grade needed to pass – evidence that it’s the credential people value and carry with them years later, not necessarily the education.

We also know that there are plenty of low- to no-cost learning options available to people on a daily basis, from books on nearly every academic topic at the local library and on-the-job experience, to the television programming on the National Geographic, History and Discovery channels. If learning can and does take place everywhere, there has to be a specific reason that people would be willing to spend tens of thousands of dollars and several years of their lives to get it from one particular source like a college. There is, of course, and again it’s the credential: no matter how many years I spend diligently tuned to the History Channel, I’m simply not going to get a job as a high school history teacher with “television watching” as the core of my resume, even if I learned and retained far more information than I ever could have in a series of college history classes.

On the supply side, the elite colleges that are currently offering, or planning to offer, access to their world-class educations via MOOCs have become elite, in part, because they look to maximize degree completion and academic success by enrolling the very brightest and highest-achieving students. Some of this can be explained by simple supply and demand – there are only a fixed number of spots for a large number of applicants – but a lot has to do with how colleges leverage the peer effects that highly educated students contribute to the education process.

In other words, as economists tell us, students themselves are an important input to education. The fact that no selective school admits students by lottery shows that determining who gets in matters a great deal to these schools, because it helps them control quality and head off the adverse effects of unqualified students either dropping out or performing poorly in career positions. For individual institutions, obtaining high-quality inputs works to optimize the school’s objective function: maximizing prestige.

If colleges care about something more than the quality of their inputs, it’s clearly the products they put out on the market. Information asymmetries greatly favor colleges over students – think of the used-car market’s “lemons” problem – which makes it possible to sell education on a promise in the short run. Over time, though, it’s the quality of those graduates and their career success that drives continued future demand.

In this light, colleges have a strong incentive to protect, or control, the quality of the degrees they confer, because successful graduates directly affect the institution’s prestige and the public’s perceptions of the value of its products. In the automobile industry, manufacturers impose exacting specifications on parts from third-party suppliers because those components directly affect their products’ value, yet we are to believe that colleges will pursue more relaxed admissions standards or increasingly accept someone else’s quality standards (i.e., credit transfer) while still assuming responsibility for the quality of the end product? It would be like demanding that General Motors accept parts built to mismatched metric and U.S. customary specifications (or made of lower-grade metals), and then complaining when the engine seizes.

The fatal flaw that I referred to earlier is pretty apparent: the very notions of “mass, open” and selectivity just don’t lend themselves to a workable model that benefits both institutions and students. Our higher education system needs MOOCs to provide credentials in order for students to find it worthwhile to invest the effort, yet colleges can’t afford to provide MOOC credentials without sacrificing prestige, giving up control over the quality of the students who take their courses, and running the risk of eventually diluting the value of their education brand in the eyes of the labor market. Stanford was perfectly in the right to clarify that the letters of completion professors wrote on behalf of students who finished the MOOCs were NOT a certification, because the people who hold those credentials will eventually shape labor markets’ perceptions of Stanford’s quality. Handing out credentials to any Tom, Dick or Harry with an internet connection not only hurts the school and its students; it also removes an important quality-control mechanism and runs counter to the basic economics of how we know universities operate.

Is there a model out there, or an institution/student mix, that could effectively utilize MOOCs in a way that gets around this flaw? It’s hard to tell. Recent articles on Inside Higher Ed have suggested that distance education providers (like University of Maryland University College, or UMUC) may opt to certify the MOOCs that come out of these elite schools and bake them into their own online programs. Others suggest that MOOCs could be certified by other schools and embedded in prior learning portfolios.

In both cases, though, the fundamental problem remains the same: if the credential is what truly matters to both students and labor markets, what rationale is there to pursue an MIT or Stanford education (via a MOOC) if your career prospects will be based on a credential from Western Governors University or UMUC? Why take a semester, or a year, or three years of classes at Harvard just so you can transfer all of your credits and finish a degree program at Fort Hays State University?

The MOOC is certainly novel; nevertheless, it’s remarkably difficult to see how selective institutions can provide a mass-consumption product that truly serves the reason people get their learning from colleges in the first place. That the nation’s most prestigious higher education institutions are pioneering this effort is to be both applauded and expected, as these schools have been at the forefront of many of higher education’s “disruptions” for generations. They have the wealth to support innovative ideas, the talent pool to execute on them and, some would say, the cognizance to recognize their role in keeping the industry dynamic.

Still, what our elite higher education institutions have produced in the MOOC looks and feels like one of Ford Motor Company’s futuristic concept cars – something that provides a vision for how tomorrow might look, or which includes niche features that may be built into near-term models, but in its current form is simply not road-ready.
