Submitted by David Touve on September 11, 2012 - 3:00am
In recent months, many of the most prominent research universities announced forays into free online courses. As a greater number of these universities go online with such free education platforms, the nature of the market for — and even the meaning of — a college degree could change in both subtle and significant ways.
Behind the screens, beyond the more collaborative desire to educate the world, a rather complex sort of competition may be playing out. Aside from the question of competition, however, is the question of what the classification of these online programs signals in terms of our beliefs about the purpose and value of a college degree, as well as the qualifications for such a degree.
On the one hand, universities or their partnered courseware platforms describe these MOOC experiences as analogous to classroom-based course experiences, in terms of either the academic rigor or at least the capacity to assess mastery of the course material. For example, edX describes the rigor of its online courses as the same as that of the partnering institutions. Coursera, citing a 2010 meta-analysis conducted by the Department of Education, claims that online learning is at least as effective as learning in face-to-face classroom settings.
On the other hand, those universities now experimenting with MOOC offerings are quick to clarify that they will grant course credit or college degrees only to those students who first pass through the highly selective admissions process, which occurs before these students ever register for a course — online or on-campus.
As a result, the nature of these recent experiments in massive and open online courses risks triggering a paradox in certain galaxies of the higher education universe: evidence of mastery in university coursework will warrant only a certificate, while evidence of mastery in work prior to university coursework will determine the degree. Simply stated, the line between an online certificate and a degree from any particular institution shall be drawn by the admissions office.
This paradox was expressed in point-blank terms by MIT’s news office, in December 2011, within the original FAQ for the MITx program:
"Credentials will be granted only to students who earn them by demonstrating mastery of the material.... MIT awards MIT degrees only to those admitted to MIT through a highly selective admissions process."
Expressing the degree-does-not-equal-certificate logic in mathematical terms: "Course" and "Mastery" appear on both sides and cancel each other out, and so:

(+) Admission Selection = Degree, while
(-) Admission Selection = Certificate
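The logic above can be sketched as a toy function. The names (`credential`, `admitted`, `demonstrated_mastery`) are purely illustrative assumptions, not anyone's actual policy:

```python
# Toy formalization of the certificate-versus-degree paradox described above.
# All names here are illustrative assumptions, not an actual institutional rule.

def credential(admitted: bool, demonstrated_mastery: bool) -> str:
    """Return the credential a student would earn under the paradox:
    mastery of the coursework is required either way, but only prior
    admission converts that mastery into a degree."""
    if not demonstrated_mastery:
        return "none"
    return "degree" if admitted else "certificate"

# Two students demonstrate identical mastery of the same course;
# only the admissions decision separates their credentials.
print(credential(admitted=True, demonstrated_mastery=True))   # degree
print(credential(admitted=False, demonstrated_mastery=True))  # certificate
```

The point the sketch makes concrete: mastery cancels out of the comparison, and admission alone determines which credential the identical coursework yields.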
Perhaps as evidence of the danger presented by this paradox, the edX FAQ now makes no explicit reference to the qualifications — such as a lack of equivalence in subject mastery — that distinguish a degree from a certificate. Frankly, however, this paradox cannot be resolved by simply not mentioning it.
Unfortunately, the engineered distinction between certificates and degrees reinforces a deeper and more unsightly impression for which the market surrounding these same prestigious universities is widely criticized: the inputs to education trump the outputs of education. We rank, and even respect, universities according to the relative metrics of standardized test scores and dollars spent on research (inputs) rather than measures of classroom experience or subject mastery (outputs).
As larger populations of students in the higher education universe complete increasing proportions of their coursework online, however, some resolution to the certificate versus degree paradox becomes unavoidable. The line that could previously be drawn between wholly online degree programs and wholly offline programs fades.
Furthermore, as larger populations of students complete increasing proportions of their coursework through the same, or extremely similar courseware platforms, our ability to ignore these MOOCs as the means to measure at least one dimension of the outputs of higher education fades as well. In other words, we will have to come to terms with the implications of our measures of mastery (e.g., when the only students who aced a Stanford University course in artificial intelligence were students who were not attending the university).
Just as our initial characterizations of the Internet as seemingly antisocial transitioned to an awareness that this online space was social in its own ways, so too might this distinction between online and offline education transition to a recognition that these two environments simply provide different venues for learning, each venue leading to certain subject mastery in its own ways.
Frankly, it’s time to resolve this paradox, and the sooner the better.
If a well-attended and open online course offered by a prominent university is somehow different from the associated on-campus education in terms of the level or type of mastery that can be achieved, then we should just say so and treat this difference as such. Subject mastery in a MOOC environment may be a necessary but not yet sufficient condition for "mastery," at least in certain galaxies of higher education.
In fact, perhaps the mastery we are ultimately hoping for from the range of galaxies in the higher education universe is more than the ability to answer 50 questions correctly. Instead, our ultimate goal is to develop a capacity to convert the implications of those answers to new questions, new ideas, and new inventions — dynamic sources of impact. Developing and supporting this dynamic capacity may not scale in the same way that MOOC education can.
If, however, there is no difference between the level and type of mastery that can be reached online versus that which might be attained on campus, then we should speak and act as if these two venues are indeed equivalent — if not in experience then at least in terms of the outputs, regardless of inputs.
Most importantly, however, we should resolve the paradox that emerges from this debate over MOOCs: the substance that determines whether a student earns a university degree rather than a course certificate (whether chunks of matter or ideas or right answers or genuine insights) would lie in the selection of that student through admissions standards, rather than in the content and quality of the education, or in the impact of that education as measured through the student’s experience, accomplishments, and dynamic capacity to act upon and even develop new knowledge.
David Touve is assistant professor of business administration at Washington and Lee University.
The time has come to ask the question: When will we see the complete digital transformation of higher education in the United States?
The need for the shift to digital is painfully clear: Grades are lagging, students aren’t graduating, and those who do earn a degree often don’t have the skills that employers want. While digital learning won’t solve all these problems, we need to find ways to drive students’ performance to help them recoup their college investment, and I believe that digital represents the fastest and best option.
With these needs in mind, I’m willing to put my stake in the ground.
As I see it, the publishing industry needs to do all it can to ensure that within 36 months, higher education in the U.S. will be completely digital. I’m not talking about a slight or even gradual increase in e-book adoptions or the use of adaptive learning. I’m talking about a total transition from a reliance on print textbooks to a full embrace of digital content and learning systems. Aside from the college library, you hopefully won’t be able to find a printed textbook on a college campus in three years. And if you can, we should all be disappointed.
To date, the rate of adoption of digital course materials has been slower than most would have expected. Only around 3 percent of students today purchase e-books over print, and less than half of my company’s customers come to us for digital.
There are a few reasons why I think we haven’t seen greater uptake. For one, education is a high-stakes endeavor for students, with important outcomes riding on it. While students may be willing to switch to digital in some aspects of their lives, when it comes to studying, they often want to stick with what they know. There’s also the fact that until recently, the user experience offered by e-books and other digital technology just hasn’t been very good. A glorified PDF of a printed page is not compelling to students. Finally, and I think most importantly, the value proposition of digital to students and institutions hasn’t been clear. Many students and colleges are unaware of how digital can enhance the learning experience beyond making it more portable and affordable – and provide real results.
For such a big transition — a leap forward, really — three years may seem like a short period of time. In today’s technology landscape, it’s an eon. Thirty-six months ago the iPad didn’t exist. Now, 65 million units later, it has changed the way we consume, create and share information. If that number isn’t big enough for you, try this one: 760 million — that is how many tablets Forrester expects will be in use by 2016. These devices are being adopted at a lightning pace, and inevitably falling prices will make them even more accessible to students.
Student attitudes toward digital in the classroom are also evolving. Studies show that after using technology in an education setting for only a short time, students are realizing that they can’t live without it. As the design of digital education materials and technology continues to improve, students’ affinity for it will only grow.
It’s one thing for digital content and learning systems to offer a nice user experience and some interactive features. It’s another to help make meaningful gains in student performance.
Today’s digital technology already meets this challenge. Super-adaptive systems such as McGraw-Hill’s LearnSmart, a digital homework tutor that adapts to each student’s individual knowledge levels and creates custom study paths, are making a dramatic impact on student outcomes by scaling a personalized learning experience. An effectiveness study of LearnSmart showed that students using the program have seen significant improvements in pass rates, retention rates and overall academic performance. Results like these – whether they come from McGraw-Hill or other leading companies in our field – are something we just can’t afford to ignore, especially in light of the rising costs for higher education and falling student achievement.
If you want to get a sense of how confident we are in the effectiveness of this technology, take a look at a recent pay-for-performance partnership McGraw-Hill Education formed with Western Governors University. This partnership ties the fees we receive for learning materials to the grades of the students using those materials in class.
For professors – the foundation of our higher education system – digital provides an important collateral benefit. Working with students who come to class prepared and have an active interest in what they're learning allows professors to spend less class time reviewing the basics and more time exploring advanced concepts. This is the type of teaching that leads to higher-order learning, and it’s the type of teaching that professors love doing the most.
When we talk about innovation, it’s usually in the context of technology. But where innovation is really shining through in education is in the models that learning companies are developing with colleges and universities to provide digital technology to students more affordably.
Colleges such as Indiana University and the University of Minnesota are partnering with learning companies to ensure that all students have access to the learning materials for their courses at a price that’s substantially lower than what they’re used to paying – as much as 60 percent less than a print textbook. At a price that’s comparable with a used print book, students receive all of the benefits of going digital: portability, instant access to course material on the first day of class, and seamless integration with adaptive learning systems that provide personalized instruction.
While the transition to an all-digital learning materials experience may not always be comfortable, it’s a necessary part of the solution. Technology isn’t just about improving access or engagement; it’s about achieving what should be the main goal of our higher education system today: improving student performance.
If my 36-month timeline sounds ambitious, that’s because it is. We have the tools to help solve one of the greatest challenges of our times – we just have to put them to use.
Brian Kibby is president of McGraw-Hill Higher Education.
When’s the last time an ice deliveryman visited your home? Have you ever talked to a telephone switchboard operator? Thanks to new technologies, these once-common occupations passed into history many years ago. Bank tellers and travel agents are not completely obsolete, but for similar reasons substantially fewer people are employed in these lines of work than in the past.
Will new developments in Internet-based communications technology do similar things to college professors? Perhaps people like me will face the same trouble finding employment that newspaper reporters or piano tuners face nowadays. Or perhaps MOOCs will eliminate the need for professors almost entirely, allowing students to flock to courses offered by a smattering of "super-professors" while computers, graduate students and adjuncts do all the grading that once occupied so much of an analog instructor’s time.
I don’t know whether the Internet will make college professors obsolete, but then again nobody else does, either. Yet this fact has not prevented the rise of a cottage industry of pundits who gleefully suggest that faculty in every department of the modern university are somehow headed for the scrap heap. Some of these pundits seem to welcome that possibility because they expect that the cost of a college education will decrease with fewer professors collecting what they perceive to be hefty salaries, and they think that’s good for society. Some of these people seem to welcome this possibility because they just hate college professors. We are perceived as elitists, and everybody likes to watch elitists get their comeuppance, except the elitists themselves.
While countless people try to predict the future of higher education based on the technologies of the present, far less attention is paid to the effect of all these predictions on higher education today. While reading the online educational technology press for the sake of my blog, I sometimes feel like that old man in "Monty Python and the Holy Grail" who has to tell the guy clearing out the bodies of plague victims that he’s not dead yet.
To my mind, this feeling is no accident. As the saga surrounding Teresa Sullivan’s presidency at the University of Virginia has clearly demonstrated, faculty are capable of mounting fierce resistance to unwanted technological changes under the right circumstances. The e-mails leading up to Sullivan’s initial firing demonstrate that U.Va.'s Board of Visitors was steeped in press clippings that treated the transition to online education as an inevitability. That attitude goes a long way toward explaining the board’s now legendary heavy-handedness. They wanted to ride the crest of a wave that supposedly well-informed people were all telling them was already coming.
For technology companies that stand to profit by disrupting higher education, treating the transition to an online future as a fait accompli serves as a very effective business strategy. By continually reinforcing the idea that traditional higher education is way behind the times, they gather public support for costly online initiatives that might not otherwise go forward. Equally importantly, this kind of rhetoric infects faculty with a sense of learned helplessness. Why try to fight the inevitable when we have so much else to worry about in our busy lives already?
Personally, I go back and forth between optimism and despair about the future of my profession. Sometimes I think that enough support exists on enough campuses that the kind of teaching I do now will persist well past my retirement because students will still value the personal touch that proximity makes possible. Sometimes I feel like I’m living inside of Frank Donoghue’s higher education classic, The Last Professors. Donoghue’s primary concern in that book was the corporate culture of the modern university. The jargon employed by U.Va. board members suggests how well the maturation of online education complements the destruction of traditions caused by that ideology in other aspects of campus life.
Perhaps my conflicted attitude toward the possibility of my own obsolescence comes from the fact that whether the Internet makes college professors a thing of the past doesn’t depend upon the professoriate. It depends upon students, and to an equal extent it depends upon society at large.
Nobody can dispute that online education has advanced far enough that it is now possible to learn a wide range of subjects at home in your pajamas through your computer. The question is whether this kind of learning will be acceptable to most students in the future, and perhaps more importantly whether it will be acceptable to the people who’ll employ them. I used to think that someone on the other end of a computer screen could never teach history as well as I can in person. I still think that’s true, but as Clayton Christensen has argued, an online education doesn’t have to be superior to the status quo in order to make my current job obsolete. The bad can drive out the good under certain circumstances, such as when the higher-quality product is too expensive for most consumers to afford.
Whether you’re an enthusiastic booster of online education or an informed skeptic like me, there is no question that faculty need to understand developments in the educational technology industry, if for no other reason than for their own self-preservation. If transformational change is indeed inevitable, faculty should assert their prerogative as teachers in order to make sure that the quality of higher education is not seriously degraded by this metamorphosis. By doing so, maybe they can carve out a place for themselves in a more efficient future. If transformational change is not inevitable, then for heaven’s sake don’t let the vultures who want to profit from picking at the corpse that was once your career destroy it without a fight.
I can tell you from personal experience that following developments in educational technology can be thoroughly exhausting. I’m sure plenty of you who’ve tried it yourselves would prefer to never encounter the word MOOC again in your entire lives. However, living in ignorance is probably the worst thing you could do. No matter how the Internet impacts higher education, we faculty need to play a role in the debate over its strengths and weaknesses for the sake of our students. If we happen to save our own jobs in the process of doing so, then that’s all for the better.