The recent announcement from the California State University System regarding its embrace of edX massive open online courses (MOOCs) is interesting and depressing at the same time. As with many aspects of the MOOC phenomenon, it comes packaged with good and bad aspects bundled up together. Instructors will offer a "special 'flipped' version of an electrical engineering course ... where students watch online lectures from Harvard and MIT at home." So the good is the flipped part because it's more interactive and dynamic and there's less lecture-based didacticism in the classroom due to watching videos at home? Really? The 1970s just called: they want their Open University courses back.
This model perhaps moves the Cal State system forward as it offers more accessibility to content for working adults in a hybrid format. I wish they would just step away from the MOOC terminology, which is, let’s be honest, copying and lending out a videotape by another name. MOOCs have been so beaten up and co-opted for self-serving ends that the original premise has been lost. As Stephen Downes, one of the forefathers of the original MOOCs, stated in a recent blog, "These arguments miss the point of the MOOC, and that point is, precisely, to make education available to people who cannot afford to pay the cost to travel to and attend these small in-person events. Having one instructor for 20-50 people is expensive, and most of the world cannot afford that cost."
The MOOC spirit has been eroded by institutions and individuals who see an easy way to sound (or just seem) tech-savvy. MOOCs are being used by many institutions to avoid actually having to discuss issues like ownership of curriculum, scalability and strategic online growth. In a (MOOC) swoosh, difficult governance issues regarding intellectual property, scalability and ownership are gone. Corrupted MOOCs circumvent the need for anything other than talking (lecture-style) to a camera with the hope that the "nice young guys and gals at CoursEdXra" drop me into a backdrop of the Parthenon and/or animate the background with pencast versions of napkin sketches. There’s no building of an online community, no facilitation of discussion threads, not even grading of papers, just, "I’m done — here’s my MOOC!"
MOOCs were originally intended to educate the Masses (M): hundreds of thousands who “cannot afford to enroll or travel to classes.” They were all Open (O): Open Content provided or supported by Saylor.org, Creative Commons and others. Now Open no longer means open resources — it has been unofficially changed to mean "open to anyone." Don’t get me wrong. Being more available to more people isn’t in itself a bad thing, but it does move the focus away from the original intent, which was to provide free, quality educational materials. The second O stands for Online — unless it’s a hybrid offered in a flipped classroom in which students have watched a video before coming to class (sigh). C = Course. Well, I guess one out of four is not bad if 10 percent retention is acceptable.
Original MOOCs (oMOOCs) were free, or at least extremely affordable, fully online, well-crafted and contained a lot of interesting pedagogy and instructional design. The target demographic was the underserved, both nationally and internationally. Per Downes, they were "not designed to serve the missions of the elite colleges and universities...." but rather "designed to undermine them, and make those missions obsolete."
Hijacked MOOCs are led by flagship institutions, are starting to cost more and more, are often hybrid, consist of a faculty headshot to camera with technological sophistication layered on, have little to zero impact on the faculty member’s revisiting (or learning) of pedagogy in any format, and are not very massive. They're mostly taken by education technologists, already-qualified individuals and Tom Friedman.
It’s the strategic analysis and "nuanced discussion" that I want to bring us all back to. Proper MOOCs may work for some, others may just choose to use open online materials and some may even have a mission to support affordable education for underserved communities (my favorite). But let’s not kid ourselves. Co-opting a MOOC label does not make an offering edgy. Get strategy and rationale nailed down first, worry about the acronym later.
Kevin Bell is the executive director for online curriculum development and deployment at Northeastern University's College of Professional Studies. This essay is adapted from a posting at the blog Aspire.
Massive open online courses (MOOCs) have captured the nation’s imagination. The notion of online classes enrolling more than 100,000 students is staggering. Companies are springing up to sponsor MOOCs, growing numbers of universities are offering them, and the rest of America’s colleges are afraid they will be left behind if they don’t.
But MOOCs alone are unlikely to reshape American higher education. When history looks back on them, they may receive no more than a footnote. However, they mark a revolution in higher education that is already occurring and which will continue.
America is shifting from a national, analog, industrial economy to a global, digital, information economy. Our social institutions, colleges and universities included, were created for the former. Today they all seem to be broken. They work less well than they once did. Through either repair or replacement — more likely a combination — they need to be refitted for a new age.
Higher education underwent this kind of evolution in the past as the United States shifted from an agricultural to an industrial economy. The classical agrarian college, imported from 17th-century England with a curriculum rooted in the Middle Ages, was established to educate a learned clergy to govern the colonies. This model held sway until the early 19th century.
In the years before the Civil War, the gap between colleges and society grew larger. European higher education modernized, creating models that would inspire America to grow our own. Innovations, mostly small, were attempted; many failed. During and after the war, the scale of experimentation increased with the founding of universities such as Cornell University, Johns Hopkins University and the University of Chicago a few decades later. Other institutions, such as Harvard University, remade themselves. The innovations spread. By the mid-20th century a new model of higher education for an industrial era coalesced. It was codified in California’s 1960 master plan, balancing selectivity with access and workforce development.
This transition brought new institutions that better met the needs of an industrializing America.
An entity called the university was imported from Germany, with what would become a mission of teaching, research and service. It offered instruction in professions essential to an industrial society, organized knowledge into relevant specialties, and hired expert faculty in those areas. It not only transmitted the knowledge of the past, but advanced the frontiers of knowledge for the future.
The federal government created the land-grant college to bridge the old and the emerging worlds: agrarian and industrial America. Now found in all 50 states, the land-grant college was designed to provide instruction in agriculture and the mechanic arts without excluding classical studies.
Specialized institutions emerged. Some, like the Massachusetts Institute of Technology, were modeled on the European polytechnics; they promoted industrial science and technology and prepared leaders in these fields. Others, the normal schools, sought to provide more and better teachers as the evolving economy demanded more education of its citizenry.
The two-year college — originally called a junior college, later a community college, sometimes Democracy’s College — was initially established to offer lower-division undergraduate education in the local community.
As these institutions emerged, the curriculum changed. Graduate studies were introduced. New professional schools in fields like engineering, business and education became staples. Continuing education and correspondence courses were added. Elective courses and majors arose. Disputation, recitation, and memorization, the teaching methods of the agrarian college, gave way to lectures, seminars, and laboratories.
The colleges that persisted adopted many of the era’s changes, and the classical curriculum largely disappeared.
This is the history of higher education in America. Change has occurred by accretion. The new has been added to the old and the old, over time, modernized. Change occurs with no grand vision of the system that the future will require. New ideas are tried; some succeed, many fail. By successive approximations, what emerges is the higher education system necessary to serve the evolved society.
Social change is a constant, and so is the need for higher education to adapt to it. When the change in society is deleterious, as in the McCarthy era, it is the responsibility of higher education to resist it and right the society. It is a natural process, almost like a dance. However, in times of massive social change like the transformation of America to an information economy, a commensurate transformation on the part of higher education is required.
We are witnessing precisely that today. MOOCs, like the university itself or graduate education or technology institutes, are one element of the change. They may or may not persist or be recognizable in the future that unfolds.
What does seem probable is this. As in the industrial era, the primary changes in higher education are unlikely to occur from within. Some institutions will certainly transform themselves as Harvard did after the Civil War, but the boldest innovations are likelier to come from outside or from the periphery of existing higher education, unencumbered by the need to slough off current practice. They may be not-for-profits, for-profits or hybrids. Names like Western Governors University, Coursera, and Udacity leap to mind.
We are likely to see one or more new types of institution emerge. As each economic and technological revolution creates new needs for higher education, unique institutions emerge to meet them. In the agrarian era, only a tiny percentage of the population needed higher education, and the college served these elite few. When industrial America required more education, more research, and mass access to college, two major institutions were established: the university and the community college.
The information economy, which requires a more educated population than ever before in history, will seek universal postsecondary education and is likely to create new institutions to establish college access for all at low cost. These institutions will operate globally, not locally, which will dictate a digital format. Because information economies emphasize time-variable, common outcomes — unlike the industrial era’s common processes and fixed times (think assembly lines) — universal-access institutions will offer individualized, time-variable instruction, rooted in mastery of explicit learning outcomes. Degrees and credits are likely to give way to competency certification and badges.
Traditional higher education institutions — universities and colleges — will continue, evolving as did their colonial predecessors. Their numbers will likely decline. At greatest risk will be regional, part-time commuter universities and less-selective, low-endowment private colleges, particularly in New England, the Mid-Atlantic, and the Midwest. The future of the community college and its relationship to the universal-access university is a question mark. It is possible that sprawling campuses will shed real estate in favor of more online programs, more compact learning centers and closer connections with employers and other higher education units.
In this era of change, traditional higher education — often criticized for being low in productivity, being high in cost, and making limited use of technology — will be under enormous pressure to change.
Policy makers and investors are among those forces outside of education bringing that pressure to bear. It’s time for higher education to be equally aware and responsive.
Arthur Levine, a former president of Teachers College, Columbia University, is president of the Woodrow Wilson National Fellowship Foundation.
There’s a legendary story about Anne Sexton’s learning how to write a sonnet by watching I. A. Richards’s educational-television series in the late fifties. I’ve thought about that fairly often while reading the daily stories on MOOCs. In the Sexton/Richards instance, there was a fortuitous electronic meeting of an excellent teacher who saw possibilities in the then “new” technology of television and a motivated student who was ready to write as if -- and according to her this was indeed the case -- her life depended on it.
That hyperbolic tone of the last sentence above -- a tone that readers of Sexton’s later poems and interviews are already familiar with -- is also the tone of a good many declarations about MOOCs.
Thomas Friedman’s latest column “The Professors’ Big Stage” is a case in point. His piece on “the MOOCs revolution” is riddled with contradictions, shallow thinking -- and an error in basic arithmetic.
Friedman begins by excitedly informing us that he’s just returned from a “great conference” sponsored by M.I.T. and Harvard on “Online Learning and the Future of Residential Education.” He doesn’t explain why he had to attend in person, or question why the conference wasn’t online, but he adds his own title, “How can colleges charge $50,000 a year if my kid can learn it all free from massive open online courses?" That premise, it soon becomes clear, is moot.
As Friedman goes on to extol the virtues of using MOOCs as supplements for traditional courses and programs, MOOCs then become an example of preliminary programmed learning -- the sort of thing that community colleges have been doing in terms of remedial aid for quite a while. Publishers like Bedford/St. Martin’s have offered online drills for years. And if the MOOC is tied to an accredited college’s course, then Junior and his dad are still paying for Junior’s education.
According to Friedman, students enrolled in a hybrid course at San Jose State, which combines M.I.T.’s introductory online Circuits and Electronics course with traditional in-seat class time, have done quite well: “Preliminary numbers indicate that those passing the class went from nearly 60 percent to about 90 percent.” There’s even better news for the students involved in that course than Friedman’s assessment: he sees the improvement as one-third; in fact, a jump from 60 percent to 90 percent means the number of students passing the class increased by one-half, or 50 percent.
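To spell out the arithmetic behind that correction (a sketch using the column's rounded figures of 60 and 90 percent):

```latex
\underbrace{\frac{90 - 60}{60}}_{\text{relative to the starting rate}} = \frac{30}{60} = 50\%
\qquad\text{versus}\qquad
\underbrace{\frac{90 - 60}{90}}_{\text{relative to the final rate}} = \frac{30}{90} = \frac{1}{3}
```

Measured against the starting pass rate, as percent change conventionally is, the gain is one-half; "one-third" results only from dividing the gain by the final rate instead.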
We should note that this is an argument for remedial preparation and/or immersion in a subject -- not necessarily an argument for online versus in-seat instruction.
And that, of course, is just one class. Friedman sees MOOCs as going far “beyond the current system of information and delivery -- the professorial ‘sage on the stage’ and students taking notes, followed by a superficial assessment.” This description not only fails to describe adequately the current system but also ironically illuminates some of the biggest problems with MOOCs. Given the scale of MOOC courses, the only kinds of student assessment that can be accomplished are superficial. And we will have to hope that some enrolled students, unlike Friedman, still believe in note taking. The MOOC lecture system, however, puts that sage right back on the stage -- as Friedman’s very title for his op-ed indicates.
Moreover, his discussion of Michael Sandel, the Harvard professor whose Justice course will have its American debut on March 12 as the first humanities offering on the M.I.T./Harvard edX online learning platform, focuses not on aspects of the course but on Sandel’s old-fashioned appearances on the lecture circuit.
Sandel, whose course has been translated into Korean and shown on national South Korean television, recently traveled to Seoul (again, why?), where he lectured “in an outdoor amphitheater to 14,000 people, with audience participation.” There was no indication as to how long the Q&A session ran.
Academicians often fall prey to magical thinking; at my former college, each time we hired a new provost (10 in my 16 years), we were certain that this was the one who would be our savior.
Each time we created a new central curriculum (three in my 16 years; the final stage just before I left was to exempt adult students from completion of the college’s core requirements), we were certain that this was the answer. Smaller, struggling colleges may see offering licensed supersized online courses as cost-saving -- an escape from their current situation, in which every small school worries about going online or going bust.
Many of these colleges turned to creating their own individual online courses -- already being referred to as “traditional online courses” -- as a solution, only to find that the expenses have outweighed the successes: they are costly in terms of faculty training, serve very small audiences (often sitting only a building or two away), and put severe strain on IT departments.
Online consortiums in which struggling schools have banded together have also proved to be problematic; I am thinking in particular of one class that I was asked to review for my former college, which was a member of such a consortium: an accelerated multi-genre writing class, which asked students to write one poem, one short story, and one essay over a period of five weeks. The "final project" consisted of one additional work, in the students' choice of genre. It was thus possible to fulfill 50 percent of the course requirements with two haiku.
MOOCs, of course, have their ur-versions, which include not only Henry Ford’s production line and the rise of fast food but also massive online delivery experiments in the mid-1990s, online remedial drills, large introductory-course in-seat lectures, Sunrise Semester, the Great Lecture Series, and the 19th-century lecture. And possibly there was someone who asked Harvard for credit for attending Thoreau’s lecture on “Society” -- or for attending a lecture by P. T. Barnum.
Friedman does note, near the end of his exhortatory column, that “We still need more research on what works.”
Indeed. Along with the return of the sage on the stage, this newest educational/industrialized development has brought along with it -- no surprise to anyone who has taught a traditional online class, a class with online components, or a traditional in-seat class -- some old concerns: problems with technology; problems with underprepared and unmotivated students; problems with class participation in discussions (one sage walked off the stage); and concerns about retention and plagiarism.
Assessment will continue to be one of the biggest concerns: both assessment of the overall course and assessment of any student work that goes beyond the level of a drill. Financial issues will come into play, as will workforce issues. Hierarchical divides among students, faculty members, and institutions will not disappear.
Finally, there is a dynamic in a traditional classroom that MOOCs simply can’t provide. In small, in-seat courses and workshops, students discover that they are part of a community, in which each person has a responsibility to contribute and the reward of personal interaction. Such courses allow for flexibility, Socratic questioning, and serendipity. Face-to-face meetings and small-group dynamics are important parts of education and socialization. And they provide an essential break for students from their hours of online gaming, posting and browsing.
One other analogy that comes up in discussions of MOOCs is “correspondence course.” It’s considered a dirty term, and yet, it may be an accurate description as thousands of students and piecework adjuncts labor at their solitary tasks.
And there may be something to be learned from a fictional account of a correspondence school: J. D. Salinger’s “De Daumier-Smith’s Blue Period.” The alienated protagonist concludes that “We are all nuns” -- working silently, separately, seeking salvation.
Carolyn Foster Segal is a professor emeritus of English at Cedar Crest College. She currently teaches at Muhlenberg College.
The top of the annual performance review form at my university has a blank space for us to list any additional education we obtained during the previous year. I’ve never filled that space in before, but that will change in my review for 2012 because I spent part of my sabbatical last fall as a student in a massive open online course (or MOOC).
I'm an American historian by training, but ever since I left graduate school a global perspective has become increasingly important for historians of all kinds. That’s why I decided to get some free professional development in world history, courtesy of Coursera. I learned a lot of interesting and useful specific factual information from the MOOC instructor (or superprofessor, as the lingo goes) that has already helped me become a better teacher and scholar.
But I didn’t just listen to the lectures. Like any other student (since that’s what I was), I also wrote out all the assignments and helped grade papers written by my peers in class. This peer grading process differs from peer evaluation (which I use in class all the time) since students not only read each other’s work, they assign grades that the course professor never sees. Professors in the trenches tend to hold their monopoly on evaluating their students’ work dearly, since it helps them control the classroom better by reinforcing their power and expertise. On the other hand, superprofessors (and the MOOC providers they teach for) have begun to experiment with having students grade other students out of necessity, since no single instructor could ever hope to grade assignments from tens of thousands of students by him or herself.
With MOOCs in their infancy, few precedents exist for designing online peer grading arrangements for humanities courses. For this reason, I don’t intend to criticize my superprofessor’s choices here. However, I do have to describe some of the peer grading process from my class in order for my critique of peer grading in general to make sense. All students in the MOOC were supposed to write six essays between the start of the course and its end. For each assignment, we could choose one of three single-sentence questions to answer in 750 to 1,000 words. The week after we submitted those essays, we were supposed to grade the essays of five of our peers with respect to their argument, evidence and exposition, and leave comments. If you didn’t grade the essays your peers wrote, you didn’t get to see the grade you earned.
With respect to the grades I earned, I think my peers graded my essays just right. The grading scale in our MOOC went from zero to three. When I already knew a fair bit about the topic of the question that I answered or I tried very hard to write the best essay I could, I earned mostly threes from my peers. When I didn’t try very hard, I tended to get twos. While I listened to all my superprofessor’s lectures fairly closely, I never read the recommended textbook, which also undoubtedly hurt my scores.
For me at least, the primary problem with peer grading lay in the comments. While I received five comments on my first essay, for every subsequent essay I received number grades with no comments from a minimum of two peers and as many as four. In one case, I got no peer grades whatsoever. That meant that the only student who evaluated my essay was me. Every time I did get a comment, no peer ever wrote more than three sentences. And why should they? Comments were anonymous so the hardest part of the evaluative obligation lacked adequate incentive and accountability.
I read in The New York Times a few weeks ago that a study had begun to examine whether peer grades would match the grades assigned by professors and teaching assistants in one sociology MOOC. While that would prove an impressive feat if true, it would in no way validate the process of peer grading. Learning, as any humanities professor knows, comes not through the process of grades but through the process of students reading comments about why they got the grades they got. That’s how students find out how to do better next time.
To be fair, the course included a good set of instructions about how to grade a history essay linked from the course homepage. Unfortunately, there was no way for the superprofessor to force students to read those instructions, and due to the inevitable pressure to cover as much world history as possible, he never discussed how to grade in any of the class lectures. How could he? Good grading technique is difficult enough for graduate students to learn. Because of the size of the course I think I can safely assume that many of my fellow MOOC students inevitably had no history background at all, yet the peer grading structure forced them to evaluate whether other students were actually doing history right.
The implicit assumption of any peer grading arrangement is that students with minimal direction can do what humanities professors get paid to do and I think that’s the fatal flaw of these arrangements. This assumption not only undermines the authority of professors everywhere; it suggests that the only important part of college instruction is the content that professors transmit to their students. How many of the books you read in college can you even name, let alone describe? It’s the skills you learn in college that matter, not the specific details in any particular class, particularly those outside the major.
Over the course of my career, I have increasingly begun to spend much more time in class teaching skills than I do content. Some of this has been a reaction to encountering students who do not seem as prepared for reading or writing college-level material as the students I had back when I started teaching. However, I have also come to believe that teaching these skills is much more important than teaching any particular historical fact. After all, it really is possible to Google nearly anything these days.
Certainly good students can do a good job grading peer essays and I got a few short but insightful comments on the papers I wrote for my MOOC. Even if all of my comments had been less than helpful, I didn’t come into the MOOC process seeking to improve my writing skills. I wanted to learn new information, and many other students who engaged the material the same way that I did probably felt the same way.
Students like me won’t be the ones who’ll suffer because of peer grading. Its victims will be the future students who take MOOCs to earn college credit at increasingly cash-strapped universities. Who will teach them how to write well? Who will monitor their progress through the peer grading assignments? Who will help them understand that history is as much about argument as it is about facts or that literature can be appreciated on multiple levels? While other students can certainly teach other students some things, they can never teach students everything that a living breathing professor can.
Education startups like Coursera are experimenting with peer grading not because it is the best way for students to learn history or English, but because it is the only way that the MOOC machine can ever run itself in a humanities course. If MOOCs incurred high labor costs the same way that colleges do, those startups would never be able to extract a profit from those classes. While that’s a legitimate concern for Coursera’s venture capital investors, everyone else in academia – even the superprofessors – should give more weight to purely educational concerns.