At 30 years old, I definitely consider myself part of the Facebook generation. Zuckerberg’s brainchild hit the ‘net when I was a senior in college, and by then I was already well acquainted with e-mail, chat rooms, text-messaging, and all the multifarious precursors to today’s social media. I text, I post, I chat, I even snapchat: in these respects, I’m an utterly unremarkable member of my society.
But I also happen to be a college professor and a molder of young minds. And, far from indulging the technology-driven spirit of the times, I make my students work as students have always worked. They read Seneca, Pascal, Tolstoy, and Schopenhauer. They are obliged to turn in papers by hand; they must come to office hours to speak with me about their grades; they are even, and this is most anachronistic of all, required to attend class. Physical presence is key to every aspect of their learning experience, be it my hovering, breathing presence in the classroom or the office, the cohort of 30 or so warm bodies that shows up for lecture twice a week, or the more abstract form of embodiment conveyed by the weight of a book.
If certain commentators are to be believed, however, this embodied notion of learning is on its way out in American higher education. Writing for The American Interest’s January/February 2013 edition, the recent Yale graduate Nathan Harden offers the following ominous prognostications about the future of university instruction in our digital age:
In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.
On Harden’s account, one of the principal reasons for this portended transformation, which is already being partially implemented by such institutions as Harvard and MIT, is that the cost of college is increasingly out of proportion with its perceived economic benefit. As the American job market has become more competitive, the cost of a degree has increased, and only the most naïve of students still believe that a college education is a universally redeemable ticket to middle-class prosperity. The weighing up of costs and benefits involved in earning a college degree will lead inevitably to a re-evaluation of the current higher education model. Luxury residence halls, face-to-face interaction between professors and students, ivied brick walls -- these will all be things of the past once the much-heralded education bubble finally bursts. What will replace them are massively populated, inexpensive online courses and lectures, prerecorded by the very best lecturers and administered by those hordes of professors and other academics not quite sexy or charismatic enough to warrant virtual celebrity.
To anyone who thinks Harden’s predictions are a little too ambitious (not to mention deeply disturbing, at least for college professors who don’t fancy the idea of working in a grading factory): don’t worry -- they most likely are. What Harden forgets -- and indeed, what just about everyone prophesying the eclipse of face-to-face interaction in a virtual world forgets -- is that human beings are, above all else, bodies, and that to lead full, happy, and meaningful lives, we need other bodies. Let’s consider the following examples of how technologies of virtualization have failed to triumph over our species’ thirst for physical presence.
1. The Giant Head. Some older readers may recall a famous article in Reader’s Digest from the late 1950s featuring an illustration of a massive human head connected to minuscule arms and legs. What was the thesis of that article? The tech junkies of the time believed that in the future technology would become so advanced that human beings would no longer need to use their bodies, leading to a swelling of the brain and a shriveling of our appendages. Many also foretold a time when food supplements would replace food. Wouldn’t it be great, they asked, if instead of spending hours preparing and eating meals, we could nourish ourselves in just a few seconds? No one at the time seemed to consider that human beings might not want to do any of this — that we might enjoy using our bodies, eating, and the like. In the half-century since these predictions were made, restaurants have proliferated, and heads haven’t grown one bit.
2. Live Theater. When I was a kid, there were hardly any live theaters in my hometown of Bakersfield, Calif. Now there are about ten. Many people used to believe that movies had sounded the death knell for live theater, but today the latter enjoys just as much, if not more, prestige than it did 100 years ago. I recently had the good fortune to see Kevin Spacey’s production of Richard III. I’ll remember his performance for the rest of my life — it had never occurred to me that acting could be so visceral, so violent, so physical. How many of us can say the same thing about movies? Again, those who foretold the demise of live theater never reckoned that people might just plain like seeing living bodies move around and speak on the stage, and that no amount of special effects could compensate for the lack of real flesh and blood.
3. The Myth of Social Media. This myth holds that virtual, online or technologically mediated interactions are in the process of replacing face-to-face interactions. Most people never take the time to think about what the world would be like if this were really the case. I live in a small college town, and I can assure anyone interested in such things that student interactions on Friday and Saturday nights are plenty physical — sometimes I can hear them from across the lake! Social media does little more than provide a way of sharing information that enhances the intimacy of eventual physical contact. Anyone who doesn’t know this doesn’t understand the technology.
Of course, people like Harden will point to other sectors of the economy where technological innovation has erased thousands of jobs. People don’t need information from stockbrokers or travel agents to make decent decisions about travel or investment anymore, so why should a living, breathing professor be necessary to convey the sort of information one gets out of a college education? If that information can be distributed more cheaply thanks to virtualization, why should students be expected to bear the extra expense of classroom education?
The answer to this question is so elementary that the objection behind it is almost hard to take seriously. The truth is that education is not simply the conveying of information. In fact, it is probably only marginally that. How many people remember most of what they learned in college? Only very few, I would guess. The benefit of a classroom education is that it keeps students under a certain amount of mental pressure, forces them to think on the spot, and obliges them to explain themselves to other people who are physically present. Information is afoot in these interactions, but so are wisdom, passion, empathy, and a whole host of other viscera that only an embodied teacher or student can properly convey.
How effective, for instance, do we imagine an online church experience would be compared to the real thing? Is it reasonable to think that a virtual tour of the cathedral at Chartres would be as spiritually moving as being there? We should also consider that many students might simply enjoy the physical classroom and their interaction with peers and professors -- or at least they might recognize that they learn better under these conditions. The costs of classroom education may be soaring out of proportion at present, but this is not a verdict on the education itself.
So let’s ask -- what developments are behind these grim auguries of the collapse of America’s higher education model? Some of it undoubtedly has to do with politics. Many commentators on the right (and perhaps Harden is one of them) would likely cheer the dismantlement of a system whose values are often perceived as far left of center. If taking education online can put “tenured radicals” out of work, then why not welcome it? At the same time, however, just as many moderate and left-leaning thinkers have joined the chorus of those predicting the failure of higher education (for instance, see Thomas Friedman’s recent writings in The New York Times), and it would be simplistic to chalk this latest round of doom-peddling up to politics.
The real culprit, I suggest, is what, for lack of a better term, we might call Appleism. Innocent in principle but nefarious in practice, the doctrine of Appleism holds that increases in technological capability are synonymous with increases in human happiness. Anything that can be put on a screen is better than what can be seen with the naked eye. The passage of electrons through a cathode tube is equivalent to passage from a lower to a higher state of being. Proponents of Appleism hold out technology as an intrinsic good; they are the sorts of folks who compulsively buy the latest Apple product, simply on principle.
We can point to fiscal insolvency all we want, but one has difficulty believing that Harden’s and others’ vision of a fully or almost-fully online education is not also the product of society’s limitless fascination with virtualization. Proponents of the current craze ought to think carefully about the human costs of technology before enthusiastically proclaiming the end of a system that could leave hundreds of thousands of people without work, students cheated out of a quality education, and that would further contribute to the creation of a world where virtualization is always and everywhere, without qualification or questioning, heralded as an unequivocal good.
Louis Betty is an assistant professor of French at the University of Wisconsin-Whitewater.
If you are an enrolled student and have any questions, feedback, bug reports, suggestions, or any other issues specific to a particular class, please post on the discussion forums of that class. This is the most effective way to get heard, as both the teaching staff and Coursera staff routinely check the discussion forums, and your fellow students will also be able to answer your questions and help you resolve technical issues.
—Coursera Support Center
anyone out there know whats going on in this Beckett play?
something about death
doesn’t get to the point fast enough…like prof winkler
i literally skipped winkler’s last 3 lectures, but with 1500 students whose gonna know?
You get out of it what you put into it.
that’s what she said!
CAN SOMEONE PLEASE ANSWER MY QUESTION?
does anyone think godot’s ever gonna come?
that’s what she said!! lol
he hasnt by the end of act 1
bet he doesn’t come at all. boy, will those two bums be disappointed!
lotta waiting in this sucker.
Know how long I had to wait on line at Best Buy on Black Friday?
look this isn’t the time
Ten hours. Ten fricken hours. But I got the TV!
then you should just watch the movie version
Waiting for Godot. can’t remember who stars in it, but it’s really slow. nothing ever happens.
Because nothing ever does. it’s like real life
Okay, I’m late to this discussion, but what’s the question?
What are those two guys waiting for, and why don’t they ever get moving?
Good question! Think it’ll be on the quiz?
Doesn;t matter. We can always cheat.
I’ll pretend I didn’t hear that.
Anyway its peer review. I’m grading your quiz.
You mean Winkler doesn’t look at them?
You kidding? He’s a prof at Harvard!
but he’s up there every week, talking to us
That’s just a video. Probably made that months ago.
Maybe he’s dead.
You mean like Godot?
its not about the plot, its like existential.
That’s some help.
It’s based on Freud’s trinitarian ego, id, and superego structure, asshole.
No, it’s all about the Cold War.
It’s Plato’s allegory of the cave.
I want an answer to my question. I’m going to e-mail Prof. Winkler.
You can’t. It comes back “addressee unknown.”
Really? Thats like so existential.
That’s it. I’m outa here.
They do not move.
Who typed that? Hey! Is that Prof. Winkler? Are you monitoring this? I wanna know, I wanna know! And is this going to be on the quiz?
David Galef directs the creative writing program at Montclair State University. His latest book is the short story collection My Date With Neanderthal Woman (Dzanc Books).
Submitted by Andrew Ng on January 24, 2013 - 3:05am
Educators create online courses for the same reasons that they became teachers to begin with: to educate students, broaden their awareness of the world and thereby improve the students’ lives. And with massive open online courses (MOOCs), educators can now reach many more students at a time. But MOOCs offer many other benefits to the education community, including providing valuable lessons to the instructors who teach them.
Online courses inherently allow students to create their own pathways through the material, which forces educators to think about the content in new ways. And MOOCs offer professors fresh opportunities to observe how their peers teach, learn from one another’s successes and failures and swap tactics to keep students engaged. This, in turn, makes them better teachers.
MOOCs are still the wild west of higher education, and there is no “one size fits all” approach to building one. At Coursera, we’ve been working with educators as they experiment with designing courses for this new format, and for a student body of unprecedented proportions. (For example, Duke University’s Think Again: How to Reason and Argue by Walter Sinnott-Armstrong and Ram Neta has more than 180,000 enrolled students.) We’re reimagining many aspects of what it means to teach a course, ranging from lecture delivery, to assignments, to strategies for engaging the online community of students.
While there are many resources for teachers to learn from when approaching online education, we’ve become aware that there is still a need for a central space for professors to share successful practices, ask each other questions, and showcase examples of what’s worked and what hasn’t in their online classes. Recently, we launched a course called Teaching a MOOC, open to all of the professors on the Coursera platform (we’ll be launching a free, public version soon). It functions like any of the courses we offer, including video lectures that offer guidelines for developing an online course for the first time, discussion forums and a gallery where professors can see examples from other classes. And that’s just the tip of the iceberg.
An educator who’s been teaching in a traditional classroom format faces many challenges and unknowns when creating an online course. The lecture creation process is different. The peer-graded homework is different. The process for managing your “classroom” is different. Even the copyright law requirements are different. Jeremy Adelman, a Princeton University professor who teaches A History of the World Since 1300, explains, “When you lecture into a recording box, it’s different from lecturing to students in person. I have a teaching style that relies on energy from students, and I had to figure out strategies that would transcend [that style] for my class on Coursera.”
Adelman discovered that in putting his course online, he became more focused on what students are experiencing, even though he wasn’t in direct contact with them. “When I lectured, I had to ask myself at all times ‘What is it that I want my students to learn?’ In the old-fashioned lecture hall I was an entertainer, more self-focused rather than teaching-focused, but I was not conscious of this dynamic until I put a course online for the first time,” he says. “For me, the lectures alone were a source of continuous learning and adaptation.”
Throughout the entire MOOC creation process, educators must constantly be student-focused, figuring out what is the most useful content for their students to experience next. With no admissions office filtering who enrolls, online students are vastly more diverse than the students in a typical college classroom. They vary in educational background, learning ability, and culture. Students are also at different points in their lives, ranging from teenagers to working professionals to retirees, and may have different learning goals. Educators have to make classes accessible without underestimating student ability.
Stanford professor Scott Klemmer was pleasantly surprised by his experience teaching a Human-Computer Interaction course. His class was the first to use peer grading (in fact, he worked with Daphne Koller and me to design Coursera’s current peer assessment system). After using self-assessment for six years in his class at Stanford, he thought there was “no way” that he could expect students to handle self- and peer-assessment online.
“But it worked amazingly well,” Klemmer explains. “When we surveyed students at the end of class, one of the things they rated highest, in terms of what taught them the most, was the act of assessing peers -- they found it extremely valuable. I put a huge amount of time into designing course materials based on rubrics and assessment techniques that I taught in my Stanford class on campus; I had no idea what it would mean to translate that into the online world.”
There has always been a tendency in distance education to focus on the physical barriers -- the distance between the professors and the students, and between the students themselves. Many people, including those in academia, believe there to be a broadcast quality to online lectures, with one person delivering lectures to students behind screens, where they can’t engage directly with the professor. They wonder, “If the professors don’t see their students, how can it be teaching?”
But through today's technological advancements, online courses are very much alive. They are part of an ecosystem that, if nurtured through community discussion forums, meetups, e-mails, and social media (like Google+ hangouts), can flourish and grow. This allows each class’s community to take on a life of its own, with a distinct culture that’s defined at least as much by the students as the instructor, and which even skillful instructors can only guide, but not control. Nearly every instructor that I’ve spoken to has been surprised by the deep desire of students to connect with each other as well as with the teaching staff and professor.
University of Michigan professor Eric S. Rabkin found his experience teaching Fantasy and Science Fiction: The Human Mind, Our Modern World incredibly enriching. “I had not anticipated the kindness and excitement I see in this large body of participants. Despite the potential for impersonality, I have received emails of thanks, of enthusiasm, of discovery. I have replied to some of those and some of my replies have been re-posted to the forums by the recipients. The community knows I care and, at first astonishingly to me, cares back. They care enough not only to spend time with each other but to share their experiences, some even through blogs of their own, with the wider world,” he says. “Amazingly, this feels somehow like a family. Not like a nuclear family, but like a suddenly discovered distant city brimming with eager cousins one had never known before.”
“I have been [teaching] the same way for years -- for decades and decades -- without being mindful of the changes in technology, the changes in our students. Online courses blow up the old conventions. But I think it will take us a while to figure out what works and what doesn’t work,” says Princeton professor Jeremy Adelman.
University of Pennsylvania professor Al Filreis, who teaches Modern & Contemporary American Poetry, says that teaching online has given him his “most extraordinary pedagogical experience” in 30 years of teaching. “The course is rigorous and fast-paced, and the material is difficult, but the spirit of curiosity and investigation among the students produced very good results,” he says. “Several eminent poetry critics joined the course to rate the quality of the students' critical writing and came away very impressed -- and surprised. We discovered that a qualitative, interactive humanities course can indeed work in the MOOC format."
With MOOCs, there is so much more potential for educators to go into each other’s classrooms and share resources with their peers. We’re seeing this happen more and more, especially when it comes to professors adapting online course structures from other professors.
“Online education means that I have shared more stories with fellow professors about teaching than I had in the eight years I’ve spent teaching on campus,” says Stanford professor Scott Klemmer.
We might not have an answer to the question “What defines a high-impact MOOC?” just yet, but universities and professors who have taken the plunge are constantly learning and growing from their experiences. And what we’re seeing emerge from the trenches is an exciting new breed of education.
Andrew Ng is a co-founder of Coursera and a computer science faculty member at Stanford University. He is also director of the Stanford Artificial Intelligence Lab, the main AI research organization at Stanford. In 2011, he led the development of Stanford University's main MOOC platform, and also taught an online machine learning class that was offered to over 100,000 students, leading to the founding of Coursera. Ng's goal is to give everyone in the world access to a high quality education, for free. His Twitter handle is @AndrewYNg.
The rush toward the creation of massive open online courses (MOOCs) is catching on in higher education like wildfire. All it takes, it seems, is to wave a bit of money around, talk up the brave new world of technological innovation, bash the “failed” world of higher education as we know it, and the privatization troops have administrators in a fit of unexamined, swooning technophilia. These “courses,” however, in addition to offering false promises, also undermine shared governance, run roughshod over established curriculum development procedures and move colleges toward the era of “teacherless classrooms,” which destroy the academic integrity of our institutions and demean the value of the education our students receive.
MOOCs are designed to impose, not improved learning, but a new business model on higher education, which opens the door for wide-scale profiteering. Public institutions of higher education then become shells for private interests who will offer small grants on the front end and reap larger profits on the back end.
At present, MOOCs are being proposed as solutions to enrollment shortages, among other things, in open-access institutions such as community colleges. The MOOC crowd promises cost savings, efficiency, improved access and the answer to our “completion” woes. The concern as voiced by Arne Duncan himself is that in our quest to increase completion, maintain quality and save money: “The last thing we want to do is hand out paper that doesn’t mean anything.” Wethinks he doth protest too much.
And that’s the big lie behind this allegedly noble quest to provide much broader access to higher education and improve student learning. There is not a bit of proof that MOOCs will do so in any meaningful way. The notion is to turn community colleges into Petri dishes for MOOC experiments, principled objections be damned. There are costs to cut in the public sector and dollars to be made in the private sector.
The much-hyped arrival of MOOCs has been made possible by the Bill and Melinda Gates Foundation and a host of the usual corporate education reform suspects, who have long been involved in a full-court press propaganda campaign for their venture/vulture philanthropy.
Some of these interests are trying to figure out schemes for monetizing MOOCs in such a way that the small percentages of students passing MOOCs in cyberspace would pay institutions for certificates of competency awarded for completion of prescribed course regimens. Indeed, colleges and universities conceivably might even cash in further by recommending the most successful students to corporate interests … for fees.
Critics, meanwhile, are easily dismissed as part of the corrupt old world of failed higher education, troglodytes as afraid of this bold new magic as cavemen were of fire. And to the consternation of the self-proclaimed “change agents,” those reactionary faculty shielded by union contracts and powerful academic senates stubbornly resist the next new wave. Never are the implications of the MOOC offerings — typically announced with fanfare — outlined with respect to faculty and classified staff workloads (e.g., registering students and setting up and maintaining the technology infrastructure for individual course sections which, in some cases, have enrollments in excess of hundreds or even tens of thousands of students). Students will grade each other, or course work will be evaluated through word recognition computer software programs. Faculty, the promoters tirelessly stress, just must stop lecturing, instead becoming “facilitators” for student engagement in experiential education. And what of the student support services? Or, perhaps, in this idyllic (or should that be dystopic?) educational space, those needing support are just left out in the cold, after corporate partners first have made their millions through software sales.
In the San Diego Community College District we have dared to step in front of the vaunted train of progress that many of us see as nothing more than a repackaged Taylorism for academia. The San Diego City College Academic Senate recently passed a resolution decrying the move toward MOOCs. The resolution followed on the heels of a faculty presentation at the San Diego Community College (SDCCD) Board of Trustees meeting in response to administrative attempts to circumvent the departmental and collegewide shared governance process so as to rush through grant applications for MOOCs at both City and its sister college, San Diego Mesa College, before any campuswide discussion had occurred. This resulted in Chancellor Constance Carroll declaring a one-year moratorium on MOOCs in the SDCCD while a task force investigates the appropriateness of this new form of instruction for our district.
In our view, the central philosophical flaw in the MOOC paradigm is that proponents believe that there is nothing to be lost in turning professors into glorified tutors, parts of a larger information delivery system. What this misses is the key fact that the heart of what we do as college educators has to do with the immeasurable human interaction that we have with our students and the vital social experience of the face-to-face classrooms. This is something that simply can never be reproduced by a new technology, no matter how advanced.
Demoting professors to the level of information delivery systems may be gratifying to our detractors and financially attractive to bean-counters but it won’t improve education in the process of “transforming it”; it will degrade it. But to the academic Taylorists, who don’t believe in anything that can’t be quantitatively measured, this kind of thinking is destined for the dustbin of history.
No doubt the brave new world of MOOCs will give lots of people who can’t go to Harvard access to “Harvard,” but it won’t be the same. Indeed, the future of higher education will be less egalitarian and far more two-tiered, with the sons and daughters of the elite still receiving real top-quality educations while other folks will get something different, quick, cheap and easy.
But this tale of two futures is perfectly in line with the thinking of the plutocrats who brought us the “productivity revolution” in the business world. There they got a smaller number of American workers to labor longer hours for the same money and fewer benefits while increasing productivity and bringing record profits for those at the top. In the realm of higher education, they can blame the colleges for the fact that fewer graduates are prepared for employment in the austere marketplace that they fostered while milking our schools for profit and transforming them to their purposes at the same time. It’s nice work if you can get it.
In the meantime, our job as professors, according to the dictates of the emboldened technocrats, is to become rope-makers for our own professional hangings. The debate here is not really one about technology and higher education, as most of us know that online education is now a permanent part of the educational landscape with legitimate uses. No, what this MOOC debate is about is whether we blithely open the door to the gutting of what is most precious about what we do.
If the unthinking technophilia and new Taylorism which MOOCs represent ends up killing face-to-face education as we know it, it won’t be because the technology offers a superior form of education. It will be because our visionless political and educational leaders have almost entirely abandoned educational values for market values. As many scholars have noted, in the era of neoliberalism we have just about given up on the notion of education as a public good rather than a mere commodity. Let’s hope we don’t allow this near-total triumph of market values to destroy one of the last public spaces in our society not completely determined by greed and instrumentalism. As opposed to the creed of the forces of privatization, we believe that there are still things whose value cannot be determined by the market and that education in a democratic society should be much more than an instrument of our economic system.
Six community college faculty members
Jennifer Cost is chair of the English department at Mesa College.
Jim Miller is professor of English at San Diego City College.
Jonathan McLeod is professor of history at San Diego Mesa College.
Marie St. George is professor of psychology at San Diego City College.
Peter Haro is president of San Diego City College Academic Senate.
Jim Mahler is president of the American Federation of Teachers for the San Diego and Grossmont–Cuyamaca Community College Districts.
“The best vision is peripheral vision.” – Nicholas Negroponte
Back when I was a carefree grad student, some 20 years ago, I decided to write a dissertation about apocalyptic discourse. The millennium was then looming on the horizon like a Mayan baktun or a disruptive innovation, threatening to bring about the end of the world as we knew it. In one corner stood those who confidently predicted that a comprehensive desolation would be visited upon everything that we’d once held sacred, and in the other corner stood those for whom the clanging bell of the millennium would most certainly signal the restoration of a profound peace and a lasting illumination.
Who could resist stepping into a fight like that?
Not only did it sound like fun to tell them they were both wrong, but it also seemed fundamentally true. I felt pretty confident, for a start, that the world wouldn’t really end.
I feel much the same today when I confront yet another breathless news story about whatever latest innovation (hint: it’s always a MOOC) is going to change higher education utterly, for better or worse, full stop. Except it doesn’t, and they never do – film didn’t change education utterly, television didn’t, the computer didn’t, and the Internet hasn’t. After all, I write this to you from the cozy corner of a major urban research university; the old ways are still very much with us, even as we make way for the new.
While I don’t want to lay the blame for all of this apocalyptic rhetoric at the doorstep of Clayton Christensen, I would still like to have a word.
Don’t get me wrong, I’m a longtime fan of online learning (as well as classroom learning; I’ve studied and taught in both environments, and each has its strengths). I also like change as much as the next person – whether it’s the sun coming out in December or a shiny new iPhone. All of these things have something to recommend them. But do any of them portend the end of time? I don’t think so.
I read Christensen’s Innovator’s Dilemma with real interest back in 1997 when I was an IT market analyst. I even reviewed it for a trade newsletter. I think his theories of “sustaining innovation” and “disruptive innovation” offer interesting tools for understanding how a variety of industries have evolved at particular moments in the past.
Whether these theories can be used to predict the future, however – well, I’m not so convinced.
For example, I was surprised to read the following prediction from Christensen’s Disrupting Class in 2008: “by 2019, about 50 percent of high school courses will be delivered online.” I already knew back then that the penetration of online learning in higher education was nowhere near that level, and I also knew that higher education was far ahead of high schools in its experimentation with online learning. While Christensen and his co-authors had some interesting mathematical models to support their prediction, common sense and recent history suggested that the revolution probably wouldn’t come as early as they anticipated. In fact, if the prediction turns out to be true in six years’ time, I’ll eat this article.
My real beef, though, isn’t with Christensen. He’s a smart man, even if he’s still something short of a prophet. My real problem is with the acolytes – and I urge any young readers out there considering a dissertation of their own on apocalyptic discourse to keep an eye out for these types. Acolytes tend toward reductivism, simplification, and speaking very loudly. And often they mangle the prophet’s core message in the process, occasionally even inverting it.
Take disruptive innovation. Please.
In higher education, at least judging by the recent conferences I’ve attended, far too many people have come away from Christensen’s work (or whatever second- and third-order echoes of it they’ve picked up from the media) with the idea that, if we all try, we can simply disrupt ourselves. And that way, nobody has to lose a job or a research grant or move back to the home office.
Among this strand of believers, “disruptive” innovation appears to be synonymous with “cool” innovation, or even simply “change” – or even, more simply, “the status quo.”
What these believers forget, or perhaps never knew, is that Christensen uses the concept of disruptive innovation to describe how the giant company is so often killed by the little guy with the slingshot – a slingshot that just happens to be cruder, easier to use, less expensive, and more attractive to a heretofore unengaged set of new consumers than the giant’s weapon of choice. In other words, if genuinely disruptive innovation does occur within higher education, traditional universities are much more likely to play the role of the giant than the innovator.
And yet some of the more attentive readers of Christensen’s work have taken to heart his hopeful message that the only way for incumbent leaders to survive these market disruptions is to create new and separate business units of their own, free from the profit pressures and growth strategies of the core business, and allow them to break all the rules en route to coming up with “the new, new thing” that will prove to be the true category-killer. If any institutions within higher education succeed at disrupting themselves, it may be the few that have adopted this model. But for those institutions working desperately to preserve the rules and still somehow survive in a dynamic market, the ending may not be the one they expect.
In the months ahead, I’d like to use this column to reframe and refocus the conversation about innovation in higher education. All of this talk about disruption has become a distraction – an apocalyptic tic. It reminds me of that great line from Tolstoy: “He in his madness prays for storms, and dreams that storms will bring him peace.” Let’s leave that aside for the time being and look at productive change in higher education from a different vantage point.
“The best vision,” Nicholas Negroponte likes to say, “is peripheral vision.” New ideas are out there – in the margins, away from the main frames of reference. Their immediate effects may be small or local in scale, but they can gradually introduce meaningful improvements to mainstream practices. In this column I want to examine some of the interesting experiments taking place in the margins of our field of vision – experiments that may well inform how we refine our approach to delivering higher education going forward.
Along the way, I’d like to propose that we focus on a humbler but still worthwhile form of innovation – the kind that isn’t dependent upon hype, gadgetry, a singular eureka moment, or the game-changing end of all that came before it. I’d prefer to focus on the kind of innovation characterized by a planned and responsible approach to strategy and management, the kind that continuously seeks to transform products and services in ways that are more responsive to the evolving needs of the contemporary marketplace, and which, as a consequence, delivers enhanced benefits to all participants, whether they be students, faculty, administrators, parents, governments, or the public.
And let’s assume the world is still there with us, too.
Peter Stokes is executive director of postsecondary innovation in the College of Professional Studies at Northeastern University.
Rather than having students wait weeks for feedback on homework, an MIT professor has developed a computer program that assigns a diverse group of people to review small chunks of each student's work. MIT may use the program in MOOCs.