There’s a legendary story about how Anne Sexton learned to write a sonnet by watching I. A. Richards’s educational-television series in the late fifties. I’ve thought about that fairly often while reading the daily stories on MOOCs. In the Sexton/Richards instance, there was a fortuitous electronic meeting of an excellent teacher who saw possibilities in the then “new” technology of television and a motivated student who was ready to write as if -- and according to her this was indeed the case -- her life depended on it.
That hyperbolic tone of the last sentence above -- a tone that readers of Sexton’s later poems and interviews are already familiar with -- is also the tone of a good many declarations about MOOCs.
Thomas Friedman’s latest column, “The Professors’ Big Stage,” is a case in point. His piece on “the MOOCs revolution” is riddled with contradictions, shallow thinking -- and an error in basic arithmetic.
Friedman begins by excitedly informing us that he’s just returned from a “great conference” sponsored by M.I.T. and Harvard on “Online Learning and the Future of Residential Education.” He doesn’t explain why he had to attend in person, or question why the conference wasn’t online, but he adds his own title, “How can colleges charge $50,000 a year if my kid can learn it all free from massive open online courses?” That premise, it soon becomes clear, is moot.
As Friedman goes on to extol the virtues of using MOOCs as supplements for traditional courses and programs, MOOCs then become an example of preliminary programmed learning -- the sort of thing that community colleges have been doing as remedial aid for quite a while. Publishers like Bedford/St. Martin’s have offered online drills for years. And if the MOOC is tied to an accredited college’s course, then Junior and his dad are still paying for Junior’s education.
According to Friedman, students enrolled in a hybrid course at San Jose State, which combines M.I.T.’s introductory online Circuits and Electronics course with traditional in-seat class time, have done quite well: “Preliminary numbers indicate that those passing the class went from nearly 60 percent to about 90 percent.” There’s even better news for the students involved in that course than Friedman’s assessment suggests: he sees the improvement as one-third; in fact, a jump from 60 percent to 90 percent means the number of students passing the class increased by one-half of the original rate, or 50 percent.
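The arithmetic is easy to get backward, so a quick sketch may help (plain Python; the 60 and 90 percent figures come from Friedman’s column, while the function name and the guess about where his smaller figure comes from are mine):

```python
def relative_increase(old_rate, new_rate):
    """Change in a rate, measured against the starting value."""
    return (new_rate - old_rate) / old_rate

# San Jose State hybrid course: pass rate rose from about 60 to about 90 percent.
old_pass, new_pass = 60, 90

# Measured against the original 60 percent, the 30-point gain is one-half.
print(relative_increase(old_pass, new_pass))  # 0.5

# Dividing the same 30-point gain by the *final* 90 percent yields one-third,
# which is presumably where the smaller figure comes from.
print((new_pass - old_pass) / new_pass)  # 0.3333...
```

In other words, a 30-point jump from a 60 percent baseline is a 50 percent improvement relative to where the class started, not a one-third improvement.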
We should note that this is an argument for remedial preparation and/or immersion in a subject -- not necessarily an argument for online versus in-seat instruction.
And that, of course, is just one class. Friedman sees MOOCs as going far “beyond the current system of information and delivery -- the professorial ‘sage on the stage’ and students taking notes, followed by a superficial assessment.” This description not only fails to describe adequately the current system but also ironically illuminates some of the biggest problems with MOOCs. Given the scale of MOOC courses, the only kinds of student assessment that can be accomplished are superficial. And we will have to hope that some enrolled students, unlike Friedman, still believe in note taking. The MOOC lecture system, however, puts that sage right back on the stage -- as Friedman’s very title for his op-ed indicates.
Moreover, his discussion of Michael Sandel, the Harvard professor whose Justice course will have its American debut on March 12 as the first humanities offering on the M.I.T./Harvard edX online learning platform, focuses not on aspects of the course but on Sandel’s old-fashioned appearances on the lecture circuit.
Sandel, whose course has been translated into Korean and shown on national South Korean television, recently traveled to Seoul (again, why?), where he lectured “in an outdoor amphitheater to 14,000 people, with audience participation.” There was no indication as to how long the Q&A session ran.
Academicians often fall prey to magical thinking; at my former college, each time we hired a new provost (10 in my 16 years), we were certain that this was the one who would be our savior.
Each time we created a new central curriculum (three in my 16 years; the final stage just before I left was to exempt adult students from completion of the college’s core requirements), we were certain that this was the answer. Smaller, struggling colleges may see offering licensed supersized online courses as cost-saving -- an escape from their current situation, in which every small school fears it must go online or go bust.
Many of these colleges turned to creating their own individual online courses -- already being referred to as “traditional online courses” -- as a solution, only to find that the expenses have outweighed the successes: they are costly in terms of faculty training, serve very small audiences (often sitting only a building or two away), and put severe strain on IT departments.
Online consortia in which struggling schools have banded together have also proved to be problematic; I am thinking in particular of one class that I was asked to review for my former college, which was a member of such a consortium: an accelerated multi-genre writing class, which asked students to write one poem, one short story, and one essay over a period of five weeks. The “final project” consisted of one additional work, in the students’ choice of genre. It was thus possible to fulfill 50 percent of the course requirements with two haiku.
MOOCs, of course, have their ur-versions, which include not only Henry Ford’s production line and the rise of fast food but also massive online delivery experiments in the mid-1990s, online remedial drills, large introductory-course in-seat lectures, Sunrise Semester, the Great Lecture Series, and the 19th-century lecture. And possibly there was someone who asked Harvard for credit for attending Thoreau’s lecture on “Society” -- or for attending a lecture by P. T. Barnum.
Friedman does note, near the end of his exhortatory column, that “We still need more research on what works.”
Indeed. Along with the return of the sage on the stage, this newest educational/industrialized development has brought along with it -- no surprise to anyone who has taught a traditional online class, a class with online components, or a traditional in-seat class -- some old concerns: problems with technology; problems with underprepared and unmotivated students; problems with class participation in discussions (one sage walked off the stage); and concerns about retention and plagiarism.
Assessment will continue to be one of the biggest concerns: both assessment of the overall course and assessment of any student work that goes beyond the level of a drill. Financial issues will come into play, as will workforce issues. Hierarchical divides among students, faculty members, and institutions will not disappear.
Finally, there is a dynamic in a traditional classroom that MOOCs simply can’t provide. In small, in-seat courses and workshops, students discover that they are part of a community, in which each person has a responsibility to contribute and the reward of personal interaction. Such courses allow for flexibility, Socratic questioning, and serendipity. Face-to-face meetings and small-group dynamics are important parts of education and socialization. And they provide an essential break for students from their hours of online gaming, posting and browsing.
One other analogy that comes up in discussions of MOOCs is “correspondence course.” It’s considered a dirty term, and yet, it may be an accurate description as thousands of students and piecework adjuncts labor at their solitary tasks.
And there may be something to be learned from a fictional account of a correspondence school: J. D. Salinger’s “De Daumier-Smith’s Blue Period.” The alienated protagonist concludes that “We are all nuns” -- working silently, separately, seeking salvation.
Carolyn Foster Segal is a professor emeritus of English at Cedar Crest College. She currently teaches at Muhlenberg College.
The top of the annual performance review form at my university has a blank space for us to list any additional education we obtained during the previous year. I’ve never filled that space in before, but that will change in my review for 2012 because I spent part of my sabbatical last fall as a student in a massive open online course (or MOOC).
I'm an American historian by training, but ever since I left graduate school a global perspective has become increasingly important for historians of all kinds. That’s why I decided to get some free professional development in world history, courtesy of Coursera. I learned a lot of interesting and useful specific factual information from the MOOC instructor (or superprofessor, as the lingo goes) that has already helped me become a better teacher and scholar.
But I didn’t just listen to the lectures. Like any other student (since that’s what I was), I also wrote out all the assignments and helped grade papers written by my peers in class. This peer grading process differs from peer evaluation (which I use in class all the time) since students not only read each other’s work, they assign grades that the course professor never sees. Professors in the trenches tend to hold their monopoly on evaluating their students’ work dearly, since it helps them control the classroom better by reinforcing their power and expertise. On the other hand, superprofessors (and the MOOC providers they teach for) have begun to experiment with having students grade other students out of necessity, since no single instructor could ever hope to grade assignments from tens of thousands of students by himself or herself.
With MOOCs in their infancy, few precedents exist for designing online peer grading arrangements for humanities courses. For this reason, I don’t intend to criticize my superprofessor’s choices here. However, I do have to describe some of the peer grading process from my class in order for my critique of peer grading in general to make sense. All students in the MOOC were supposed to write six essays between the start of the course and its end. For each assignment, we could choose one of three single-sentence questions to answer in 750 to 1,000 words. The week after we submitted those essays, we were supposed to grade the essays of five of our peers with respect to their argument, evidence and exposition, and leave comments. If you didn’t grade the essays your peers wrote, you didn’t get to see the grade you earned.
With respect to the grades I earned, I think my peers graded my essays just right. The grading scale in our MOOC went from zero to three. When I already knew a fair bit about the topic of the question that I answered or I tried very hard to write the best essay I could, I earned mostly threes from my peers. When I didn’t try very hard, I tended to get twos. While I listened to all my superprofessor’s lectures fairly closely, I never read the recommended textbook, which also undoubtedly hurt my scores.
For me at least, the primary problem with peer grading lay in the comments. While I received five comments on my first essay, for every subsequent essay I received number grades with no comments from a minimum of two peers and as many as four. In one case, I got no peer grades whatsoever. That meant that the only student who evaluated my essay was me. Every time I did get a comment, no peer ever wrote more than three sentences. And why should they? Comments were anonymous, so the hardest part of the evaluator’s obligation carried neither incentive nor accountability.
I read in The New York Times a few weeks ago that a study had begun to examine whether peer grades would match the grades assigned by professors and teaching assistants in one sociology MOOC. While that would prove an impressive feat if true, it would in no way validate the process of peer grading. Learning, as any humanities professor knows, comes not from the grades themselves but from students reading comments about why they got the grades they got. That’s how students find out how to do better next time.
To be fair, the course included a good set of instructions about how to grade a history essay linked from the course homepage. Unfortunately, there was no way for the superprofessor to force students to read those instructions, and due to the inevitable pressure to cover as much world history as possible, he never discussed how to grade in any of the class lectures. How could he? Good grading technique is difficult enough for graduate students to learn. Because of the size of the course I think I can safely assume that many of my fellow MOOC students inevitably had no history background at all, yet the peer grading structure forced them to evaluate whether other students were actually doing history right.
The implicit assumption of any peer grading arrangement is that students with minimal direction can do what humanities professors get paid to do and I think that’s the fatal flaw of these arrangements. This assumption not only undermines the authority of professors everywhere; it suggests that the only important part of college instruction is the content that professors transmit to their students. How many of the books you read in college can you even name, let alone describe? It’s the skills you learn in college that matter, not the specific details in any particular class, particularly those outside the major.
Over the course of my career, I have increasingly begun to spend much more time in class teaching skills than I do content. Some of this has been a reaction to encountering students who do not seem as prepared for reading or writing college-level material as the students I had back when I started teaching. However, I have also come to believe that teaching these skills is much more important than teaching any particular historical fact. After all, it really is possible to Google nearly anything these days.
Certainly good students can do a good job grading peer essays and I got a few short but insightful comments on the papers I wrote for my MOOC. Even if all of my comments had been less than helpful, I didn’t come into the MOOC process seeking to improve my writing skills. I wanted to learn new information, and many other students who engaged the material the same way that I did probably felt the same way.
Students like me won’t be the ones who’ll suffer because of peer grading. Its victims will be the future students who take MOOCs to earn college credit at increasingly cash-strapped universities. Who will teach them how to write well? Who will monitor their progress through the peer grading assignments? Who will help them understand that history is as much about argument as it is about facts or that literature can be appreciated on multiple levels? While students can certainly teach one another some things, they can never teach everything that a living, breathing professor can.
Education startups like Coursera are experimenting with peer grading not because it is the best way for students to learn history or English, but because it is the only way that the MOOC machine can ever run itself in a humanities course. If MOOCs incurred high labor costs the same way that colleges do, those startups would never be able to extract a profit from those classes. While that’s a legitimate concern for Coursera’s venture capital investors, everyone else in academia – even the superprofessors – should give more weight to purely educational concerns.
“The fruit ripens slowly,” the Guru Nisargadatta Maharaj once observed, “but it drops suddenly.”
In a similar fashion, MOOCs (or massive open online courses) seem to have arrived almost out of nowhere, in quick succession – first Udacity in February of last year, followed by Coursera in April, then edX in May. Remarkable as it may seem, MOOCs as we know them today have been with us only for as long as it has taken the Earth to make one orbit around the sun.
“I like to call the last year ‘the decade of online learning,’ ” joked Anant Agarwal, president of edX, during my recent visit to the offices of his bustling startup in the Kendall Square area of Cambridge, Mass.
As accelerated as the progression of MOOCs has been from curious acronym to household name, and as much as it may seem that MOOCs themselves have fallen from the sky, in truth MOOCs have been ripening for some time.
Consider the free “courses” delivered through iTunes U for the last several years, or TED Talks, and Khan Academy, not to mention some of the early progenitors of MOOCs themselves, including Dave Cormier, credited with coining the phrase in 2008, as well as George Siemens, Stephen Downes, Alec Couros, David Wiley, and others.
Recall Carnegie Mellon’s Open Learning Initiative, the “open educational resources” movement, and MIT’s OpenCourseWare, launched all the way back in 2002. And let’s not forget Fathom.com, an initiative out of Columbia University launched at the turn of the millennium, or even the early days of America Online and CompuServe, both of which offered educational content through their services as early as the 1990s.
MOOCs, then, are not as new as they seem – though the world today appears to be more ready for them than it was in decades past. Indeed, it isn’t hard to see how forces as diverse as Clayton Christensen’s theory of “disruptive innovation” from the late 1990s, the expansion of online enrollments over the last decade, the reformist intentions of the Spellings Commission on the Future of Higher Education from 2005-2006, the great recession of 2007-2009, or the completion agenda supported by the Lumina and Gates Foundations over the last few years have all contributed to a public thirst for what look like very high-quality educational offerings at very low – or even zero – cost.
“I also call the last year,” Agarwal added, “ ‘the decade of innovation.’ ”
And like many innovations before them, MOOCs have been received with the usual contradictory apocalyptic fervor – where some believers foresee the arrival of an educational golden age and others see the eventual destruction of our institutions, our faculty, and the intangible value of face-to-face learning.
Writing in The American Interest this month, for example, Nathan Harden claimed that “ten years from now Harvard will enroll ten million students.” He went on to argue that as a result of the MOOC movement, “the changes ahead will ultimately bring about the most beneficial, most efficient and most equitable access to education that the world has ever seen.”
At the other end of the apocalyptic continuum, Gregory Ferenstein, writing for TechCrunch last month, foresaw a future in which MOOCs would wreak terrible devastation on the land, as “part-time faculty get laid off, more community colleges are shuttered, extracurricular college services are closed, and humanities and arts departments are dissolved for lack of enrollment.”
The real significance of MOOCs lies, however, not in their being a harbinger of our educational salvation or demolition. Nor does their real significance lie principally in their potential to increase access or reduce costs – at least not for Agarwal and edX.
“We are about two things,” Agarwal told me. “We are about dramatically increasing quality and impacting campus learning. We are being very deliberate. This is not a numbers game – this is not a game at all. This is a quality quest.”
Funded with $60 million in seed capital from MIT and Harvard, edX can make a claim to being the first MOOC platform to market, inasmuch as its predecessor, MITx, was launched in December 2011. Until this week, the edX consortium featured five independent member institutions (MIT, Harvard, the University of California at Berkeley, Georgetown University, and Wellesley College) and one state university system comprising 15 colleges and universities (the University of Texas System). Thursday, it added six more, including several outside the United States.
In less than a year, edX’s 25 courses have enrolled close to 700,000 people. “That’s more than the combined alumni of MIT and Harvard over their combined 500-year history,” Agarwal observed with a mixture of pride, enthusiasm and amazement. What really pleases him, though, is something else.
Rolling his chair across the office, Agarwal waves me over to his monitor and shows me the virtual laboratories edX has been developing for its courses. We start with his own course on Circuits and Electronics (6.002x in the edX course catalog).
“Many MOOCs are just about analyzing problems,” he said. “We give you a blank sheet of paper and say, ‘Go build, design, create, construct something.’ ” With drag-and-drop alacrity, Agarwal moves the components of a circuit into place on a piece of digital graph paper and clicks a button to test its performance. “Computers do the grading,” he said, “in real time.”
“The media focus on numbers, they focus on cost,” Agarwal sighed. “But they should focus on something else – quality. And they should focus on efficiency. What is efficiency? It’s a ratio of quality and cost.”
Agarwal knows that MOOCs have their doubters, and he believes that they can only be persuaded with proof. He cites the case of San Jose State University, which licensed his own course on circuits and ran it as an adjunct to the school’s own classroom-based instruction. The results, Agarwal claims, were impressive. “The fail rate dropped from 40 percent to 9 percent,” he told me. “That’s a quality improvement.” And the costs to San Jose State were minimal. That’s efficiency. Agarwal says San Jose will be sharing more details about their experience with edX in the near future.
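Agarwal’s fail-rate figures can be restated in the same relative terms; a minimal sketch (plain Python; the 40 and 9 percent figures are Agarwal’s, while the function name and the derived pass rates are mine):

```python
def relative_reduction(old_rate, new_rate):
    """Fraction of the original rate that was eliminated."""
    return (old_rate - new_rate) / old_rate

# San Jose State: fail rate dropped from 40 percent to 9 percent.
print(relative_reduction(40, 9))  # 0.775

# Equivalently, the pass rate implied by those figures rose from 60 to 91 percent.
print(100 - 40, "->", 100 - 9)  # 60 -> 91
```

Put that way, the drop Agarwal cites amounts to eliminating more than three-quarters of the failures -- a “quality improvement” in his terms.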
With the avidity of the prototypical startup entrepreneur, Agarwal talked excitedly about the potential for MOOCs to improve pedagogy. “We have our xConsortium,” he said. “All of the schools in our consortium have access to all the data in the platform in an anonymized format. This is what I call ‘the particle accelerator of learning’ – big data in learning in real-time.” In a sense, then, edX’s quality quest, as Agarwal calls it, is seeking out the educational equivalent of the Higgs boson, as well as the other fundamental elements of learning, in order to better understand what kind of learning objects, what kind of real-time remediation, and what kind of learning materials – whether analysis or laboratory or other – produce the best results from one learning context to the next.
I ask Agarwal what distinguishes edX from its fellow MOOC platforms. “We have a fundamentally different mission,” he replied. “We’re nonprofit. We’re open source. Our technology is for everyone. And we have a commitment to campus learning.”
Earlier this month, the American Council on Education completed an evaluation of five courses on the Coursera platform, developed by Duke University, the University of California at Irvine, and the University of Pennsylvania. Intriguingly, all five courses were approved for credit through the ACE credit transfer program. But just in case the future of MOOCs was beginning to make sense to you, consider this – all three of these institutions have made it clear that they, at least, will not be awarding credits for the courses, irrespective of the fact that they developed the courses themselves.
MOOCs are puzzling.
Will they last? It’s not, I suspect, a question that would bother Agarwal very much one way or the other. “For us,” he said, “it’s not about MOOCs. We are trying to reimagine our own campus. The lecture wasn’t working. Quality has been static for decades, but costs are going up. There’s a trillion dollars in student debt. We are trying to reimagine campus education from the ground up – with new ways of learning that are more enriching, more engaging, more efficient, and that produce better outcomes.”
How do you like them apples?
Peter Stokes is executive director of postsecondary innovation in the College of Professional Studies at Northeastern University, and author of the Peripheral Vision column.
With great interest, I read the recent news announcing that the American Council on Education (ACE) had evaluated five Coursera MOOCs and recommended them for credit. But I had hoped for something different.
Having prestigious traditional institutions make their online content open to the world – of course without their prestigious credit attached – was an exciting development. A race to post courses ensued. On the surface, it’s an altruistic move to make learning available to anyone, anywhere for free.
Dig deeper and we are left to ask, how many MOOC courses will really be worth college credit, where will the credits be accepted, and for how long will college credits even be the primary measurement of learning?
Now that ACE has evaluated a few courses, MOOC providers will see how the process goes as students start actually finding proctors and taking tests -- or finding other methods of assessment -- to prove they learned the material. But a few courses will not be enough to really help students earn degrees, and with MOOCs and providers continuing to proliferate, this does not seem like a viable way to keep up with demand.
Regardless, it is more than likely that the universities that agreed to the ACE CREDIT review are never going to accept an ACE CREDIT transcript themselves. The students with ACE CREDIT transcripts will need to present those transcripts to “lesser known” schools that are not among the elite players – colleges with much lower tuition and a willingness to serve post-traditional students.
More troubling is the fact that the ACE process for credit review is still course-based. Will this really be flexible enough in the future? Will it measure competencies and individual learning outcomes? Even if it seems scalable, will it mean all MOOC evaluations have to run through ACE and only ACE? Will students have to wait until ACE has evaluated a MOOC course before they can get credit?
Moreover, this raises the question: Are course evaluations and testing really the best or only way to deal with this new era of learning? What about experiential learning? If someone has college-level learning from their life experience is it invalid unless they take a course?
As Inside Higher Ed points out in its article, this was a fast move in an industry that moves at a glacial pace. But when ice really begins to melt, it can quickly turn into a waterfall. Students have more options for learning, and can get more information, from a variety of sources. So the question for education becomes, how can we best accommodate that?
I would assert that a portfolio assessment of students’ learning is the best way. Just as an artist shows a portfolio to a prospective employer, students should be able to demonstrate learning from wherever they have learned -- work, MOOCs, informal training, military service, volunteer service, and more -- all in one place. And much of this learning will not involve a course at all.
If MOOCs are to be truly disruptive, they must link to competencies, credentials, degrees and/or ultimately jobs. Using a course-by-course, credit hour-by-credit hour approach to do this will not dramatically change the way people earn degrees. And dramatic change that allows for individual demonstrations of competencies is the only way to provide the education quality and agility necessary to truly recognize learning derived from free resources on the web. By focusing on competencies, we can align and accept learning experiences from everywhere.
Pamela Tate is president/CEO of the Council for Adult and Experiential Learning.
Submitted by Ted Fiske on February 12, 2013 - 3:00am
Gimme an M! M!
Gimme an O! O!
Gimme another O! O!
Gimme a C! C!
What have we got? MOOC!
Far above Cayuga’s waters with its waves of blue,
Stand our noble M-O-O-Cs, glorious to view.
Massive Open Online Courses, loud their praises tell.
Hail O dig’tal Alma Mater, now called e-Cornell.
On, Wisconsin! On, Wisconsin!
Fight on for our MOOCs.
They make teachers into rock stars.
Who needs Yale or Duke? (rah rah rah)
We take classes in our jammies
Any time of day.
Oh, how we love to learn
The online way.
Cheer, cheer for old Notre Dame.
And for the MOOCs that bring us our fame.
Send a volley cheer on high
’Cause our instruction comes from the sky.
Though the attachments be great or small,
Our CPUs can handle them all.
Open access sets us free
To seek out an e-degree.
Don’t need classrooms, that’s for sure.
Libraries are so passé --
Remnants of another day.
We’re creating new tradition.
Ours is wireless erudition.
We eschew all printed words.
Rest in peace, Gutenberg.
Edward (Ted) Fiske, former education editor of The New York Times, is author of The Fiske Guide to Colleges. Post your ideas for other college songs for the MOOC era in the comments.
At 30 years old, I definitely consider myself part of the Facebook generation. Zuckerberg’s brainchild hit the ’net when I was a senior in college, and by then I was already well acquainted with e-mail, chat rooms, text-messaging, and all the multifarious precursors to today’s social media. I text, I post, I chat, I even snapchat: in these respects, I’m an utterly unremarkable member of my society.
But I also happen to be a college professor and a molder of young minds. And, far from indulging the technology-driven spirit of the times, I make my students work as students have always worked. They read Seneca, Pascal, Tolstoy, and Schopenhauer. They are obliged to turn in papers by hand; they must come to office hours to speak with me about their grades; they are even, and this is most anachronistic of all, required to attend class. Physical presence is key to every aspect of their learning experience, be it my hovering, breathing presence in the classroom or the office, the cohort of 30 or so warm bodies that shows up for lecture twice a week, or the more abstract form of embodiment conveyed by the weight of a book.
To believe certain commentators, however, this embodied notion of learning is on its way out in American higher education. Writing for The American Interest’s January/February 2013 edition, the recent Yale graduate Nathan Harden offers the following ominous prognostications about the future of university instruction in our digital age:
In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.
On Harden’s account, one of the principal reasons for this portended transformation, which is already being partially implemented by such institutions as Harvard and MIT, is that the cost of college is increasingly out of proportion with its perceived economic benefit. As the American job market has become more competitive, the cost of a degree has increased, and only the most naïve of students still believe that a college education is a universally redeemable ticket to middle-class prosperity. The weighing up of costs and benefits involved in earning a college degree will lead inevitably to a re-evaluation of the current higher education model. Luxury residence halls, face-to-face interaction between professors and students, ivied brick walls -- these will all be things of the past once the much-heralded education bubble finally bursts. What will replace them are massively populated, inexpensive online courses and lectures, prerecorded by the very best lecturers and administered by those hordes of professors and other academics not quite sexy or charismatic enough to warrant virtual celebrity.
Anyone who thinks Harden’s predictions are a little too ambitious (not to mention deeply disturbing, at least for college professors who don’t fancy the idea of working in a grading factory) needn’t worry -- they most likely are. What Harden forgets -- and indeed, what just about everyone prophesying the eclipse of face-to-face interaction in a virtual world forgets -- is that human beings are, above all else, bodies, and that to lead full, happy, and meaningful lives, we need other bodies. Consider the following examples of how technologies of virtualization have failed to triumph over our species’ thirst for physical presence.
1. The Giant Head. Some older readers may recall a famous article in Reader’s Digest from the late 1950s featuring an illustration of a massive human head connected to minuscule arms and legs. What was the thesis of that article? The tech junkies of the time believed that technology would eventually become so advanced that human beings would no longer need to use their bodies, leading to a swelling of the brain and a shriveling of our appendages. Many also foretold a time when food supplements would replace food. Wouldn’t it be great, they asked, if instead of spending hours preparing and eating meals, we could nourish ourselves in just a few seconds? No one at the time seemed to consider that human beings might not want any of this -- that we might enjoy using our bodies, eating, and the like. In the half-century since these predictions were made, restaurants have proliferated, and heads haven’t grown one bit.
2. Live Theater. When I was a kid, there were hardly any live theaters in my hometown of Bakersfield, Calif. Now there are about ten. Many people once believed that movies had sounded the death knell for live theater, but today the latter enjoys as much prestige as, if not more than, it did 100 years ago. I recently had the good fortune to see Kevin Spacey’s production of Richard III. I’ll remember his performance for the rest of my life -- it had never occurred to me that acting could be so visceral, so violent, so physical. How many of us can say the same thing about a movie? Again, those who foretold the demise of live theater never reckoned that people might just plain like seeing living bodies move around and speak on a stage, and that no amount of special effects could compensate for the lack of real flesh and blood.
3. The Myth of Social Media. This myth holds that virtual, online, or otherwise technologically mediated interactions are replacing face-to-face ones. Most people never take the time to think about what the world would look like if this were really the case. I live in a small college town, and I can assure anyone interested in such things that student interactions on Friday and Saturday nights are plenty physical -- sometimes I can hear them from across the lake! Social media does little more than provide a way of sharing information that enhances the intimacy of eventual physical contact. Anyone who doesn’t know this doesn’t understand the technology.
Of course, people like Harden will point to other sectors of the economy where technological innovation has erased thousands of jobs. People no longer need travel agents or stockbrokers to make decent decisions about travel or investment, so why should a living, breathing professor be necessary to convey the sort of information one gets out of a college education? If that information can be distributed more cheaply thanks to virtualization, why should students be expected to bear the extra expense of classroom education?
The answer to this question is so elementary that the objection behind it is almost hard to take seriously. The truth is that education is not simply the conveying of information; in fact, it is probably only marginally that. How many people remember most of what they learned in college? Very few, I would guess. The benefit of a classroom education is that it keeps students under a certain amount of mental pressure, forces them to think on the spot, and obliges them to explain themselves to other people who are physically present. Information is afoot in these interactions, but so are wisdom, passion, empathy, and a whole host of other viscera that only an embodied teacher or student can properly convey.
How effective, for instance, do we imagine an online church experience would be compared to the real thing? Is it reasonable to think that a virtual tour of the cathedral at Chartres would be as spiritually moving as being there? We should also consider that many students might simply enjoy the physical classroom and their interaction with peers and professors -- or at least they might recognize that they learn better under these conditions. The costs of classroom education may be soaring out of proportion at present, but this is not a verdict on the education itself.
So let’s ask -- what developments lie behind these grim auguries of the collapse of America’s higher education model? Some of it undoubtedly has to do with politics. Many commentators on the right (and perhaps Harden is one of them) would likely cheer the dismantling of a system whose values are often perceived as far left of center. If taking education online can put “tenured radicals” out of work, then why not welcome it? At the same time, however, just as many moderate and left-leaning thinkers have joined the chorus of those predicting the failure of higher education (see, for instance, Thomas Friedman’s recent writings in The New York Times), and it would be simplistic to chalk this latest round of doom-peddling up to politics.
The real culprit, I suggest, is what, for lack of a better term, we might call Appleism. Innocent in principle but nefarious in practice, the doctrine of Appleism holds that increases in technological capability are synonymous with increases in human happiness. Anything that can be put on a screen is better than what can be seen with the naked eye. The passage of electrons through a cathode-ray tube is equivalent to passage from a lower to a higher state of being. Proponents of Appleism hold out technology as an intrinsic good; they are the sorts of folks who compulsively buy the latest Apple product, simply on principle.
We can point to fiscal insolvency all we want, but it is hard to believe that Harden’s and others’ vision of a fully or almost-fully online education is not also the product of society’s limitless fascination with virtualization. Proponents of the current craze ought to think carefully about the human costs of technology before enthusiastically proclaiming the end of a system -- a change that could leave hundreds of thousands of people without work, cheat students out of a quality education, and further entrench a world where virtualization is always and everywhere, without qualification or questioning, heralded as an unequivocal good.
Louis Betty is an assistant professor of French at the University of Wisconsin-Whitewater.