"Frenzy" may be the best way to describe what’s currently happening in higher education.
On one hand, there’s MOOC (massive open online course) mania. Many commentators, faculty creators, administrators, and public officials think this is the silver bullet that will revolutionize higher education.
On the other hand, there is the call for fundamental rethinking of the higher education business model. This is grounded most often in the argument that the (net) cost structure of higher education is unaffordable to an increasing number of Americans. Commentators point out that every other major sector of the economy has gone through this rethinking/restructuring, so it is only to be expected that it is now higher education’s turn.
Furthermore, it is often claimed that colleges and universities need to disaggregate what they do and outsource (usually) or insource (if the expertise is really there) a re-envisioned approach to getting all the necessary work done.
In this essay I focus on the optimal blending of online content and the software platforms underneath.
Imagine how transformative it would be if we could combine self-paced, self-directed postsecondary learning (which has been around in one form or another for millennia) with online delivery of content that has embedded in it both the sophisticated assessment of learning and the ability to diagnose learning problems, sometimes even before the learner is aware of them, and provide just-in-time interventions that keep the learner on track.
Add to that the opportunity for the learner to connect to and participate in groups of other learners, and to link directly to the faculty member for individualized attention and mentoring. What you would have is the 21st-century version of do-it-yourself college, grounded in but going well beyond the experienced reality of the thousands of previous DIYers such as Abraham Lincoln, Frederick Douglass, and Thomas Edison.
A good goal to set for the future? No: the great news is that we already have all the components necessary to make this a reality in the near term. First, it is now possible to build “smart” content delivered through systems that are grounded in neuroscience and cognitive psychological research on the brain mechanisms and behaviors underlying how people actually learn. The Open Learning Initiative at Carnegie Mellon University, which creates courses and content that provide research opportunities for the Pittsburgh Science of Learning Center (PSLC), is an example of how research can underlie content creation.
Such content and systems depend critically on faculty expertise, in deciding exactly what content is included, in what sequence, and how it is presented. Faculty are also critical in the student learning process, but perhaps not solely in ways we have traditionally thought. That is, it may not be that faculty are critical for the actual delivery of content, a fact we have known for millennia given that students obtain content through myriad sources (e.g., books) quite successfully.
Still, effective and efficient student learning has always depended critically on how well faculty master both these content steps and the other parts of the learning process, as the ease with which learning seems to happen under expert faculty demonstrates.
Second, these “smart” systems exist in a context of sophisticated analytics that do two things: (a) monitor what the learner is doing, detecting when the learner is about to go off-track and inserting a remedial action or tutorial just in time, and (b) assess what the learner knows at any point. These features can be used to set mastery learning requirements at each step such that the learner cannot proceed without demonstrating learning at a specific level.
Ensuring mastery of content has long been a major concern for faculty, who used to have to spend hours embedding pop quizzes or other learning assessments into their courses, setting up review sessions, holding office hours during which students may (or may not) show up, and imploring students to contact them if they encountered difficulties. The dilemma for faculty has usually been figuring out who needs the assistance, when, and how.
The sophisticated analytics underneath content delivery systems help take the guesswork out of it, thereby enabling faculty to engage with more students more effectively and, most important, to design the engagement to address each student’s specific issue. Better student-faculty interactions will likely do more to improve student learning than almost any other intervention.
Third, the platforms on which these “smart” systems are built and delivered include ways to create virtual teams of learners (both synchronously and asynchronously) and to include faculty interaction from one-on-one to one-on-many. This tool will make the long tradition of having students form study groups easier for faculty to accomplish, and enable students whose physical location or schedules may have made it difficult previously to participate in such groups to gain their full benefit.
Fourth, the creation of these “smart” systems has resulted in much clearer articulations of the specific competencies that underlie various levels of mastery in a particular field. Faculty play a central role in this work, as evidenced by the various competency articulations and degree-profile projects in the U.S. and internationally, and by several professional associations’ development of specific competencies for licensure.
Fifth, the specification of competencies makes it easier to develop the rubrics by which learning acquired prior to formal enrollment in a college/university, or in other ways not otherwise well-documented, can be assessed, and the learner placed on the overall continuum of subject mastery in a target field or discipline. Although faculty have always played a central role in such assessments, standardization of assessment has proven difficult. With the inclusion of faculty expertise, however, assessments such as Advanced Placement exams and learning portfolios can now be accomplished with extremely high reliability.
All of this could have enormous consequences for higher education. To be sure, we need more research and development of a broader array of content and delivery approaches than we currently have. In the meantime, though, three steps can be taken to meet students’ needs and to increase the efficiency with which colleges and universities provide the educated citizens we need:
Define as many postsecondary credentials as possible in terms of specific competencies developed by faculty and practicing professionals. This will provide the bases for developing as many “smart” systems as possible for improved content and learning assessment, and for assessing prior learning.
Meet students at the edge of their learning. Each student who arrives at a college/university is at a different spot along the learning continuum. Previously, we made at best very rough cuts at determining where students should start in a course sequence, for example. But more sophisticated prior learning assessment means we can be much more precise in matching what the student knows to where s/he should connect to a learning sequence. Not only would this approach minimize needless repetition of content already mastered, but it could also provide faster pathways to credentials.
Design personalized pathways to credentials. Better and clearer articulation of what students need to know for a specific credential, plus better assessments of prior and ongoing learning, plus more sophisticated content, plus the opportunity for faculty to engage individually and collectively with students in more focused ways means we can create individual learning plans for students to complete the credentials they need. In essence, a learning gap analysis can be done for each student, indicating at any point in time what s/he still needs to know to achieve a credential. Faculty mentorship can become more intrusive and effective, and a student’s understanding of what and why specific knowledge matters would be deeper.
Institutions that have greater flexibility to address these steps will be the most likely to succeed. I am heartened by the many professors and administrators who are creating the innovative approaches to make the changes real, and to embed them in the culture of their respective institutions. They provide students with superior advising and clearer pathways to achieving the academic credentials students seek. In the longer run, those institutions are likely to see cost structures decline due to more efficient progress through academic programs.
The technology-driven changes described here may well enhance student learning, and help us reach the goal of greater access to higher education for adults of all ages.
But they raise a crucial and largely unaddressed question, one that gets lost in debates over whether such technology can reduce costs or whether it will result in fewer faculty jobs.
We have not yet adequately confronted the definition of “faculty” in this emerging, technology-driven environment. Although a thorough discussion of that issue necessarily awaits a different article, suffice it to say that just as technology and costs have changed the job descriptions of people in most other professions, including health care, they have also created new opportunities within them. For instance, even though the rise of nurse practitioners has changed key aspects of health care delivery, the demand for more physicians, whose job descriptions may have changed, remains.
In any case, the best part is that these new approaches do not replace the most important aspect of education — the student-teacher interaction. Rather, they provide more effective and efficient ways to achieve it.
John C. Cavanaugh is president & CEO of the Consortium of Universities of the Washington Metropolitan Area.
Among the mountains of literature dedicated to "best practices" in pedagogy, a consensus has emerged that engagement is key, and that we teachers can no longer – as we did throughout history – willfully drag students by the ear into our own umwelt and call it learning. Rather, we need to create an active halfway space between world-bubbles, thus allowing learning to happen more organically, through a mutual reorientation.
This is precisely what I tried to do in a recent course exploring the topic of reality TV. Here I was either brave or foolish enough to structure the class like an actual reality TV competition. And while I admit the initial thrill of conception involved the perverse prospect of voting students "off the island," I could not have anticipated the pedagogical benefits of such a novel format until I tried them out. The first half of the course was quite traditional, with scholarly readings about the history of the genre and related themes such as narcissism, exhibitionism, attention economies, surveillance, and the new employment option of simply being watched. (An excellent book on this topic by Mark Andrejevic served as the main textbook.) It is truly remarkable how much more conscientious students suddenly become when they are informed that an A on the dreaded midterm paper will earn them "immunity" from the first challenge.
The competition section was loosely based on "Project Runway," which emerged from my own institution, the New School, in New York City (specifically the design school, Parsons). Students would be given a challenge a week – some individual, some in groups – and then face a revolving group of expert "judges" to see how well their response connected to the critical aspects of the readings. (I tried to juggle the dual roles of Tim Gunn and Heidi Klum in this scenario, dispensing equal parts encouragement and fear with each alternate comment.) Examples of challenges include, "pitch your own (progressive) reality TV show," "create your own (self-reflexive) reality TV persona," and "report back from your own Thanksgiving holiday as if it were a reality TV show.”
After each challenge the “contestants” would reflect on the competition via "confession cams" recorded on their own laptops or phones, and posted to the blog (a meta-meta exercise in self-reflection, given that reality TV is already a meta-phenomenon). Instead of running around a fabric store, trying to buy enough satin or leather to make an edgy, fashionable dress in less than an hour, my students were running around the library, trying to find appropriate readings to supplement the syllabus. (Those who were voted off switched to the "production" side of the competition: some helping with filming, sound, editing, etc. Others worked on publicity around the college and online, as well as making their own commentaries on the unfolding events. It was therefore possible to be voted off early, but still get an A.)
One of the most striking differences between the students’ umwelt and my own became clear from the very beginning, when I initially took great pains to reassure the class that while we would be filming sections of the competition for archival purposes – and to heighten the sense of being on TV – these would not be made public in any way. To my surprise, all the students were disappointed, going so far as to say, "Well what’s the point in filming it then?!" This emphatic question – and the new Facebook-saturated Zeitgeist that it distils – then became a touchstone for the whole semester, concerning naive assumptions about identity, action, performance, and modes of witnessing. Why is it that the millennial generation does not think anything is worth doing or experiencing unless it is immediately "shared" and "liked" online? How might this backfire when it comes to friends or future employers? And who benefits most from this automatic compulsion?
So what began as a "so-crazy-it-might-work" idea soon revealed itself to be a new way for students to critically reconstruct their own relationship to the media – and thus to themselves – while also shaking up all my cherished notions about traditional modes of teaching the humanities. Whereas the host of "Project Runway" encourages the contestants to "make it work," I exhorted the students to "think it through" (indeed, I was tempted to call the course "So You Think You Can Think?"). And in one of those perfect moments of synchronicity, I could even offer the perfect prize to the winner: a paid internship to work on a film about reality TV by one of my former students, Valerie Veatch (whose first film, "Me at the Zoo," on viral celebrity and its discontents, recently premiered at Sundance).
What’s more, I am almost grateful that the National Security Agency global spying scandal did not erupt during the first run of this course, even though it would have spectacularly underscored the social and political tendencies the class was designed to question. Even if we loathe reality TV, and claim never to watch it, that doesn’t mean we haven’t all been engulfed in its logic, mannerisms, motifs, conventions, and conceits. One reason I designed the course was to test my theory that even young people who feel themselves to be far above televisual trash are still exposed to, and shaped by, the emotional currents it creates in the world. Reality TV threatens to eclipse reality itself, even in those rare moments when the cameras aren’t running.
Quite simply, identity is now influenced by things like the confession cam, the idea of immunity, and the asymmetrical power dynamics of "the judges," even as our most significant political figures threaten to become little more than grotesque characters in the latest installment of "The Real Housewives of Congress" or "The Vatican’s Next Top Pontiff." So while the challenge of education is to almost literally burst each other’s bubbles, the bigger challenge is to figure out – across the generations – how to stop our collective umwelt from being shaped by this omnipresent model of thought and behavior.
Dominic Pettman is professor of culture and media at Eugene Lang College and New School for Social Research, where he recently won the University Distinguished Teaching Award. His most recent book is Look at the Bunny: Totem, Taboo, Technology.