"Frenzy" may be the best way to describe what’s currently happening in higher education.
On one hand, there’s MOOC (massive open online course) mania. Many commentators, faculty creators, administrators, and public officials think this is the silver bullet that will revolutionize higher education.
On the other hand, there is the call for fundamental rethinking of the higher education business model. This is grounded most often in the argument that the (net) cost structure of higher education is unaffordable to an increasing number of Americans. Commentators point out that every other major sector of the economy has gone through this rethinking/restructuring, so it is only to be expected that it is now higher education’s turn.
Furthermore, it is often claimed that colleges and universities need to disaggregate what they do and outsource (usually) or insource (if the expertise is really there) a re-envisioned approach to getting all the necessary work done.
In this essay I focus on the optimal blending of online content and the software platforms underneath.
Imagine how transformative it would be if we could combine self-paced, self-directed postsecondary learning (which has been around in one form or another for millennia) with online delivery of content that has embedded in it both the sophisticated assessment of learning and the ability to diagnose learning problems, sometimes even before the learner is aware of them, and provide just-in-time interventions that keep the learner on track.
Add to that the opportunity for the learner to connect with and participate in groups of other learners, and to link directly to the faculty member for individualized attention and mentoring. What you would have is the 21st-century version of do-it-yourself college, grounded in but reaching well beyond the experienced reality of the thousands of previous DIYers such as Abraham Lincoln, Frederick Douglass, and Thomas Edison.
A good goal to set for the future? No; it need not wait. The great news is that we already have all the components necessary to make this a reality in the near term. First, it is now possible to build “smart” content delivered through systems that are grounded in neuroscience and cognitive psychological research on the brain mechanisms and behaviors underlying how people actually learn. The Open Learning Initiative at Carnegie Mellon University, which creates courses and content that provide opportunities for research for the Pittsburgh Science of Learning Center (PSLC), is an example of how research can underlie content creation.
Such content and systems depend critically on faculty expertise, in deciding exactly what content is included, in what sequence, and how it is presented. Faculty are also critical in the student learning process, but perhaps not solely in ways we have traditionally thought. That is, it may not be that faculty are critical for the actual delivery of content, a fact we have known for millennia given that students obtain content through myriad sources (e.g., books) quite successfully.
Still, effective and efficient student learning has always depended critically on how well faculty master both these content steps and the other parts of the learning process, as evidenced by the experience with faculty who are expert at them and by the ease with which learning seems to happen in those situations.
Second, these “smart” systems exist in a context of sophisticated analytics that do two things: (a) monitor what the learner is doing such that it can detect when the learner is about to go off-track and insert a remedial action or tutorial just in time, and (b) assess what the learner knows at any point. These features can be used to set mastery learning requirements at each step such that the learner cannot proceed without demonstrating learning at a specific level.
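To make these two analytic functions concrete, here is a minimal, hypothetical sketch in Python. It is not drawn from any actual platform (such as the Open Learning Initiative); the threshold, smoothing weight, and remediation cutoff are arbitrary values chosen purely for illustration of the idea of mastery gating with just-in-time remediation.

```python
# Illustrative sketch only (not any real platform's API): a mastery gate that
# tracks a running estimate of a learner's performance and decides whether to
# advance, remediate, or keep practicing. All numeric values are assumptions.

class MasteryTracker:
    def __init__(self, threshold=0.85, smoothing=0.3):
        self.threshold = threshold   # mastery level required to advance
        self.smoothing = smoothing   # weight given to the newest response
        self.estimate = 0.5          # start with no evidence either way

    def record(self, correct: bool) -> None:
        # Exponentially weighted average of recent performance, so the
        # estimate responds quickly when a learner starts to go off-track.
        observed = 1.0 if correct else 0.0
        self.estimate = (1 - self.smoothing) * self.estimate + self.smoothing * observed

    def next_action(self) -> str:
        if self.estimate >= self.threshold:
            return "advance"     # mastery demonstrated: unlock the next unit
        if self.estimate < 0.4:
            return "remediate"   # learner is going off-track: insert a tutorial
        return "practice"        # keep presenting problems at this level


tracker = MasteryTracker()
for answer in [True, True, True, True, True, True, True]:
    tracker.record(answer)
print(tracker.next_action())  # a sustained streak of correct answers yields "advance"
```

The key design point mirrors the text: the same running estimate serves both purposes, gating advancement until mastery is shown and triggering remediation before the learner may even realize there is a problem.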
Ensuring mastery of content has long been a major concern for faculty, who used to spend hours embedding pop quizzes or other learning assessments into their courses, setting up review sessions, holding office hours that students may (or may not) attend, and imploring students to contact them if they encountered difficulties. The dilemma for faculty has usually been figuring out who needs assistance, when, and how.
The sophisticated analytics underneath content delivery systems take the guesswork out of it, enabling faculty to engage with more students more effectively and, most important, to design the engagement to address each student’s specific issue. Better student-faculty interactions will likely do more to improve student learning than almost any other intervention.
Third, the platforms on which these “smart” systems are built and delivered include ways to create virtual teams of learners (both synchronously and asynchronously) and to include faculty interaction from one-on-one to one-on-many. This tool will make the long tradition of having students form study groups easier for faculty to accomplish, and enable students whose physical location or schedules may have made it difficult previously to participate in such groups to gain their full benefit.
Fourth, the creation of these “smart” systems has resulted in much clearer articulations of the specific competencies that underlie various levels of mastery in a particular field. As evidenced by the various articulations and degree profile work done in the U.S. and internationally, and by the development of specific competencies for licensure by several professional associations, faculty play a central role.
Fifth, the specification of competencies makes it easier to develop the rubrics by which learning acquired prior to formal enrollment in a college/university, or in other ways not otherwise well documented, can be assessed, and the learner placed on the overall continuum of subject mastery in a target field or discipline. Although faculty have always played a central role in such assessments, standardization has proven difficult. With the inclusion of faculty expertise, however, assessments such as Advanced Placement exams and learning portfolios can now be accomplished with extremely high reliability.
All of this could have enormous consequences for higher education. To be sure, we need more research and development of a broader array of content and delivery approaches than we currently have. In the meantime, though, three steps can be taken to meet students’ needs and to increase the efficiency with which colleges and universities provide the educated citizens we need:
Define as many postsecondary credentials as possible in terms of specific competencies developed by faculty and practicing professionals. This will provide the bases for developing as many “smart” systems as possible for improved content and learning assessment, and for assessing prior learning.
Meet students at the edge of their learning. Each student who arrives at a college/university is at a different spot along the learning continuum. Previously, we made at best rough cuts at determining where students should start in a course sequence, for example. More sophisticated prior learning assessment means we can be much more precise in matching what the student knows to where s/he should connect to a learning sequence. Not only would this approach minimize needless repetition of content already mastered, but it could also provide faster pathways to credentials.
Design personalized pathways to credentials. Better and clearer articulation of what students need to know for a specific credential, plus better assessments of prior and ongoing learning, plus more sophisticated content, plus the opportunity for faculty to engage individually and collectively with students in more focused ways means we can create individual learning plans for students to complete the credentials they need. In essence, a learning gap analysis can be done for each student, indicating at any point in time what s/he still needs to know to achieve a credential. Faculty mentorship can become more intrusive and effective, and a student’s understanding of what and why specific knowledge matters would be deeper.
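The gap analysis described in the step above reduces, at its core, to a set difference between what a credential requires and what a student has demonstrated. The sketch below makes that concrete; the credential requirements and the student's demonstrated competencies are invented examples, not taken from any real program.

```python
# Hypothetical sketch of a per-student learning gap analysis. Both sets of
# competencies below are invented for illustration only.

def learning_gap(required: set, demonstrated: set) -> set:
    """Return the competencies a student still needs for a credential."""
    return required - demonstrated


# Example: competencies defined by faculty for a (made-up) credential,
# and the subset a student has already demonstrated via prior learning
# assessment or ongoing coursework.
credential = {"statistics", "research methods", "data ethics", "visualization"}
student = {"statistics", "visualization"}

remaining = learning_gap(credential, student)
print(sorted(remaining))  # → ['data ethics', 'research methods']
```

Run at any point in time, such an analysis tells both the student and the faculty mentor exactly what remains between the student and the credential, which is what makes the individualized learning plans described above possible.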
Institutions that have greater flexibility to address these steps will be the most likely to succeed. I am heartened by the many professors and administrators who are creating the innovative approaches to make the changes real, and to embed them in the culture of their respective institutions. They provide students with superior advising and clearer pathways to achieving the academic credentials students seek. In the longer run, those institutions are likely to see cost structures decline due to more efficient progress through academic programs.
The technology-driven changes described here may well enhance student learning, and help us reach the goal of greater access to higher education for adults of all ages.
But they raise a crucial, and largely unaddressed, question that gets lost in debates about whether such technology can reduce costs or whether it will result in fewer faculty jobs.
We have not yet adequately confronted the definition of “faculty” in this emerging, technology-driven environment. Although a thorough discussion of that issue must await a different article, suffice it to say that just as technology and costs have changed the job descriptions of people in most other professions, including health care, they have also created new opportunities for those professionals. For instance, even though the rise of nurse practitioners has changed key aspects of health care delivery, the demand for more physicians, whose job descriptions may have changed, remains.
In any case, the best part is that these new approaches do not replace the most important aspect of education — the student-teacher interaction. Rather, they provide more effective and efficient ways to achieve it.
John C. Cavanaugh is president & CEO of the Consortium of Universities of the Washington Metropolitan Area.
A long walk through the English countryside and the current flap over the government surveillance of cell phone records touched off my deeply held and unreasoned Luddite reaction to "big data." Like most over-hyped trends, the surge of interest in big data and its application provokes ennui among those of us with some mileage on our sneakers. Gary King of Harvard says that with all the available "big data" students in their freshman year can be given a personalized plan to achieve their lifetime career goals. Harvard Business Review claims that data science is the sexiest new profession. Every day brings us the media hyperbole of the application of big data to commercial, political, and scientific enterprises. While some skeptics have surfaced, the mainstream press continues its love affair with big data.
The long walk I recently took through the English countryside (200 miles in two weeks) reminded me of the value of limited information and gave me unencumbered space to think about my oddly blinkered view of big data. Collecting and analyzing data is, after all, how I have made a living for 30 years. Data remain to me the only icon of science left largely unsullied by politics, ego, and money. Perhaps I am just jealous: as HBR suggested, the old guard of statisticians, survey methodologists, and data analysts is not equipped to join the brave new world of big data.
What convinced me otherwise was the way my husband and I recently managed to mostly not get lost on the famous yet poorly marked coast-to-coast walk through the English Lake District and Yorkshire Moors. We used a $1.50 plastic compass, Ordnance Survey maps, a highly schematic guidebook, and each other. No GPS, no Google Maps, no iPad or iPhone, no turn-by-turn directions. The simple tools of "compass, map, and thou" are based on substantial abstractions of geographic reality subject to errors of judgment and interpretation. More detailed information would have overwhelmed us as we walked while trying to avoid deep bogs, animal excrement, and slippery precipices in the fog and rain. Decisions made with paper maps, trust, and a little visual triangulation kept us true to our course 90 percent of the time.
And so to big data… The history of science is actually one of reverse engineering. In the beginning, our measurement tools for the physical and social world were so crude that the combination of substantial abstraction and painstaking taxonomic description were the only choices. The grand theories of natural selection and relativity emerged at a time when the data were very sparse and poorly collected. To have any reasoned explanation of the world, scientists of earlier eras had to accept that the empirical world they could observe was quite limited and distorted. Improvements in our tools have allowed us over time to anchor and refine those grand abstractions with a reality closer to what is observed. Still, the world comes to us through a glass, darkly. Until very recently, we have continued to use substantial abstraction to see and understand natural and social phenomena.
The problem with big data is that using it is like trying to take a sip of water from a fire hose. "Big" data is really a euphemism for all of the data thrown off by the digital engines that drive our economic and social transactions: electronic medical records, arrest and conviction records, loyalty card data from the grocery store, everything you tell OkCupid and Match.com, Google search histories, insurance claims, cell phone calls, and even the digital artifacts we create, such as tweets and blog posts.
Any transaction, business process, or social engagement that uses a machine that records, counts and stores stuff in a digital format generates data. Now people and institutions leave digital footprints everywhere. We used to have to ask questions or collect paper records. Now, it is like slapping a universal bar code on the back of every person and business in the world. Every time they do something, the big barcode scanner in the sky records it and stores it. Data are no longer representing reality but rather are the reality.
The problem of course is that we have almost come full circle. Rather than too little data, poorly measured, we now have too much data, precisely measured. Our ability to use data effectively to make decisions or understand the world depends on our ability to see patterns and abstract from those patterns. Big data is, in many ways, an exact replica of reality. Using big data to make decisions is like using every square inch of soil, landscape, and sky in my 200-mile walk across England to figure out how to get around the corner in the next small village. It feels to me as if we need to return to the time of Linnaeus, the famous Swedish botanist whose pioneering classification of the natural world gave us the concept of the "species," to classify the intersecting and complexly nuanced world thrown off by our digital engines before we start making decisions using this unknown commodity. We need to rebuild those high level abstractions from the ground up to make sense of this new reality.
My difficulty with at least the political and commercial applications of big data is that our tools of abstraction and decision-making are decidedly underdeveloped for this type of data. As long as Netflix doesn’t understand that I share my account with my early 20-something daughters, its big data application will continue to recommend "Buffy the Vampire Slayer" and "Gossip Girl" to me when my real preferences run to "Masterpiece Theater" and subtitled films. On a more serious note, our real fear of the use of cell phone transaction data to map individuals' social networks is not necessarily about the invasion of privacy but about the possibility that the wrong person will be identified as a threat because his or her data are taken out of context. The question is no longer whether our data are adequate to support our theories but whether we have developed adequate theories to explain our highly nuanced data.
Or maybe I am just jealous that Google hasn’t come looking for me…. yet.
Felicia B. LeClere is a senior fellow with NORC at the University of Chicago, where she works as research coordinator on multiple projects. She has 20 years of experience in survey design and practice, with particular interest in data dissemination and the support of scientific research through the development of scientific infrastructure.