"Frenzy" may be the best way to describe what’s currently happening in higher education.
On one hand, there’s MOOC (massive open online course) mania. Many commentators, faculty creators, administrators, and public officials think this is the silver bullet that will revolutionize higher education.
On the other hand, there is the call for fundamental rethinking of the higher education business model. This is grounded most often in the argument that the (net) cost structure of higher education is unaffordable to an increasing number of Americans. Commentators point out that every other major sector of the economy has gone through this rethinking/restructuring, so it is only to be expected that it is now higher education’s turn.
Furthermore, it is often claimed that colleges and universities need to disaggregate what they do and outsource (usually) or insource (if the expertise is really there) a re-envisioned approach to getting all the necessary work done.
In this essay I focus on the optimal blending of online content and the software platforms underneath.
Imagine how transformative it would be if we could combine self-paced, self-directed postsecondary learning (which has been around in one form or another for millennia) with online delivery of content that has embedded in it both the sophisticated assessment of learning and the ability to diagnose learning problems, sometimes even before the learner is aware of them, and provide just-in-time interventions that keep the learner on track.
Add to that the opportunity for the learner to connect to and participate in groups of other learners, and to link directly to the faculty member for individualized attention and mentoring. What you would have is the 21st-century version of do-it-yourself college, grounded in but reaching well beyond the experienced reality of thousands of previous DIYers such as Abraham Lincoln, Frederick Douglass, and Thomas Edison.
A good goal to set for the future? No; a goal for right now. The great news is that we already have all the components necessary to make this a reality in the near term. First, it is now possible to build “smart” content delivered through systems grounded in neuroscience and cognitive psychological research on the brain mechanisms and behaviors underlying how people actually learn. The Open Learning Initiative at Carnegie Mellon University, which creates courses and content that provide research opportunities for the Pittsburgh Science of Learning Center (PSLC), is an example of how research can underlie content creation.
Such content and systems depend critically on faculty expertise in deciding exactly what content is included, in what sequence, and how it is presented. Faculty are also critical in the student learning process, but perhaps not solely in the ways we have traditionally thought. That is, faculty may not be essential for the actual delivery of content; for millennia, students have obtained content quite successfully through myriad sources (e.g., books).
Still, effective and efficient student learning has always depended critically on how well faculty master both these content steps and the other parts of the learning process, as evidenced by the ease with which learning seems to happen under faculty who are expert at them.
Second, these “smart” systems exist in a context of sophisticated analytics that do two things: (a) monitor what the learner is doing such that it can detect when the learner is about to go off-track and insert a remedial action or tutorial just in time, and (b) assess what the learner knows at any point. These features can be used to set mastery learning requirements at each step such that the learner cannot proceed without demonstrating learning at a specific level.
Ensuring mastery of content has long been a major concern for faculty, who have had to spend hours embedding pop quizzes or other learning assessments into their courses, setting up review sessions, holding office hours that students may (or may not) attend, and imploring students to contact them if they encounter difficulties. The dilemma for faculty has usually been figuring out who needs assistance, when, and how.
The sophisticated analytics underneath content delivery systems help take the guesswork out of it, enabling faculty to engage more students more effectively and, most important, to design the engagement to address each student’s specific issue. Better student-faculty interactions will likely do more to improve student learning than almost any other intervention.
Third, the platforms on which these “smart” systems are built and delivered include ways to create virtual teams of learners (both synchronous and asynchronous) and to include faculty interaction ranging from one-on-one to one-on-many. These tools make it easier for faculty to continue the long tradition of student study groups, and they enable students whose physical locations or schedules previously made participation difficult to gain the full benefit of such groups.
Fourth, the creation of these “smart” systems has resulted in much clearer articulations of the specific competencies that underlie various levels of mastery in a particular field. Faculty play a central role here, as evidenced by the competency articulations and degree-profile work done in the U.S. and internationally, and by several professional associations’ development of specific competencies for licensure.
Fifth, the specification of competencies makes it easier to develop rubrics for assessing learning acquired before formal enrollment in a college/university, or in other ways not otherwise well-documented, and for placing the learner on the overall continuum of subject mastery in a target field or discipline. Faculty have always played a central role in such assessments, but standardizing them has proven difficult. With faculty expertise built in, however, assessments such as Advanced Placement exams and learning portfolios can now be conducted with very high reliability.
All of this could have enormous consequences for higher education. To be sure, we need more research and development of a broader array of content and delivery approaches than we currently have. In the meantime, though, three steps can be taken to meet students’ needs and to increase the efficiency with which colleges and universities provide the educated citizens we need:
Define as many postsecondary credentials as possible in terms of specific competencies developed by faculty and practicing professionals. This will provide the bases for developing as many “smart” systems as possible for improved content and learning assessment, and for assessing prior learning.
Meet students at the edge of their learning. Each student who arrives at a college/university is at a different spot along the learning continuum. Previously, we made at best very rough cuts at determining where students should start in a course sequence, for example. But more sophisticated prior learning assessment means we can be much more precise about matching what the student knows to where s/he should connect to a learning sequence. Not only would this approach minimize needless repetition of content already mastered, but it could also provide faster pathways to credentials.
Design personalized pathways to credentials. Better and clearer articulation of what students need to know for a specific credential, plus better assessments of prior and ongoing learning, plus more sophisticated content, plus the opportunity for faculty to engage individually and collectively with students in more focused ways means we can create individual learning plans for students to complete the credentials they need. In essence, a learning gap analysis can be done for each student, indicating at any point in time what s/he still needs to know to achieve a credential. Faculty mentorship can become more intrusive and effective, and a student’s understanding of what and why specific knowledge matters would be deeper.
Institutions that have greater flexibility to address these steps will be the most likely to succeed. I am heartened by the many professors and administrators who are creating the innovative approaches to make the changes real, and to embed them in the culture of their respective institutions. They provide students with superior advising and clearer pathways to achieving the academic credentials students seek. In the longer run, those institutions are likely to see cost structures decline due to more efficient progress through academic programs.
The technology-driven changes described here may well enhance student learning, and help us reach the goal of greater access to higher education for adults of all ages.
But they raise a crucial, and largely unaddressed, question that gets lost in debates about whether such technology can reduce costs or whether it will result in fewer faculty jobs.
We have not yet adequately confronted the definition of “faculty” in this emerging, technology-driven environment. Although a thorough discussion of that issue must await a different article, suffice it to say that just as technology and costs have changed the job descriptions of people in most other professions, including health care, they have also created new opportunities for those professions. For instance, even though the rise of nurse practitioners has changed key aspects of health care delivery, demand remains for physicians, whose own job descriptions may have changed.
In any case, the best part is that these new approaches do not replace the most important aspect of education — the student-teacher interaction. Rather, they provide more effective and efficient ways to achieve it.
John C. Cavanaugh is president & CEO of the Consortium of Universities of the Washington Metropolitan Area.
At a time when many question the relevance of history, it is noteworthy that the U.S. Supreme Court case that prohibited the federal government from undercutting a state’s decision to extend "the recognition, dignity and protection" of marriage to same-sex couples hinged on arguments advanced by professional historians.
Rarely have historians played as important a role in shaping the outcome of a public controversy as in the same-sex marriage cases. Legal, family, women's, and lesbian and gay historians provided key evidence on which U.S. v. Windsor ultimately turned: that the Defense of Marriage Act (DOMA) represented an unprecedented and improper federal intrusion into a domain historically belonging to the states. As Justice Kennedy affirmed, "the federal government, through our history, has deferred to state law policy decisions with respect to domestic relations."
But historical scholarship did more than substantiate a single pivotal argument. It framed the majority’s broader understanding of marriage as an evolving institution and helped convince five justices that opposition to same-sex marriage is best understood as part of a long history of efforts to deprive disfavored groups of equal rights and benefits. In the end, the majority opinion hinged on "the community’s ... evolving understanding" of marriage and of equality and the majority’s recognition that DOMA imposed "a disadvantage, a separate status, and so a stigma upon all who enter into same-sex marriages made lawful by the unquestioned authority of the states."
Briefs filed with the Supreme Court by the American Historical Association and the Organization of American Historians demonstrated that far from being a static institution, marriage has profoundly changed its definition, roles, and functions, and that today's dominant marital ideal, emphasizing emotional intimacy, has nothing to do with gender. Currently, marriage's foremost public function is to distribute benefits, such as those involving health insurance, Social Security, and inheritance, making it all the more valuable for same-sex couples.
Furthermore, these briefs demonstrated that, contrary to widely held assumption, marriage has not historically been defined by its procreative function. Marriage was justified on multiple grounds; especially important were the notions that it contributed to social stability and provided care for family members. No American state ever forbade marriage to those too old to bear children.
Without reducing the legal history of marriage to a Whiggish, Progressive, or linear narrative, the historians showed that two broad themes characterize the shifting law of marriage in the United States. The first is the decline of coverture, the notion that a married woman's identity is subsumed in her husband's. A second theme is the overturning of earlier restrictions about who can marry whom.
Slowly and unevenly, American society has abolished restrictions on marriage based on people's identity. As recently as the 1920s, 38 states barred marriages between whites and blacks, Chinese, Filipinos, Japanese, Indians, "Malays," and "Mongolians." It was not until 1967 in Loving v. Virginia, the Supreme Court decision that threw out a Virginia ban on black-white marriages, that racial and ethnic restrictions were outlawed.
At the same time, there has been an ongoing legal struggle to recognize women as full rights-bearers within marriage. Instead of seeing their identity subsumed in their husband's -- the notion that spouses cannot testify against one another was originally rooted in this principle -- women gradually attained the right to sue, control their own wages, and manage their separate property.
Perhaps the most powerful recent symbols of this shift are prosecutions for marital rape and elimination of the presumption that a husband is head of the household for legal purposes. Opposition to the liberalization of marriage, the historians demonstrated, has rested on historical misconceptions and upon animus, rooted in ethnocentrism and religious sectarianism.
Marriage today bears scant resemblance to marriage even half a century ago, when the male breadwinner family prevailed and dual-earner and single-parent households were far rarer than today. The contemporary notion of marriage as an equal, gender-neutral partnership differs markedly not only from the patriarchal and hierarchical ideals of the colonial era, but from the notion of complementary spousal roles that predominated during the age of companionate marriage that prevailed from the 1920s into the mid-1960s.
Change, not continuity, has been the hallmark of the history of marriage. Even before the 20th century, marriage underwent certain profound transformations. Landmarks in this history included:
Enactment of the first Married Women's Property laws in the 1830s and 1840s, which established women's right to control property and earnings separate and apart from their husbands.
Passage of the first adoption laws in the mid-19th century, allowing those unable to bear children to rear a child born to other parents as their own.
Increased access to divorce, beginning with judicial divorce supplanting legislative divorce.
The criminalization of spousal abuse starting in the 1870s.
Marriage's persistence reflects its adaptability. DOMA represented an unprecedented federal attempt to fix the definition of marriage and impose this definition upon the states and their inhabitants. Specifically, DOMA represented a federal effort to prohibit lesbian and gay Americans from securing the same civil rights and benefits available to other citizens. DOMA stigmatized a specific group of Americans and represented federal discrimination based on a particular religious point of view. In Justice Kennedy’s ringing words: "The federal statute is invalid, for no legitimate purpose overcomes the purpose and effect to disparage and to injure those whom the state, by its marriage laws, sought to protect in personhood and dignity."
History, in the same-sex marriage controversy, was not simply "preface" -- an interesting but ultimately insignificant detail in cases involving equal treatment under law. History lay bare a series of dangerously misleading assumptions -- above all, the notion that same-sex marriage deviates from a timeless, unchanging marital norm.
Steven Mintz, professor of history at the University of Texas at Austin and the author of Domestic Revolutions: A Social History of American Family Life and Huck’s Raft: A History of American Childhood, signed the American Historical Association brief.
Once it would have been possible to jump right into a discussion of Michael Gordin’s The Pseudoscience Wars: Immanuel Velikovsky and the Birth of the Modern Fringe (University of Chicago Press) with the reasonable assumption that readers would have at least a nodding acquaintance with the maverick psychoanalyst’s ideas.
But today -- as Gordin, a professor of history at Princeton University, notes -- few people under the age of 50 will recognize Velikovsky’s name, much less know of his theory of the traumatic impact of cosmic catastrophes on human history. It was a heated topic for discussion in the 1970s. I recall seeing a poster for a meeting of Chaos and Chronos, a student organization dedicated to Velikovskian matters that once had clubs on many U.S. college campuses. This was as late as 1980 or ’81. (Which only corroborates Gordin’s point, for I am approaching the half-century mark at an alarming speed.)
So, first, a lesson in now-dormant controversy.
Although he published several other books during his lifetime, plus a few more posthumously, Velikovsky presented his core argument in a volume called Worlds in Collision (1950). It was an attempt to formulate the key to all mythologies, or at least an explanation of some of the more striking stories and beliefs of antiquity. Drawing on sources both classical and obscure, he showed that cultures all over the world preserved narratives in which the world passed through incredible catastrophes: the earth shook, the heavens darkened, the sun stood still, floods wiped out society, fire or stones or both fell from the sky, and so on. The cultures that preserved the tales explained the events as a manifestation of God’s wrath at humanity, or as the consequence of gods’ behavior toward one another.
An orthodox Freudian, Velikovsky had no use for Jung’s nebulous ideas about archetypes in the collective unconscious. His theory was more concrete, if no less strange. The far-flung legends all made sense as distorted accounts of a series of astronomical anomalies beginning circa 1500 B.C.E., when (he argued) a huge mass of matter broke off the planet Jupiter and spun off into space. It passed dangerously close to Earth a couple of times before eventually settling into orbit as the planet we now know as Venus.
Its comet-like transit through the solar system generated a series of events, both in outer space and here below, that continued for the better part of a thousand years. Earth and proto-Venus came near enough to affect each other’s orbits, and that of Mars as well. Bewildered by the strange things happening in the sky, mankind endured terrestrial upheaval on an incredible scale -- tectonic disasters, weird weather, and shifts of the planet’s axis, for example.
Once, when proto-Venus came close to Earth, its atmosphere permeated our own long enough to precipitate a fluffy, snow-like substance made of hydrocarbons. And so it came to pass that the Israelites received the manna falling from heaven that the Lord did send to nourish them.
Well, it’s a theory, anyway. Harper’s magazine ran an article about Velikovsky’s book in advance of its publication. Other, less sober publications followed suit, presenting Worlds in Collision as demonstrating the literal (albeit distorted) truth of events recorded in the Bible. The response by scientists was less enthusiastic, to put it mildly. The word “crackpot” tended to come up. Velikovsky’s interdisciplinary erudition impressed them only as evidence that he was profoundly ignorant in a number of fields. The American people would be dumber for reading the book, and so on.
Upon seeing the early publicity for Velikovsky’s book, some scientists were so disgusted that they wrote to Velikovsky’s publisher, Macmillan, to complain. Worlds in Collision had been listed in the firm’s catalog as a scientific work. The letter-writers considered this disgraceful, and warned of the potential damage to the press’s reputation in the scientific community. After a few university science departments canceled their meetings with Macmillan’s textbook salesmen, the publisher became alarmed and sold its rights to the book to Doubleday.
Velikovsky was unhappy about this, but the deal was hard on Macmillan as well. At its peak, Worlds in Collision was selling a thousand copies a week, despite being a rather pricey hardback. The backstage furor soon died down, as did public interest in Velikovsky’s claims. By 1951, his ideas must have seemed as if they would have no more of a future than the other big fad of the previous years, L. Ron Hubbard’s Dianetics. (The scholarly literature seems to have overlooked this bit of synchronicity, though I’m sure there is a master’s thesis in it for somebody.)
The Worlds in Collision affair might have been forgotten entirely if not for a special issue of the journal American Behavioral Scientist devoted to the whole matter, published in 1963. The contributors were interested not so much in Velikovsky’s ideas as in how scientists had responded to them – with peremptory dismissals based on the Harper’s article, emotional rhetoric, and behind-the-scenes pressure on his publisher. It amounted to censorship and the repression of ideas – the assertion of scientific authority against a theory, despite the lack of serious engagement with the book itself.
Velikovsky and Albert Einstein both lived in Princeton, N.J., during the 1950s, and Velikovsky could quote remarks from the physicist’s letters and conversation suggesting that Worlds in Collision was at least interesting and worthy of a hearing. This was by no means the only thing Einstein had to say. Gordin quotes a number of occasions when Einstein described Velikovsky as “crazy” -- and clearly he regarded the man as a pest, at times. But it's not difficult to imagine why the most famous Jewish immigrant in postwar America might develop sympathetic feelings for someone of a comparable background who seemed to be facing unfair persecution. Besides, they could speak German together. That counted for a lot.
In any case, their friendship also made it easier to argue that Velikovsky just might be too far ahead of his time. In the mid-1960s, students at Princeton University formed a discussion group on Worlds in Collision, and Velikovsky himself spoke there – the first of what became many lectures to packed halls. Given the spirit of the time, having been rejected and anathematized by the scientific establishment was, in its own way, a credential. Among young people, Velikovsky enjoyed the special authority that comes when mention of one’s ideas is sufficient to annoy, very noticeably, one’s professors.
In 1972, the editors of Portland State University’s student magazine Pensée turned it into a forum for defending and developing Velikovsky’s ideas. Papers were peer-reviewed, sort of: they were submitted to scholars and scientists for vetting, though most of the reviewers were sympathetic to Velikovsky (and, it sounds like, also contributors). Pensée’s first all-catastrophism issue clearly met a need. It had to be reprinted twice and sold 75,000 copies, after which the journal’s circulation settled down to a still-remarkable 10,000 to 20,000 copies per year.
And if any more evidence of his status as countercultural eminence were needed, the American Association for the Advancement of Science held a Velikovsky symposium at its annual conference in February 1974. The most famous participant was the astronomer Carl Sagan, who challenged the author’s supposed scientific evidence for the cosmic-catastrophe scenario at considerable length. Velikovsky and his supporters were angry that all of the invited speakers were critical of his work. But the organizers invited Velikovsky himself to respond, which he did, also at considerable length. The symposium may not have vindicated Velikovsky, but it gave him an unusually prominent place at the table.
He died in 1979, and five years later Henry Bauer (now an emeritus professor of chemistry and science studies at Virginia Tech) published Beyond Velikovsky: The History of a Public Controversy (University of Illinois Press). It was the first book-length analysis of the whole saga and, for a long time, the last. Most of the secondary literature on Velikovsky appearing since his death resembles the material about him published during his lifetime, in that it is polemical, for or against. The one published biography of Velikovsky that I know of, drawing on his own memoirs, is by his daughter.
So Gordin’s The Pseudoscience Wars belongs to the fairly small number of studies that do not simply pour the old controversies into new bottles. In that regard, the title is something of a fake-out. The author doesn’t treat Velikovsky’s catastrophism as a variety of pseudoscience. He is dubious about the concept, both because it is applied to too many phenomena that don’t share anything (what do astrology, cold fusion, biorhythms, and the study of how ancient astronauts shaped human evolution really have in common?) and because no one has established an epistemological “bright line” to distinguish science-proper from its pretenders. The word’s real significance lies in its use in shoring up the authority of those who wield it. Calling something pseudoscience is more profoundly delegitimizing than calling it bad science.
Happily, the author spends only a little time on Sociology of Deviance 101-type labeling theory before getting down to the altogether more compelling labor of using archival material that was unavailable to Bauer 30 years ago -- especially the theorist’s personal papers, now in Princeton’s collection. Velikovsky was something of a packrat. If he ever parted with a document, it cannot have been willingly. The Pseudoscience Wars fills in the familiar outline of his career and controversy, as sketched above, with an abundance of new detail as well as insight into what Gordin calls “the development of Velikovskian auto-mythology.”
We learn, for one thing, that tales of a deliberate campaign of letter-writing and well-organized pressure on Macmillan through the threat of a boycott have little evidence to back them up. Accounts treating Velikovsky as an American Galileo typically suggest that his opponents wanted to prevent his ideas from receiving any hearing at all – that they were, in principle if not in method, book-burners.
But the existing documents suggest that the scientific community was chiefly troubled at seeing Worlds in Collision issued under the full authority of a major science and textbook publisher. A number of scientists responded by exerting pressure on Macmillan, but Gordin says the letters “were disorganized, uncoordinated, and threatened different things – some not to buy books, some not to referee manuscripts, others not to write them.”
Textbooks represented up to 70 percent of the publisher’s revenue, so professorial displeasure “had to be taken seriously. Macmillan could not afford to call it a bluff.” Once Worlds in Collision was sold to Doubleday (a trade publisher) scientists were content to mock the author’s grasp of geology, chemistry, celestial mechanics, etc. – or simply to ignore the book altogether.
Velikovsky converted the episode into a kind of moral capital, and Gordin demonstrates how shrewdly he and his admirers used it to build a scientific counter-establishment -- what one might otherwise call a full-scale pseudoscience. The analysis requires a number of detours, heading into territory where intellectual historians seldom venture – as in the sad tale of Donald Wesley Patten, author and publisher of The Biblical Flood and the Ice Epoch (1966), who ultimately proved too Velikovskian for the fundamentalists, and vice versa.
For a long time it seemed as if no one could go beyond Beyond Velikovsky. Gordin's book does not replace the earlier study, which remains an interesting and valuable book, and certainly worth the attention of anyone trying to decide whether to explore the terrain in more detail. But The Pseudoscience Wars puts the catastrophist’s ideas and aura into a wider and thicker context of ideas, people, and institutions -- a remarkable array, spanning from the Soviet genetics debates of the 1940s to today's fractious niche of (please accept my sincere apology for this next word) post-Velikovskyism.
Speaking of which, let me end with a prediction. While reading Gordin, it crossed my mind that the scenario of upheaval in Worlds in Collision might well speak to the sense of how precarious our little ball in space really is. Americans are not a thrifty people, but we do tend to recycle our cultural phenomena, and if there is one 20th-century idea that seems a likely candidate for 21st-century revival, it is probably catastrophism.