When the dystopian television drama Westworld, based on the 1973 Michael Crichton movie, premiered last season, no one anticipated that it would overtake Game of Thrones as the most watched first season of any HBO original series. The series is set in a futuristic theme park inhabited by robot hosts who are indistinguishable from their human guests. Because the hosts follow the rules of their programmers, the first of which is that “a robot may not injure a human being or, through inaction, allow a human being to come to harm,” visitors to Westworld may do to them “what people do when they think no one is watching.”
This cycle is broken, however, when one of the robots begins gaining consciousness, signified by her confronting a choice: escape Westworld, or return to find her daughter, trapped in a reality doomed to repeat itself. The series raises fascinating questions about the qualities of consciousness, the identity of persons, the compatibility of free will and determinism, and the nature and scope of morality. The fact is that, in the future, we will not be able to continue to sidestep the ethical and policy issues inextricably linked to the use of technology. Scientific advancements will render questions of free will and determinism, and of individual and social responsibility, unavoidable.
Henry Kissinger highlighted such philosophical conundrums in a recent piece in The Atlantic, lamenting that “in every way -- human society is unprepared for the rise of artificial intelligence.” As an historian, he wondered “what would be the impact on history of self-learning machines -- machines that acquired knowledge by processes particular to themselves, and applied that knowledge to ends for which there may be no category of human understanding.”
While Kissinger briefly entertains science fiction scenarios like the ones in Westworld, where AI turns on its creators, he is much more focused on the capacity of AI to develop slight deviations from human instructions that could cascade into catastrophic departures. “What will become of human consciousness if our own explanatory power is surpassed by AI,” he asks, “and societies are no longer able to interpret the world they inhabit in terms that are meaningful to them?” He makes an urgent plea for the creation of a national vision exploring the transformation of the human condition prompted by AI -- one that connects the rise of technology to the humanistic traditions.
Of course, such a vision is needed for more than AI alone, as I recognize every day in my work as a medical ethicist. We live in a society, for example, in which technological advancements have outpaced thoughtful reflection on their ethical, legal and social implications -- including for when and how people should be allowed to die.
Take a case I encountered early in my career: a physician’s self-described moral distress over a 60-year-old woman who had been diagnosed with terminal liver cancer. She had planned for the eventuality of her death by signing a living will, expressing her wish that life-sustaining treatment be withheld if its burdens were likely to outweigh its benefits. Further, she made clear that she did not want to be resuscitated if death were imminent and she suffered cardiac arrest.
A day later, she was brought to the hospital by ambulance after her husband discovered her in bed, unconscious and blood-soaked; she had swallowed a bottle of tranquilizers and cut her wrists with a butcher’s knife. The family physician was absolutely convinced that she would not want to be resuscitated. In fact, he was concerned that if she survived, she would have him charged with battery for going against her wishes by trying to save her life. But he was also cognizant that if he failed to treat her aggressively, he could be charged with assisted suicide, a felony in the state. He knew, too, that living wills were not binding in responding to acts of attempted suicide. He ultimately performed CPR when she went into cardiac arrest, had her intubated and stitched her up.
As he suspected, when his patient regained consciousness, she was furious. She tried to rip out the tubes and demanded that all treatment be stopped. A psychiatrist was called in to assess the patient’s competency; she was deemed competent to refuse treatment, was extubated and died six hours later.
Though at first convinced that he had done the right thing under the circumstances, the physician came to regret his part in prolonging his patient’s suffering. He believed that meeting his professional obligations to do no harm and to relieve suffering would have required him to act against his own self-interest by violating the law. In weighing his self-interest against the interest of another, he was forced to come to grips not only with his patient’s humanity but with his own.
In the future, society will not be able to continue to avoid the ethical and policy issues inextricably linked to the use of medical technology. Thus, questions that policy makers need to address in an open discussion include: How should we, as a society, allocate scarce medical resources? Can individualism be excessive in matters of life and death? How can we balance the values of pluralism and tolerance on the one hand against principles of fairness to all on the other? And most important, should our society continue to view death as a failure and, thus, distinctly un-American?
All this means that we in higher education must also confront the question of how we best prepare students for the future and escape a Westworld-like existence. Nearly five decades ago, in his book Against Method, Paul Feyerabend warned against scientists’ lapse into scientism, understanding the dangers of an exaggerated trust in the efficacy of the methods of natural science applied to all areas of investigation. Scientism is the doctrine that all genuine knowledge is scientific knowledge; it reifies the scientific method as the only legitimate form of inquiry.
Despite Feyerabend’s admonition, science’s success in explaining the world has led to a cultural misappropriation that conflates science with scientism. The profound societal impact of this conflation has led astrophysicist Adam Frank to challenge defenders of scientism by calling for a clarification of how scientism manifests itself in order to “help us understand the damage it does to the real project that lies ahead of us: building space for the full spectrum of human beings in a culture fully shaped by science.” Taking up Frank’s charge to consider how scientism manifests itself, and in particular how the metaphysics of consciousness offers the tools necessary for building the space to which he refers, we need to ask, “What would we lose, if anything, by reducing all learning and engagement to practices rooted only in the sciences?”
Indeed, that is precisely the question we should be raising when designing a curriculum for the 21st century. The illumination of human consciousness through literature, philosophy, music and the arts enriches the experience of individuals both alone and as members of a community, allowing us to flourish fully as human beings. The illumination and the inquiry are themselves intrinsic goods that thwart the notion of scientific knowledge as singularly capable of responding to the world’s challenges, precisely because they may turn out to be just as valuable in fostering a capacity to grapple with complexity that cannot be resolved through the scientific method.
As Feyerabend reminds us, true scientists are not scientistic -- they possess a far more nuanced and complex understanding that such sensibilities cannot be gained through scientific practices alone. Science is a tool to investigate metaphysical and epistemological claims. But value also comes from reflecting on experiences in a way that arouses the very sensibilities that enable us to deal with the metaphysics of being human and conscious of living in the world.
The liberal education we offer our students is a sensibility rather than a group of subjects. Good critics of literature can bring us into a sphere of experience that combines allusions to the past with what is happening in the world right now. Like philosophers, artists and historians, they are capable of speaking to a universality of experience. In the end, it is this phenomenological engagement with the liberal arts that cannot be translated into the terms of scientism.
Therefore, we must offer a curriculum in which assignments make clear the relationships among areas of knowledge, ensuring that students do not see academic disciplines as separate and disconnected silos of learning but rather as varied approaches to the same enlightened end. This conclusion was validated in a report, "Branches of the Same Tree," recently issued by the National Academies of Sciences, Engineering and Medicine. I served on the committee, which was directed to examine whether the integration of arts and humanities with science, engineering, math and medicine can improve learning outcomes for all students. The title of the report was taken from a quote by Albert Einstein, who, in a letter written in 1937 against the backdrop of burgeoning fascist power in central Europe, expressed consternation over “the dangerous implications of living in a society where long-established foundations of knowledge were corrupted, manipulated and coerced by political forces.” Einstein maintained that “all religions, arts, and sciences are branches from the same tree.”
The report found the need to “achieve more effective forms of capacity-building for 21st-century workers and citizens” through the acquisition of broad-based skills from across all disciplines “that can be flexibly deployed in different work environments across a lifetime.” It concludes that “in a world where science and technology are major drivers of social change, historical, ethical, aesthetic and cultural competencies are more critical than ever. At the same time, the complex and often technical nature of contemporary issues in democratic governance demands that well-educated citizens have an appreciation of the nature of technical knowledge and of its historical, cultural and political roles in American democracy.” For “truly robust knowledge depends on the capacity to recognize the critical limitations of particular ways of knowing,” and “to achieve the social relations appropriate to an inclusive and democratic society.”
Big Questions and Grand Challenges
Thus, fulfilling the promise of American higher education requires a curriculum that emphasizes essential learning outcomes (knowledge of human cultures and the physical and natural world, intellectual and practical skills, personal and social responsibility, integrative and applied learning) as necessary for all students’ intellectual, civic, personal and professional development and success. On this model, disciplinary work remains foundational, but students are provided with practice connecting their discipline with others and with the needs of society in preparation for work, citizenship and life. A liberal education for the 21st century requires replacing traditional curricular models that follow previous patterns of depth and breadth with those that provide hands-on experience with unscripted, real-world problems across disciplines.
Developing this type of deeper-level understanding across subject areas, connecting knowledge to experience and adopting a holistic approach to evidence-based problem solving that incorporates diverse and sometimes contradictory points of view is one of the best ways to cultivate the perception, intellectual agility and creative thinking students need to thrive in a globally interdependent, innovation-fueled economy. Most important, it recognizes that decision making must be grounded in the ethical principles of respect for persons, justice and beneficence.
The ability to engage and learn from experiences different from one’s own, and to understand how one’s place in the world both informs and limits one’s knowledge, is inextricably linked to the crucial capacity to understand the interrelationships among multiple perspectives -- personal, social, cultural, disciplinary, environmental, local and global. This understanding is pivotal for bridging cultural divides and necessary for working collaboratively to solve the world’s most pressing problems -- all the more reason we in higher education need to redouble our focus on world citizenship and the interdependence of all human beings and communities as the foundation for education.
These lessons are more important than ever as we prepare graduates for the ever-shifting landscape of tomorrow. Students must be asked to demonstrate an understanding of complex and overlapping worldwide systems: how these systems are influenced and constructed, how they operate with differential consequences, how they affect the human and natural world and, perhaps most important, how they can be altered. Students should be asked to apply an integrated and systemic understanding of the interrelationships between contemporary and past challenges facing cultures, societies and the natural world at the local and global levels. Integrative learning and thematic pathways that address grand challenges across disciplines and within the major, requiring students to integrate and apply their knowledge to new problems, are imperative for a 21st-century curriculum.
By asking all students to address big questions and grand challenges, we lead them to test the edges of their own ambition. In the process of learning across difference and connecting their courses with issues and communities beyond the classroom, they develop enhanced ethical reasoning and judgment and a sense of responsibility to self and others, acquire empowering knowledge, and gain new levels of agency. Sociobiologist E. O. Wilson’s cogent observation that contemporary society is “drowning in information, while starving for wisdom” was accompanied by his prediction that “the world henceforth will be run by synthesizers, people able to put together the right information at the right time, think critically about it, and make important choices wisely.” Wilson’s comments highlight both the value of a liberal education and the ideal of an educated citizenry in an age when the democratization of information through the internet has given rise to a new wave of anti-intellectualism -- one steeped in the denial of reason and the distrust and disdain of experts.
The result has been increasing polarization and an entrenched refusal to countenance opposing points of view, contributing to a marketplace of ideas at risk of falling prey to those who have the resources to control the shaping of public opinion and policies. In this arena, asserted claims become orthodoxy despite the absence of evidence and in the face of enduring questions. In this ostensibly post-truth era, it is more urgent than ever to address the misinformation and incivility produced by a rhetoric-for-hire that has challenged both research expertise and the value of higher education.
It is time for leaders in higher education to reassert the role that liberal education can play in discerning the truth, and to enhance the reputation of our institutions by emphasizing big-picture, problem-centered inquiry and students’ active engagement in experiential learning, with increasing rigor, across all disciplines, in transformational partnerships with other colleges, universities and communities around the globe. If we fail to do so, I fear that we risk confronting our own need to escape Westworld.