The issue is widespread and seemingly intractable. I first noticed it in graduate school when tutoring engineering students on study skills. I observed it over decades of leadership of technology development teams, through interacting with various industries and by following the societal dialogue—or lack thereof—on nearly every significant issue of our time. I’ve seen it in my personal evolution.
Many adults struggle to make flexible, productive and skilled judgments about anything uncertain and complicated. As a tutor I could teach a process for studying, but the student's judgments about what would be on the test or when their work needed improvement seemed immune to instruction. I didn't know where to begin given the time I had with them. Likewise, I observed many workers become wiser through slow on-the-job training, but there were some whose perspectives seemed frozen in place, and I had little idea how to accelerate their progress. I look at my younger self and see a strange combination of naïveté and overconfidence, but I realize that's the place where most adults start.
The issue is weakness in durable, cross-cutting cognitive skills. We need more adults to be capable of continuous learning and wise decision-making, of approaching the right problems in creative ways and of combining their contributions with others'. Those qualities are far more important than knowledge in any field, which in the hottest professions becomes obsolete within a few years.
The desire for durable, 21st-century skills apparently hasn't resulted in performance gains, though the evidence is muddled by small studies and inconsistent metrics. For example, critical thinking skills do appear to grow through college, but it's unclear whether college is entirely responsible for the growth, and the gains may have deteriorated in recent years.
Perceptions of competence in such skills vary widely. Businesses increasingly list problem-solving, teamwork, analytical and communication skills atop their wish lists, and they decry the shortage of those skills in recent graduates. Yet most students, along with the chief academic officers at the colleges that educate them, believe graduates are well prepared for the workforce, a perception directly at odds with employers' observations.
The timer has expired on lip service regarding these intangible but critical skills. The future of work is more uncertain, typical job tasks are more interdisciplinary and abstract, and the work challenges are more complex. And artificial intelligence is turbocharging these evolutions.
AI and the Learning Mismatch
Getting work done is fast becoming artificial intelligence’s role. Increasingly, we’ll be making judgments about what work to pursue, how to get AI to do much of it and how to use its outputs. Work performance will be judged based on the combined contributions of humans and our AI teammates.
Wisdom, creativity and caring are the human differentiators, but only if we are individually better than the AI. I am reminded of a scene from the movie I, Robot where Will Smith’s character asks an android, “Can a robot write a symphony? Can a robot turn a canvas into a beautiful masterpiece?” The robot replies, “Can you?”
Most comparisons are between AI’s abilities and the best of human contributions, but that’s not the choice an employer has. They pick between AI and the available workers. While strong writers, coders, artists and decision-makers may perform better than current-generation AI, most people aren’t great at any of those things—and AI is getting better rapidly.
Wisdom skills are now must-haves at career launch, but they are oh so hard to teach because they rely heavily on tacit and intuitive knowledge gained through varied experiences, including failures, that yield transferable intuitions and insights. Experience takes a long time to accumulate. There are college paths that are heavily experiential and emphasize judgment and interpersonal skills (e.g., business and medical programs), but most learning time in most majors is still spent gobbling up knowledge rather than building multiuse cognitive skills. The challenge-, inquiry- and collaboration-based paradigms that foster 21st-century wisdom skills are still unusual. It's a lot easier to teach and measure knowledge transfer than skill in judgment, and knowledge in specialized domains is still a prerequisite for most jobs, for now. Such knowledge may never be unimportant, but in many fields it will take a back seat to increasingly accurate and comprehensive AI knowledge.
Are Colleges Missing the Point?
Higher education isn’t getting this message. The AI focus in higher education—if there is one at all—is on churning out more AI developers and infusing AI into instructional practice within the existing paradigm. Few are talking about how the current paradigms don’t fit the impending future.
Higher education is treating AI as a technical evolution that requires tweaks in instruction, but the bigger impact is to the very nature of work and society. It’s more akin to the dawn of electricity or computing than it is to the latest cool app, except this time the transformation will be over years instead of decades. The AI transition requires re-examination of curriculum, credentials, pedagogy and the fundamental assumptions of education. Perhaps judgment science could be a better common core course than algebra, or a degree in interpersonal interaction or critical thinking might be more important than a knowledge-domain credential.
Of course, rapid change always creates resistance, but there’s more to it than that. Even if colleges want to change, the obstacles are substantial. Online instruction doesn’t allow the same depth of instructor and peer interactions that are at the heart of experiential learning. The strong reliance on poorly paid adjunct professors makes it unlikely that classes will move away from lecture-based knowledge delivery; experiential forms of learning require more preparation. Tenured professors can thumb their noses at change. There is a threefold problem: colleges don’t seem to want to change, change is hard and many professors can refuse to change.
Ironically, it's AI and other software technologies that could grease the wheels of change. Technology can't change attitudes directly, but making things easier has a way of eroding resistance, especially if the easy way is shown to be the better way.
Can AI-Powered Games Be an Answer?
Consider AI-assisted educational gaming. No, I’m not talking about addictive entertainment games, nor the typical educational game that adds engagement lipstick to a knowledge-transfer objective. The kinds of games I have worked on put professionals in plausible decision-making situations directly relevant to their jobs and force them to make choices in a dynamic, evolving environment, often based on incomplete information. In my experience, game players were not only exposed to critical choices that might be unusual in real life, but they seemed to leave game experiences with a reflective mindset about their professions. Such games have aspects of several high-impact educational practices in that they address multidisciplinary problems, often highly collaboratively, and lend themselves to forms of competency-based assessment.
Experiential learning is slow and relies on the opportunity to acquire varied experiences. Game-based learning can be a proxy for real life, accelerating the accumulation of experience. People don’t become skilled at making big-picture judgments unless they get practice making them, and there’s no longer a reason why extensive practice must wait for the first job.
AI can now be used to augment games with humanlike interactions and feedback. If a business major needs to learn negotiation, they could practice with avatars with varied personalities, tactics and objectives. If a student is studying a critical thinking skill, they could be given an AI-generated challenge and get feedback from an AI tutor. It might not be as good as what a flesh-and-blood professor can offer, but instructors are available for only a small fraction of possible learning time.
Just as important, games can now be developed dramatically faster, thanks to code-free game design platforms and simple means of incorporating sophisticated chatbots. Combined with the creativity of instructors, the range of experiential learning possibilities is boundless.
Hidden in the knowledge-oriented college bias is the vibe that maybe nothing can be done about the wisdom skills short of extended on-the-job experience. Or that nothing can be done at all—that those skills are somehow innate. I’ve seen too many business leaders express that view openly. Yet there’s no scientific reason to believe that judgment, creativity and interpersonal abilities are fixed and plenty of reason to think students need these skills at graduation to land and retain good-paying jobs. Now there are new tools for experiential learning that could be the key to unlocking big-picture cognition.
But first colleges must be willing to consider greater change.