
It is doubtful that any social science book in recent years has received more sustained attention from academia and the intellectual public than Daniel Kahneman’s prize-winning Thinking, Fast and Slow (2011). Michael Lewis’s 2016 book on Kahneman (The Undoing Project) has further stimulated popular interest. Kahneman’s Thinking traverses the social sciences while concentrating on psychology and economics; throughout, it is very much a book on decision-making, both professional and personal, arenas in which we are frequently and similarly irrational.

I’ve taken to using Thinking in a doctoral core course (Social Analysis of Education) in which most students are professional education practitioners. Kahneman’s book itself offers only a handful of examples from education studies, but thoughtful education students, practitioners, and scholars will eagerly conjure up many more. And though it allows for cultural and other contextual variation, the analysis rarely if ever applies just to the US; it deals mostly with universal human behavioral and decision-making tendencies.

Drawing on his entire career of doing research and studying the research of others, Kahneman makes the central argument that most of our thinking is fast and largely driven by irrational psychology. This contrasts with the rationalism of economics and with the objective thinking that is our proud professional model. In reality, we repeatedly and unwittingly make decisions contrary to our (or our institution’s) objective self-interest. We are insufficiently motivated to pause and engage in slower, more rational thinking (though slower thinking too is susceptible to irrational inclinations). Instead we often seek immediate gratification, cognitive ease, and simple explanations. We succumb repeatedly to “WYSIATI”: what you see is all there is.

The author is wonderfully witty and playful, and the book is full of fun examples, often lightly mocking readers and humankind generally while fairly cackling about the irrational decision-making of experts. Before a tennis match, more people respond “yes” when asked whether Federer will win if he loses the first set than when simply asked whether he will win. Highly professional judges rule more favorably on defendants just after lunch than just before. Teachers’ grades and comments on papers vary with when they read them, not just with the papers themselves.

But aren’t we, the most highly educated professionals and scholars, at least largely immune from such irrationality, usually deciding through objective numbers and other evidence? Scores of documented examples about business executives, military leaders, wine-tasters, and even a few about teachers and deans suggest otherwise. Moreover, experiment subjects are often college students, our smiles broadening when they’re Ivy League. But, entertainment aside, the degree of irrationality in our decision-making is of course a potentially disheartening message, and indeed Kahneman is often depicted as a pessimist and skeptic.

At the same time, however, Kahneman seeks, and finds, helpful practical advice that can (within bounds) improve our decision-making, both personal and professional. Many of the practical applications simply involve awareness and consequent vigilance in fending off common vulnerabilities in our fast thinking. We can sensitize ourselves to recognize when a shift to slow thinking is indicated, and then probe whether, for example, we are over-valuing what we own even though superior value is available in exchange (“loss aversion”), inflating the weight of recent and colorful examples, substituting easier questions for the tough ones really being asked, succumbing to the power of a few neat stories that yield pleasant findings, overweighting vivid detail while ignoring base rates (e.g., a brilliant college grad is in fact more likely to become a businessperson than a physicist, even if his crazy hair draws double takes), or mistaking striking correlation for causation. We can thus better protect ourselves from irrationality and manipulation.

On the other hand, as teachers and policymakers we are not just susceptible manipulees but also manipulators. Unpleasant terms, to be sure, and we don’t like to equate ourselves with ad reps, but doesn’t persuasion lie at the core of what educators, and all leaders, do? We ourselves can frame choices in more attractive ways, appealing to vivid and recent examples while downplaying base rates, invoking “illustrative” and psychologically satisfying stories, and so forth.

Other uses of Kahneman are more normatively appealing. In addition to self-awareness about our natural weak inclinations, try collaborative work with a respected colleague (researcher or policymaker) who is normally skeptical of your findings or decisions. Kahneman recounts a gratifying experience of working with a colleague who generally believed more in skilled judgment where Kahneman might bet on algorithms. Together they could identify their contrasting definitions and assumptions, as well as the circumstances under which subject matter tends to benefit from expert judgment and those where algorithms prove better predictors.

The identified natural weaknesses are dauntingly many, but reflection on almost any one of them can logically alert us to others as well. Consider one such package of weaknesses, all applicable to how we choose college presidents and offer them huge salary and benefit packages (a tendency likely gaining force internationally as reform gives increased decision-making power to institutions and their leaders). We need not dismiss the economic-rationality explanation: abilities and experience increase the odds of successful performance. Sociological explanations might note how status counts and how universities mimic the behavior of business organizations.

Kahneman, however, would surely have us reflect on the likely errors we commonly make in the hiring and rewarding process, including many that conflate cause and effect. Candidate X’s CV and cover letter claim huge achievements from the leader’s short time at Jane Doe State. These achievements appear “remarkable,” maybe “unprecedented,” and are “backed” by big performance-indicator numbers as well as by witnesses (both on-site and remote) who readily and with honest conviction cite a big grant received, improved student learning outcomes, and so on. Even the very launching of large, eye-catching programs may be persuasive, given “optimism bias” and over-expectation; the “planning fallacy” leads leaders and supporters to greatly over-estimate both the likelihood and the weight of successes from fresh initiatives. The candidate’s articulated story of what was done and what “resulted” (ensued) is logical and plausible.

Yet all the while the happy record might well be more the result of other causes, many unseen and unknown, or of just random variation, with high performance likely to regress toward an established average. It is exactly the story of belief in and hiring of CEOs at Fortune 500 companies, as also of mythologized stock-pickers, mutual fund managers, and football coaches. “Outcome bias” leads us to over-attribute a positive present to actions recently taken; sometimes it is better to review how a college president made decisions than to go basically by what ensued. Nobody says that performances don’t vary or don’t matter in outcomes, only that we should be cautious about happy cause-and-effect evaluation.

Mindful of the phony industry of how-to claims based on selected success stories, I nonetheless heartily recommend Thinking to education scholars, practitioners, and policymakers, and I do so with optimism and high expectations for improvement in their professional as well as personal decision-making. Speaking from my own single, very vivid case: I have never been as rewarded by any other social science book I’ve read.
