In an earlier article, we introduced the Academic Unit Diagnostic Tool, or AUDiT. If you have used it to assess your home department and have identified problem areas, the next step is to consider points of potential intervention and reform.
Unit members may be reluctant to support such measures, however, if they don't believe problems exist in the first place. Rationalization and denial of this nature are at the heart of many academic difficulties, and such cognitive biases can be persistent and intractable. They are not new -- behavioral researchers have been studying such failures of logic for decades. We see these biases at work in many of the situations that characterize troubled academic units and in people's reactions (or failures to react) to those problems.
In this piece, we describe some of the most common and problematic cognitive biases and how they pose challenges to healthy academic units. In a follow-up essay, we will examine ways to come to grips with such biases, engage in intervention and repair, and foster more open and informed discussions about a unit’s strengths and shortcomings.
What Are Cognitive Biases?
Cognitive biases are errors in thinking that are found throughout human interactions. They can drive us to assume the best in ourselves and the worst in other people, to retain information that reinforces our existing beliefs while discounting or ignoring information that does not, and to judge ourselves by our intentions but others only by their actions. Such biases can hamper our interactions with others and increase conflict and interpersonal tensions. Their effects are so quick that we often do not even realize anything has happened. Working to identify and counteract these flaws in our own thinking, and learning to recognize them in others, can improve relationships in our working environments.
When one observes these kinds of errors in logic and cognition from a distance, they can be easy to identify. Some might seem so obvious that you can quickly convince yourself you would never fall prey to them, but that conviction is itself a known bias born of overconfidence. The truth is that all of us are susceptible -- no matter how self-aware we might feel, no matter how intelligent or well educated.
In fact, experience tells us that highly educated people can be especially vulnerable to cognitive missteps: they may have robust skills for rationalizing why particular situations should be considered exceptions. Cognitive biases affect people of all races, identity positions and cultures. They affect people with bad intentions and good ones, and while they are especially pernicious when people are tired or distracted, they come into play even when people are not. Guarding against their effects takes hard work and a dedication to forming good habits, and if you are committed to overcoming them, accepting that you are vulnerable to them is an important first step.
Drawing upon the fields of social and behavioral psychology, we examine cognitive biases below through the lens of academe, distilling the known traits of several that are most common -- and most counterproductive to a vibrant academic unit culture. While we’ve listed them as separate examples to make them easier to grasp, we also hope to make it clear that they are not entirely discrete phenomena. In many ordinary circumstances, they operate in concert.
Fundamental attribution error. This cognitive bias describes our tendency to credit our successes to our own abilities and to blame our failures on external environmental factors, while doing the opposite for others. A classic illustration involves automobile accidents: we often feel that an accident someone else caused was due to that person's ineptitude as a driver, while our own mishaps were the result of bad luck, poor road layout, adverse weather conditions, the actions of other drivers or confusing signage.
In academic contexts, one can see this tendency manifested, for example, in data collection and research outcomes. When it's your study that didn't yield the results you were hoping for, it is simply that "the data gods were not feeling benevolent that week." When it is your irritating coworker's project, it is "probably because his methods were sloppy" or "her analysis was poorly done."
Sinister attribution bias. When we exhibit this bias, we allow our personal feelings about another person to shape our assumptions about the reasons for their actions: we attribute less admirable motives to those whom we do not like and excuse or rationalize the conduct of those whom we do. For example, if you don't like Alex much and you are partial to Louise, when Alex is late for a meeting or the class he has to teach, you imagine him dismissively looking at the clock and shrugging his shoulders. But when Louise is late, you are more likely to envisage her in heavy traffic or dealing with a pressing matter.
Confirmation bias. This is one of the most common cognitive errors. It is the instinct to seek out or acknowledge only the information that supports your existing beliefs and to discount or reject data that go against them. So, you might remember previous hires from prestigious colleges (perhaps like your own!) as being among the best hiring decisions the unit has made, arguing that the same institutions should be emphasized in future hires. Meanwhile, you forget the several unsuccessful hires from those kinds of institutions and neglect some outstanding hires from less prestigious programs.
Anchoring bias. Our first impressions are often the easiest to reaffirm and some of the hardest to readjust. We tend to anchor on the initial information presented in a conversation, and anchoring bias and confirmation bias often go hand in hand: faculty members might remember a single comment made in a faculty meeting for years and project a colleague's future behavior from it. Anchoring also shapes negotiations by setting expectations and ranges, so if you're preparing to negotiate a job offer or a promotion, learning more about it is well worth the time.
The Dunning-Kruger effect. As Bertrand Russell once said, "The trouble with the world is that the stupid are cocksure while the intelligent are full of doubt." The Dunning-Kruger effect is observed when people who have little expertise or ability in a particular area assess their proficiency as greater than it is. A major occupational hazard in academe arises when people who are experts in one field believe they are justified in speaking with authority on other topics, whether or not they possess the requisite expertise. Conversely, other academics are reflexively insecure and doubtful about their abilities, needing reassurance or recognition far beyond what their colleagues require.
Motivated blindness. Many of us have encountered a case of motivated blindness -- the tendency to overlook bad news when it suits us or to fail to notice unethical behavior when noticing is not in our interest. This can be especially destructive in academe if, for example, a co-author plans to selectively limit the data shown in a joint article. Doing so makes the conclusions look stronger and more convincing, and while the other author knows the article won't tell the whole story, both really want the manuscript published. Moreover, no one wants to start an argument with a colleague, so the other author says nothing. Other manifestations of this bias can inhibit the kinds of frank and honest discussions a unit needs to have about its problems.
Egocentrism bias. One final type of cognitive bias that can afflict academic units is the tendency to think your position is right and that others will naturally agree with you. This assumption can leave one unprepared for honest differences of opinion or (combined with the other biases cited above) prompt the suspicion that when people disagree, they must have questionable motives. Egocentrism often colors the judgments of faculty members toward administrators, or vice versa, and can be the source of serious conflict and misunderstanding.
Protecting Yourself From Cognitive Biases
Arming yourself with knowledge can help you to recognize cognitive biases in yourself or in others, and to begin to work against their effects. One of the simplest and most straightforward ways to avoid cognitive biases is to consciously train yourself to ask questions and challenge your own assumptions and those made by others. Sometimes that means surrounding yourself with people you know will challenge you. Having someone on your team who is adept at playing devil’s advocate can help you make stronger decisions, because it prompts you to consider a wider range of factors and possibilities.
It is better to ask questions that confirm your understanding than simply to assume the information you have is correct. The more information you acquire and the more options you consider, the better equipped you will be to identify and choose the path you should take, rather than the one you want to take.
In the context of a troubled department, members may react in diverse ways to avoid having to accept responsibility: attributing worse motives to others than to themselves, seeing in the actions of others unprofessional conduct but not recognizing it in themselves, selectively citing examples to make problems look more serious (or less serious) than they are and so on.
One tendency that often leads people into trouble is to assume that there are always demonstrably right and wrong choices to make and outcomes to reach in dealing with difficult situations. Unfortunately, the world is rarely so simple, and many difficult situations have no clear resolution. A more useful approach is to think in terms of better versus worse as opposed to right and wrong. Seek interventions that move things further along the spectrum toward the better, rather than holding out for an ideal.
Understanding how cognitive biases can affect you personally is a continuing process of self-evaluation and assessment. While critical self-reflection can help us to recognize these processes at work, they never go away completely. Complacency -- thinking you are immune to these effects -- can itself lure you into errors of cognition. Protecting yourself from bias requires an open mind, curiosity and constant self-awareness.