Understanding and Navigating Cognitive Biases: Part 2

Sebastian Wraight, Nicholas C. Burbules, C. K. Gunsalus and Robert A. Easter explore how to reduce the problems cognitive biases can produce in your academic department.

October 24, 2018
 
 

What do you do in a departmental discussion when someone says the equivalent of “I don’t care about the facts -- my mind is made up,” or when a heated conversation just doesn’t seem to be about what it’s about?

In previous articles, we introduced the Academic Unit Diagnostic Tool (AUDiT) and explored the types of cognitive biases -- common patterns of mental errors -- that can interfere with an honest, frank assessment of the status of one’s own department. Denial that there is a problem is itself a serious problem. Here we continue to synthesize what is known about these biases -- to which we are all subject -- and explore how to recognize and reduce the interference they produce, in order to foster more open and constructive conversations about a unit’s strengths and shortcomings.

Our focus is primarily on the challenges of interacting with department members, including the department’s leaders, who may be reluctant to acknowledge causes of dysfunction in the unit or even that dysfunction exists at all. Sometimes such differences of perspective reflect honest disagreements about the facts or how heavily to weight them in assessing the unit.

Other times, however, the root of the differences may instead be in the form of denial through cognitive biases. That denial can manifest in any number of ways: as a department head, you might find yourself facing a host of faculty members pointing fingers at one another -- or they may all be pointing at you. Combatants may be furiously engaged in rationalizing their own behavior because “so-and-so did something else just as bad!” or “I had to stand up for principle!”

It can be difficult to recognize cognitive biases in action, as they can subtly subvert our thinking without our even realizing it. They often arise from strongly held beliefs and from the natural human tendency toward egocentric bias, which leads us to see the world through our own filters and perceptions. These biases matter both because they can be a source of departmental dysfunction and because they can interfere with identifying and acting upon the problems a unit faces. Some methods that can be useful in coming to grips with these issues include the following.

Ask, Don’t Tell

One of the most effective means of revealing and overcoming cognitive biases is to ask questions. The challenge is how to pose questions constructively, in a spirit of inquiry -- and not to deploy them as weapons to label, humiliate or vanquish others. Of course, since one of the markers for cognitive biases in action is an unwillingness to accept questions, it can take some practice and tact to cultivate the mind-set and the skill required to ask questions that advance -- not escalate -- any complex discussion.

For many reasons, people often experience self-doubt and hesitation when it comes to asking questions. Outside certain kinds of formal settings where it is expected (an academic presentation, let’s say), there are social norms against skeptical questioning, which is often seen as overly aggressive. In politics, questions -- for example, from reporters -- are often characterized as “disrespectful” or “hostile,” especially when they seek out uncomfortable or inconvenient facts. We see the same dynamics in academic departments.

Given these larger social dynamics in our culture, people sometimes view questions as power plays, acts of dominance or microaggressions. The intention -- or perceived intention -- behind the question, the context and the relative positions and status of the questioner and questioned can all reinforce those perceptions. Thus, we often see -- even in academic settings that are supposed to be about the free and open exchange of ideas -- a certain laissez-faire tolerance toward the views and opinions of others, even when we believe them to be seriously misguided, or even dangerous.

More prosaically, asking questions of others can be awkward, whether because of concerns about looking uninformed or foolish, an aversion to pestering others, or not wanting to appear to disagree. Consider the alternative, though: without asking questions to confirm information, intentions and events, we tend to make assumptions, which often leads to trouble. As we often say in our project group when trying to work through complex issues, “Mind-reading is a highly imperfect form of communication.”

So, if questions framed poorly or used with ill intent are counterproductive, what kinds of questions invite the type of self-reflection that can begin to uncover confirmation bias, self-deception or an unwillingness to consider the possibility of being wrong? And, if we are to engage others in this fashion, what does that commit us to, in terms of reciprocity?

As discussed in our previous article on cognitive biases, characterizations of the positions of others as right or wrong, or correct and incorrect, are powerfully charged and often encourage defensiveness that hampers productive discussions. As a result, one of the least effective approaches is to begin any exchange with the expectation of convincing the other person, or persons, that they are “wrong” and you are “right” -- even when (or especially when) you strongly believe that they are “wrong” and you are “right.”

Instead, try to elicit shared goals and interests, if possible, and then work through “better” and “worse” approaches for reaching a particular goal, defining stages of progress along a spectrum. If you can bring others far enough along to begin to see the possibility of flaws or holes in their positions -- or show that you are willing to do the same -- they may make the rest of that journey themselves by starting to consider other options. And they will be even more likely to do so if they can do it without appearing to “lose.”

What Are the Roles?

How one engages with other people about difficult issues depends on the respective roles of each person in the interaction. How you approach a subordinate will be very different from how you approach a peer or a supervisor. Are you dealing with a group of people or just one? What is the history of your interaction with this person? Such factors will often influence how people perceive your questions, no matter how carefully you word them.

Leaders are as susceptible to cognitive biases as anyone, and when they (we) fail to acknowledge and counteract those biases, the consequences are often proportionally larger. For example, if you open a topic in a meeting by assuming that most people in attendance agree with you without bothering to ask questions to confirm that assumption, you may be actively fostering discontent and conflict within your unit.

As a leader, establishing a culture of encouraging questions can help to inoculate your unit against many of the most common and pernicious cognitive biases. Gathering more information and additional perspectives is almost never a bad thing in preparing to make decisions, and if that is the tone you set as the leader, then that is the model that the people around you will be more likely to adopt. Take care that your language does not exacerbate ideological or other divisions. When discussing how to improve and repair dysfunctional units, articulate what we can do together to move things along that spectrum toward “better” performance, rather than blaming or focusing on the actions of individuals.

Another useful tactic is to incorporate “third point” perspectives, so that the lens of attention is not on any one person or group. If a subunit within your department has an inefficient or ineffective process, demanding an explanation for “how they could do something so wrong!” is likely to elicit defensive reactions, increase reluctance to change and hinder acknowledgment that change is needed. Pointing to external data, a report or even an environmental or institutional threat (e.g., competition from another unit), and using that to appeal to common goals can reinforce that this is a process among colleagues with shared interests.

An AUDiT review can serve this purpose by surfacing shared concerns that might otherwise be left unspoken, or citing data that highlight objective conditions that are not in themselves subject to dispute -- even if the choice of what to do about them might be.

Another effective approach can be to provide an example of another institution’s methods or system and ask people to explore their strengths and weaknesses. In some cases, the act of simply explaining such differences is powerful enough to demonstrate their benefits and drawbacks, and because this is (initially, at least) talking about others, it raises potential issues in a manner that doesn’t point fingers at anyone in particular or assign blame internally. Gathering data and information on how other institutions handle issues can also help illuminate local habits rooted in “that’s how we’ve always done it” mentalities. In academe, we frequently encounter a deep reluctance to learn from the experiences of others; we often assume that our own particular challenges are unique. Our surveys and discussions with others using the AUDiT dashboard point out quite the opposite: troubled units encounter the same challenges again and again.

Sometimes You Must Be Blunt

Of course, you can take all the measures in the world to be tactful and nonconfrontational in how you approach these issues and still find that the message is not getting through. Cognitive biases can be entrenched and difficult to undo. On such occasions, you may have to be more straightforward: remember that it is possible to be direct without being rude or cruel. Take the time to think about precisely what you want to say, and the points you want to convey. Make sure to have data or materials with you to support your conclusions and ideas concretely, so that they cannot be dismissed as misinformed opinion.

The ultimate goal in most of these situations is to get people to step outside their box and see things from a different perspective, even if only briefly. A narrow perspective is one of the most common causes of virtually every kind of cognitive bias, and those biases then strengthen our conviction that the only correct perspective is our own. That is a vicious and damaging feedback cycle that can be challenging to interrupt. At times, all that can be done, at least in the short term, is to draw attention to a point of contention, an alternative option or a means of improvement. The first step of realizing that the status quo is not inevitable can be a starting point for the investigation of further change.

Whether you are dealing with just one especially intractable individual or a larger group of people who are misinformed, the idea is to get everyone moving in the same direction toward a shared goal of “doing better.” The problems that a unit leader faces in grappling with dysfunction can be myriad and daunting, so it is crucial to avoid the trap of trying to sort out who is “right” and who is “wrong.” These situations are rarely cut-and-dried, and even if the lines of division are fairly clear, pointing that out usually isn’t productive and can serve to deepen conflict.

The kind of leader who is most successful in these situations is one who works to maintain a “big-picture” perspective in discussions and who can project the idea that everyone is presumed to share the goal of moving the department back to a more vibrant and productive state. Doing so involves overcoming divisions of “us versus them”; dispelling the idea that throughout all the chaos, somebody was at fault; and finding a common interest for everyone to strive toward. When the conversations taking place start to become more about what “we” can do to arrive at a place that is better for all of us -- rather than what, say, “he” or “she” needs to do to stop mucking it up for everyone else -- then you will know you are on a better track forward.

Bio

Sebastian Wraight is a project associate at the National Center for Professional and Research Ethics (NCPRE) at the Coordinated Science Laboratory at the University of Illinois at Urbana-Champaign. Nicholas C. Burbules is the Gutgsell Professor in the department of educational policy, organization and leadership at the university. C. K. Gunsalus is the director of NCPRE, professor emerita of business and research professor at the Coordinated Science Laboratory. Robert A. Easter is president emeritus and dean of agriculture, consumer and environmental sciences emeritus at the University of Illinois.
