In the Teeth of the Evidence
One of the things we hope students will learn by seeking out information in the library and online is the value of using evidence in the formulation of ideas. We tend to focus on students' most immediate need: completing an assignment that is due in the next few weeks. But as we expose them to the tools of research and engage them in the work of evaluating and sifting their search results, we hope that they are gaining a respect for evidence as well as some familiarity with the various methods of making sense of the world that they encounter in different disciplines, from the close analytical reading they do in an English class to the examination of primary sources in history to conducting empirical research in the sciences. By doing all this, we hope that they have the tools and the inclination to make up their own minds based on the best evidence.
So I was intrigued to read a news story in the Boston Globe about research in political behavior. It turns out that people who have made up their minds are not receptive to information that doesn't support their beliefs. I tracked down some of the research mentioned in the article to see how the studies were conducted. (I'm nerdy that way.) Essentially, James Kuklinski and others found that people who held strong beliefs wouldn't let facts stand in their way. Those who were least well informed were also the most confident in their mistaken beliefs. (I use "mistaken" here because they were factually wrong, and those misperceptions of fact conspired with their opinions about which policies should be adopted.) Brendan Nyhan and Jason Reifler recently devised several experimental procedures to see how people respond to corrections of information. Not well, apparently. When people read false information and then a correction of it, they tend to dig in their heels and become even more convinced of the wrong information, a "backfire" effect that increases their insistence that the misinformation is correct.
This is all very depressing. We have enough of a challenge giving students the know-how to locate good information. I am reminded of Peter Elbow's notion of the "believing game." Rather than teach students the art of taking a text apart and arguing with it, like a dog worrying a dead squirrel, he thought there was value in entering into ideas and doing our best to understand them from the inside, rather than taking a defensive position and trying to disprove them as a means of understanding. I am also reminded of research done by Keith Oatley (and discussed by him here) suggesting that those who read fiction engage in a kind of simulation of reality that leads them to become more empathetic, and more open to experiences they haven't had.
Which is interesting. Kuklinski's research is about a different kind of story. We like facts packaged in compelling rhetorical frames because the stories comfortingly jibe with our world view and our political positions. They reaffirm our beliefs and defend us from confusion. These stories are so emotionally convincing that we tend to pay much more attention to them than to the facts they contain, and when the facts conflict with the stories, we choose the stories.
Maybe instead of teaching rigorous analysis that tests ideas to see if they break, we need to put a little more emphasis on understanding them better first. A little more empathy and a lot more respect for evidence could go a long way.
A PS to this little paper chase of mine: this exercise of tracing sources mentioned in a news story convinces me we need to do a much better job of making research findings accessible in every sense of the word. When you are engaged in a debate online, the links that are easily found to support your position tend to come in the form of opinion pieces and news stories. So much of our scholarly work is locked up behind paywalls that even finding the research referred to in these opinion and news sources takes a lot of detective skill and patience, and when you do find it, you can't provide links that work. If we want our work to matter, if we want the evidence we gather to make a difference, we need to think about making it more accessible, not just in terms of readability but findability. Kudos to the authors who have made their work open access, and kudos to those publishers and libraries who help.
Inside Higher Ed’s Blog U