
On numerous occasions I’ve been asked, “How do you know that? I read on the internet …” And I’ve heard and overheard, “Did you know (fill in the blank with some crazy conspiracy theory)?” Worse, some politicians want to dictate and police what can be taught as fact and what people are allowed to know.

To say I’m frustrated or mad would be an understatement. If someone could read the cartoon thought bubble hovering above my head, it would read, “What the ever-loving %&*#!” And now, I ask my higher ed colleagues, “What are we doing? What. Are. We. Doing?” If nothing else, higher education must teach people how to know shit from Shinola, fact from fiction, and truth from lore.

Yep. How we know and why we should know something is as important as what we know. It comes down to the discipline people often target for elimination when there is a budget deficit—philosophy (epistemology, ethics, ontology, semiotics, etc.). Perhaps more pointedly, what essential tools, skills and questions can we arm our students and graduates with to question assertions of truth and fact?

I’ve found that people no longer ask fundamental questions before they believe something. They’ve become incurious, reactionary and paranoid. They willingly participate in propagating lies. The best I’ve been able to do when confronted by people peddling falsehoods and fear-inducing nonsense is to ask a series of questions and explain a couple of concepts (if the person is willing to engage in the conversation).

Are you getting information from a trustworthy source? How do you know it is reliable?

For example, if you’re getting your information from the internet, what kind of site is it—.com, .net, .org, .edu? An accredited entity with a mission to impart knowledge through research and investigation (usually a .edu or .org) is likely more trustworthy than a .com site that exists to entertain, provide partisan political commentary or sell you something. Where did the information originate? What was the process of gathering that information? Facts are not determined by hearsay. Facts are known through documentation, eyewitnesses and primary sources.

Who is speaking? Why? What’s their objective and point of view?

People are influenced by their lived experiences, belief systems, needs and desires. Asking what motivates someone to speak is as important as weighing what they say. Should you trust someone whose ulterior motive is to subjugate, harm, discriminate against or otherwise treat others unjustly? No, because that would be morally corrupt and reprehensible. Trustworthy people living in a civil society care about basic human dignity.

What is an expert? Who can be an expert?

Someone isn’t an expert by self-proclamation (there are other words for that—charlatan, snake oil salesman, narcissist). Expertise is earned, practiced and recognized by other experts and scholars. One becomes an expert through study and research employing rigor, discipline and dogged practice. The title is acquired through academic degrees, significant bodies of work reviewed and vetted by peers and known experts, and honors bestowed by reputable entities. For example, who would you trust to advise you on immunizations: an on-air personality with a bachelor’s degree who has been sued for defamation, or a Nobel Prize–winning infectious disease scientist? Trusting and believing the on-air personality over the scientist is just willful ignorance. We live in a democratic society where people are free to say what they will within the confines of the law, but that doesn’t mean everyone is an expert or should be trusted to impart facts.

While every discipline teaches research methods and requires capstone/thesis projects, practicums, dissertations, residencies and internships, we must do much more to teach people how to assess information critically, with both skepticism and openness. Should first-year seminars focus on fact versus fiction through a series of readings and explorations? I’d start with Carl Sagan’s chapter “The Fine Art of Baloney Detection” from his final book, The Demon-Haunted World: Science as a Candle in the Dark (1996). Perhaps we’d include a variety of disciplinary approaches, such as Pamela Meyer’s Liespotting: Proven Techniques to Detect Deception (2010), Orson Welles’s F Is for Fake (1973) and Guy Kawasaki’s Reality Check (2011).

Here are Sagan’s nine tools:

  1. Wherever possible there must be independent confirmation of the “facts.”
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight—“authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  7. If there’s a chain of argument, every link in the chain must work (including the premise)—not just most of them.
  8. Occam’s razor. This convenient rule of thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.

  9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our universe and everything in it is just an elementary particle—an electron, say—in a much bigger cosmos. But if we can never acquire information from outside our universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.
