On NPR this afternoon, I happened to catch the end of an interview with Michael Specter, a staff writer for the New Yorker magazine. Specter, who seems to make a living in the interstices between scientific knowledge and American public opinion, was speaking about his recent article on the subject of in vitro (artificial, lab-grown) meat. At the end of the piece, though, the host made reference to one of his books, titled "Denialism: How Irrational Thinking Hinders Scientific Progress, Harms the Planet, and Threatens Our Lives."
Just hearing that title got me thinking -- denialists evangelize against pretty much everything on which higher education depends, but specific denialisms are funded (directly or indirectly) by some of the same large corporate interests upon which colleges and universities must increasingly depend for research (or even operational) funding. Coming out directly and publicly against the US Chamber of Commerce on environmental issues (for example) isn't going to help any school's development prospects, even when addressing firms and foundations other than the handful that control the Chamber.
But engaging denialism on broad, abstract grounds -- entering into an epistemological debate, if such can be done without losing the audience in jargon or wrapping ourselves around some post-modern axle -- might be possible at an acceptable level of risk. After all, one of the reasons corporations and foundations support higher ed is that they like to wrap themselves in the mantle of knowledge. Defending that mantle against all comers, as a result, should be considered unobjectionable behavior.
Not that Specter's book, I think, will provide much of a road map. If the publisher's blurb on Amazon is to be believed, Specter comes out unabashedly in favor of some technologies (genetically manipulated crops, unlimited animal testing, etc.) that may not provide advocates of scientific knowledge with their strongest means of public suasion.
But the brothers Hoofnagle and an anonymous associate provide an alternative map. Their denialism blog is full of information on which to base an anti-denialism attack strategy. I particularly like their list of the six ground rules for being a denialist:
1. Allege that there's a conspiracy. Claim that scientific consensus has arisen through collusion rather than the accumulation of evidence.
2. Use fake experts to support your story. "Denial always starts with a cadre of pseudo-experts with some credentials that create a facade of credibility," says Seth Kalichman of the University of Connecticut.
3. Cherry-pick the evidence: trumpet whatever appears to support your case and ignore or rubbish the rest. Carry on trotting out supportive evidence even after it has been discredited.
4. Create impossible standards for your opponents. Claim that the existing evidence is not good enough and demand more. If your opponent comes up with evidence you have demanded, move the goalposts.
5. Use logical fallacies. Hitler opposed smoking, so anti-smoking measures are Nazi. Deliberately misrepresent the scientific consensus and then knock down your straw man.
6. Manufacture doubt. Falsely portray scientists as so divided that basing policy on their advice would be premature. Insist "both sides" must be heard and cry censorship when "dissenting" arguments or experts are rejected.
To my mind, this sort of high-level critique, expressed in non-threatening language, feels about right. (Of course, since two of the three bloggers are MDs, perhaps they're used to discussing scientific truths in minimally threatening terms.)
Can it work? I don't know. Could it work? Perhaps, if done well. Can we get colleges, universities, perhaps medical schools to engage in defending their epistemological turf? I would like to think so.
My fear, however, was well summed up by Agent K (Tommy Lee Jones) in the movie Men in Black: "A person is smart. People are dumb, panicky, dangerous animals, and you know it." The truth of that statement is proven by the very fact that denialism works. Consistently.