Excerpts from The Burden of Skepticism by Carl Sagan

…[W]hen we recognize some emotional vulnerability regarding a claim, that is exactly where we have to make the firmest efforts at skeptical scrutiny. That is where we can be had.

…[One letter I received] said that as an inveterate skeptic I have closed my mind to the truth. Most notably I have ignored the evidence for an Earth that is six thousand years old. Well, I haven’t ignored it; I considered the purported evidence and then rejected it. There is a difference, and this is a difference, we might say, between prejudice and postjudice. Prejudice is making a judgment before you have looked at the facts. Postjudice is making a judgment afterwards. Prejudice is terrible, in the sense that you commit injustices and you make serious mistakes. Postjudice is not terrible. You can’t be perfect, of course; you may make mistakes also. But it is permissible to make a judgment after you have examined the evidence. In some circles it is even encouraged.

…If science were explained to the average person in a way that is accessible and exciting, there would be no room for pseudoscience. But there is a kind of Gresham’s Law by which in popular culture the bad science drives out the good. And for this I think we have to blame, first, the scientific community ourselves for not doing a better job of popularizing science, and second, the media, which are in this respect almost uniformly dreadful. Every newspaper in America has a daily astrology column. How many have even a weekly astronomy column? And I believe it is also the fault of the educational system. We do not teach how to think. This is a very serious failure that may even, in a world rigged with 60,000 nuclear weapons, compromise the human future.

I maintain there is much more wonder in science than in pseudoscience. And in addition, to whatever measure this term has any meaning, science has the additional virtue, and it is not an inconsiderable one, of being true.

The full article can be found here.

Solution Aversion: On the Relation Between Ideology and Motivated Disbelief by Troy H. Campbell and Aaron C. Kay

“Logically, one’s belief in the accuracy of a scientific finding should be independent of whether the findings and related consequences are undesirable. Yet, research in motivated reasoning shows that psychological motivations often direct reasoning, such that judgments of evidence are not independent of desires or motivations. Of importance, recent evidence has demonstrated that political ideology, defined as ‘an interrelated set of moral and political attitudes that possesses cognitive, affective, and motivational components,’ can similarly guide, funnel, and constrain the processing of information and alter behavior. Such motivated biases in cognition and behavior can occur for those holding conservative or liberal ideologies, depending on how the circumstances threaten or support one’s respective ideologies and intuitions.”

The full article is here.

We Are All Confident Idiots by David Dunning

“To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers—and we are all poor performers at some things—fail to see the flaws in their thinking or the answers they lack. In many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.

…here is the real challenge: How can we learn to recognize our own ignorance and misbeliefs? To begin with, imagine that you are part of a small group that needs to make a decision about some matter of importance. Behavioral scientists often recommend that small groups appoint someone to serve as a devil’s advocate—a person whose job is to question and criticize the group’s logic. While this approach can prolong group discussions, irritate the group, and be uncomfortable, the decisions that groups ultimately reach are usually more accurate and more solidly grounded than they otherwise would be.

For individuals, the trick is to be your own devil’s advocate: to think through how your favored conclusions might be misguided; to ask yourself how you might be wrong, or how things might turn out differently from what you expect. It helps to try practicing what the psychologist Charles Lord calls ‘considering the opposite.’ To do this, I often imagine myself in a future in which I have turned out to be wrong in a decision, and then consider what the likeliest path was that led to my failure. And lastly: Seek advice. Other people may have their own misbeliefs, but a discussion can often be sufficient to rid a serious person of his or her most egregious misconceptions.

…wisdom may not involve facts and formulas so much as the ability to recognize when a limit has been reached. Stumbling through all our cognitive clutter just to recognize a true ‘I don’t know’ may not constitute failure as much as it does an enviable success, a crucial signpost that shows us we are traveling in the right direction toward the truth.”

The full essay is here.