Excerpt from How Religious Fundamentalism Hijacks the Brain by Bobby Azarian

There are many types of viruses and parasites, and viruses are themselves a kind of parasite. While biological viruses are infectious agents that self-replicate inside living cells, computer viruses are destructive pieces of code that insert themselves into existing programs and change those programs' behavior. One particularly nasty type of computer virus that relies on humans for replication, known as a “Trojan horse,” disguises itself as something useful or interesting in order to persuade individuals to download and spread it. Similarly, a harmful ideology disguises itself as something beneficial in order to insert itself into an individual's brain, so that it can instruct them to behave in ways that transmit the mental virus to others. The ability of parasites to modify a host's behavior in ways that increase their own “fitness” (i.e., their ability to survive and reproduce) while hurting the fitness of the host is known as “parasitic manipulation.”

One particularly intriguing example of parasitic manipulation occurs when a hairworm infects a grasshopper and seizes control of its brain in order to survive and reproduce. The parasite influences the grasshopper's behavior by inserting specific proteins into its brain. Essentially, infected grasshoppers become slaves to parasitic, self-copying machinery.

In much the same way, Christian fundamentalism is a parasitic ideology that inserts itself into brains, commanding individuals to act and think in a certain way—a rigid way that is intolerant of competing ideas. We know that religious fundamentalism is strongly correlated with what psychologists and neuroscientists call “magical thinking,” which refers to making connections between actions and events when no such connections exist in reality. Without magical thinking, the religion can neither survive nor replicate itself. Another cognitive impairment we see in those with extreme religious views is a greater reliance on intuitive rather than reflective or analytic thought, which frequently leads to incorrect assumptions, since intuition is often misleading or overly simplistic.

We also know that in the United States, Christian fundamentalism is linked to science denial. Since science is nothing more than a method of determining truth through empirical measurement and hypothesis testing, denial of science amounts to denial of objective truth and tangible evidence; in other words, a denial of reality. Not only does fundamentalism promote delusional thinking, it also discourages followers from exposing themselves to different ideas, which acts to protect the delusions that are essential to the ideology.

If we want to inoculate society against the harms of fundamentalist ideologies, we must start thinking differently about how they function in the brain. An ideology that tends to harm its host in an effort to self-replicate has all the properties of a parasitic virus, and defending against such a belief system requires understanding it as one. When a fundamentalist ideology inhabits a host brain, the organism's mind is no longer fully in control: the ideology directs its behavior and reasoning processes to propagate itself and sustain its own survival. This analogy should inform how we approach efforts to reverse brainwashing and restore cognitive function in areas like analytic reasoning and problem-solving.

The full article is here.

Solution Aversion: On the Relation Between Ideology and Motivated Disbelief by Troy H. Campbell and Aaron C. Kay

“Logically, one’s belief in the accuracy of a scientific finding should be independent of whether the findings and related consequences are undesirable. Yet, research in motivated reasoning shows that psychological motivations often direct reasoning, such that judgments of evidence are not independent of desires or motivations. Of importance, recent evidence has demonstrated that political ideology, defined as ‘an interrelated set of moral and political attitudes that possesses cognitive, affective, and motivational components,’ can similarly guide, funnel, and constrain the processing of information and alter behavior. Such motivated biases in cognition and behavior can occur for those holding conservative or liberal ideologies, depending on how the circumstances threaten or support one’s respective ideologies and intuitions.”

The full article is here.

We Are All Confident Idiots by David Dunning

“To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers—and we are all poor performers at some things—fail to see the flaws in their thinking or the answers they lack. In many cases, incompetence does not leave people disoriented, perplexed, or cautious. Instead, the incompetent are often blessed with an inappropriate confidence, buoyed by something that feels to them like knowledge.

…here is the real challenge: How can we learn to recognize our own ignorance and misbeliefs? To begin with, imagine that you are part of a small group that needs to make a decision about some matter of importance. Behavioral scientists often recommend that small groups appoint someone to serve as a devil’s advocate—a person whose job is to question and criticize the group’s logic. While this approach can prolong group discussions, irritate the group, and be uncomfortable, the decisions that groups ultimately reach are usually more accurate and more solidly grounded than they otherwise would be.

For individuals, the trick is to be your own devil’s advocate: to think through how your favored conclusions might be misguided; to ask yourself how you might be wrong, or how things might turn out differently from what you expect. It helps to try practicing what the psychologist Charles Lord calls ‘considering the opposite.’ To do this, I often imagine myself in a future in which I have turned out to be wrong in a decision, and then consider what the likeliest path was that led to my failure. And lastly: Seek advice. Other people may have their own misbeliefs, but a discussion can often be sufficient to rid a serious person of his or her most egregious misconceptions.

…wisdom may not involve facts and formulas so much as the ability to recognize when a limit has been reached. Stumbling through all our cognitive clutter just to recognize a true ‘I don’t know’ may not constitute failure as much as it does an enviable success, a crucial signpost that shows us we are traveling in the right direction toward the truth.”

The full essay is here.