The Fallacies of Faith

With remarkable ease, we form and sustain false beliefs. Led by our preconceptions, overconfident, persuaded by vivid anecdotes, perceiving correlations and control even where none may exist, we construct our social beliefs and then influence others to confirm them …

[I]f anything, laboratory procedures overestimate our intuitive powers. The experiments usually present people with clear evidence and warn them that their reasoning ability is being tested. Seldom does real life say to us: ‘Here is some evidence. Now put on your intellectual Sunday best and answer these questions.’ … to cope with reality, we simplify it.
 [1]Note 1. David Myers (2013), Social Psychology pp. 114-115, "Social Beliefs and Judgments."
The most consistent finding of psychology is that human thought and behavior are predictably irrational. No human, however intelligent, consistently forms accurate beliefs. Many people like to believe that they objectively and critically examine the information they gain from experience, using it to form true beliefs. Without realizing it, though, humans usually form their beliefs for social reasons and actively deny the possibility of being wrong. The human mind is not optimized to discover and believe truth; instead, it tends to delude itself.

Common mistakes in reasoning due to human psychology are called cognitive biases, formally defined as "the systematic patterns of deviation from norm or rationality in judgment, whereby inferences about other people and situations may be drawn in an illogical fashion." The biggest problems with cognitive biases are how difficult they are to overcome, how pervasively they affect everyone's reasoning, and how each person tends to ignore their influence on his or her own reasoning.

Below, I give an overview of cognitive biases and review research on some of the most important ones which mislead everyone into self-delusion. All of these should reduce your confidence in your own beliefs, and lead you to consider the possibility that many of them are unjustified or wrong. [2]Note 2. Most of the material in this section was adapted from an article that I posted online in May 2017: "Don’t Trust Your Beliefs: Our Irrational Minds, Part 1."

1. Hundreds of cognitive biases distort everyone's thinking processes, making everyone think inaccurately and unreliably.

Hundreds of different biases make all humans irrational, as shown in Wikipedia’s list of 185 cognitive biases. The image below is the most humbling that I have ever seen. Every line describes one distinct way that we often judge things incorrectly:


Here is a taste of the biases described by those lines:


All of those biases can be categorized based on what causes them, and the specifics of how they delude everyone: [3]Note 3. Also check out this poster including the name and definition of every bias in the 2017 "codex."


Yet it is not enough to simply show off how many biases there are. One must also realize how frequently they occur, and how much they tend to distort everyone's judgments. Fortunately, many researchers have examined the influence of cognitive biases.

2. Many overconfidence biases cause everyone to overestimate the probability that they are right and underestimate the probability that they are wrong.

Because the causes of bias are usually unconscious, people rarely notice them. Consequently, confidence in one's knowledge is dangerously misleading. For example, the "illusory superiority effect" is "a cognitive bias whereby a person overestimates his or her own qualities and abilities, in relation to the same qualities and abilities of other persons." Variants include the Downing effect, "the tendency of people with a below-average IQ to overestimate their IQ, and of people with an above-average IQ to underestimate their IQ," and the Dunning-Kruger effect, "a cognitive bias wherein persons of low ability suffer from illusory superiority when they mistakenly assess their cognitive ability as greater than it is" because of the "metacognitive inability of low-ability persons to recognize their own ineptitude." Consider Dunning and Kruger's initial study:

"[P]articipants were given specific tasks (such as solving logic problems, analyzing grammar questions, and determining whether jokes were funny), and were asked to evaluate their performance on these tasks relative to the rest of the group, … [A]ll four groups evaluated their performance as above average, meaning that the lowest-scoring group (the bottom 25%) showed a very large illusory superiority bias. … [G]iven training, the worst subjects improved their estimate of their rank as well as getting better at the tasks."
One encouraging result seems to follow from research on the illusory superiority effect: becoming more intelligent, educated, and self-aware should help people overcome their biases. However, a 2012 University of Toronto study found that intelligence, education, and self-awareness did not help participants overcome cognitive biases, and in some cases made the problem worse. Those traits correlated with a bigger "bias blind spot," the assumption that other people are more susceptible to bias than oneself. The bias blind spot arises from the ability to notice others' biases but not one's own, which in turn comes from having access to one's own conscious thoughts but not to anyone else's.

A 2015 study asked people to rate their knowledge of a list of academic topics, some of which did not exist. Participants who claimed to know more about the real subjects, and even those who actually knew more, were more likely to claim knowledge of the nonexistent topics, a phenomenon called "overclaiming." Even when participants were warned that some subjects were fake, they still overclaimed. Those who believe that they are smart enough to overcome cognitive biases could not be more wrong.

Many meta-analyses have been conducted to estimate how much overconfidence distorts most people's knowledge of their own abilities. A synthesis of 22 meta-analyses, including 354,739 participants in total, found that "the overall correspondence between self-evaluations of ability (e.g., academic ability, intelligence, language competence, medical skills, sports ability, and vocational skills) and objective performance measures (e.g., standardized test scores, grades, and supervisor evaluations) ... was .29." [4]Note 4. Zell & Krizan (2014), "Do People Have Insight Into Their Abilities? A Metasynthesis" pp. 111 & 116, PDF pp. 1 & 6. Since the correlation between your actual abilities and your perceived abilities is below .30, including for your thinking abilities, you should be cautious and skeptical about your own thoughts.

3. Confirmation bias makes everyone seek out evidence that they are right, and ignore evidence that they are wrong.

"The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises ... And such is the way of all superstitions, whether in astrology, dreams, omens, divine judgments, or the like; wherein men, having a delight in such vanities, mark the events where they are fulfilled, but where they fail, although this happened much oftener, neglect and pass them by."
—Francis Bacon [5]Note 5. Francis Bacon (1620), Novum Organum, in E. A. Burtt, The English Philosophers from Bacon to Mill p. 36, as cited in Nickerson (1998), "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises" (PDF).
Confirmation bias is the tendency to seek out or over-emphasize evidence confirming one's beliefs, and to avoid or diminish evidence contradicting them. It is "perhaps the best known and most widely accepted notion of inferential error to come out of the literature on human reasoning," [6]Note 6. Evans (1989), Bias in Human Reasoning: Causes and Consequences p. 41, as cited by Nickerson (1998). and rightly so: if one were to identify a "single problematic aspect of human reasoning that deserves attention above all others," confirmation bias would be a leading candidate.

Confirmation bias has often been explained in terms of cognitive dissonance, "the mental discomfort (psychological stress) experienced by a person who holds two or more contradictory beliefs, ideas, or values." Exposure to information that contradicts one's beliefs produces this discomfort, so people are motivated to avoid such information, especially since few are willing to change their beliefs immediately upon encountering a challenge to them.


The "backfire effect" is a more extreme form of confirmation bias where seeing evidence for an idea that one disagrees with can make one believe that idea less. The human brain reacts to information challenging its worldview in the same way as it reacts to a physical threat, totally rejecting that information to protect its worldview. This bias is especially problematic for someone trying to learn the truth by examining ideas they disagree with.

Here are a few examples of the research on confirmation bias and the backfire effect:
Those studies are only the tip of the iceberg of psychological research on cognitive biases; I found most of them because they were used as examples in popular videos and articles on the topic. This 1998 review summarizing the research on confirmation bias, [7]Note 7. The review has nine pages of references and has been cited 4,670 times as of September 2019. for example, provides a more in-depth look. Fortunately, some researchers have quantitatively summarized how much people tend to delude themselves with confirmation bias. A 2009 meta-analysis of 300 studies found a "moderate preference for congenial over uncongenial information (d = 0.36)" [8]Note 8. For reference, Cohen's effect size (d) tends to range from 0.01 (very small) through 0.5 (medium) to 2 (huge). [9]Note 9. Unfortunately, much of the research on confirmation bias and selective exposure to information "remains poorly integrated and somewhat isolated," according to Smith, Fabrigar, & Norris (2008), "Reflecting on Six Decades of Selective Exposure Research: Progress, Challenges, and Opportunities" p. 477 (PDF). The Hart et al. (2009) meta-analysis is the only one I have found so far. across the board. Although there was once some disagreement in the psychology literature about whether people selectively seek out information that confirms their beliefs, "the existence of selective exposure effects is no longer seriously questioned, and the conditions that produce them are extensively documented." [10]Note 10. Smith, Fabrigar, & Norris (2008) p. 478.
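The d = 0.36 figure has a concrete formula behind it: Cohen's d is the difference between two group means divided by their pooled standard deviation. A minimal sketch of that computation follows; the formula is standard, but the sample data are invented for illustration and happen to yield a larger d than the meta-analysis's 0.36:

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: standardized mean difference using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    mean_diff = statistics.fmean(group_a) - statistics.fmean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return mean_diff / pooled_sd

# Invented scores, not data from any study.
group_a = [2, 3, 4, 5, 6]
group_b = [1, 2, 3, 4, 5]
d = cohens_d(group_a, group_b)  # ~0.63, a medium-to-large effect
```

On the same scale, the meta-analysis's d = 0.36 means the "congenial" and "uncongenial" distributions overlap substantially, but shift reliably in the predicted direction.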

4. The binary bias makes everyone force many of their perceptions into known categories, and either distort or ignore perceptions which do not fit those categories.

It is common for different categorizations to influence the way we think about and perceive reality. Consider first that languages categorize things differently. As Tom Scott described, languages divide up the color spectrum differently: one language may use a single word for both green and blue, while another has two separate words for different shades of blue. With that in mind, consider this:

  1. Language A and Language B categorize things 1 and 2 differently.
  2. Putting things into different categories changes how one sees and thinks about those things.
  3. So, thinking in Language A will make someone see and think about things 1 and 2 differently than thinking in Language B.
Unfortunately, psychology research provides plenty of evidence for point 2. I will describe some of it in detail below. But to summarize all of that research in two basic points:

  1. Putting things into the same mental category makes someone see, and think about, them as more similar than they really are.
  2. Putting things into different mental categories makes someone see, and think about, them as more different than they really are.
Together, those tendencies compose "the binary bias," the "tendency to impose categorical distinctions on continuous data." In other words, if information is like a color spectrum, we usually chop it in half and understand it as one half versus the other. A recent series of six studies with 1,851 participants found that "[a]cross a wide variety of contexts...when summarizing evidence, people exhibit a binary bias:" [11]Note 11. Fisher & Keil (2018), "The Binary Bias: A Systematic Distortion in the Integration of Information" (PDF). "we tend to underestimate the difference between two facts...given the same categorical label, while we overestimate the difference between the same two facts…given different [ones]." [12]Note 12. Robert M. Sapolsky (2004), "The Frontal Cortex and the Criminal Justice System" p. 1788 (PDF).

The binary bias is so deeply wired into the brain that it operates even at the level of individual neurons. In one study, certain neurons would fire upon seeing "the image of a dog or a cat (but not both)." The experimenters used Photoshop to gradually morph an image from a cat into a dog. The neurons fired consistently until the image was 50% cat and 50% dog, then suddenly switched: "In other words, a neuron ‘considered’ a 60% dog to have more in common with a 100% dog than with a 40% dog." [13]Note 13. Sapolsky (2004), "The Frontal Cortex and the Criminal Justice System" p. 1789.
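That switching behavior is simply a hard threshold imposed on a continuum. A minimal sketch follows; the 50% cutoff and the cat/dog labels are the only details taken from the study above, and the function itself is hypothetical:

```python
def neuron_label(dog_fraction: float) -> str:
    # Hypothetical all-or-nothing response with a cutoff at 50%,
    # mirroring the cat-to-dog morphing study described above.
    return "dog" if dog_fraction > 0.5 else "cat"

# A 60% dog gets the same label as a 100% dog...
same_as_full_dog = neuron_label(0.60) == neuron_label(1.00)
# ...but a different label from a 40% dog, even though 40% and 60%
# are twice as close on the underlying continuum (0.2 vs. 0.4 apart).
differs_from_near_twin = neuron_label(0.60) != neuron_label(0.40)
```

The distortion is that distance within a category (0.6 to 1.0) is erased, while distance across the category boundary (0.4 to 0.6) is exaggerated, which is exactly the binary bias stated in continuous terms.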

Human mental category systems often prevent people from seeing particular things, as several psychology experiments have shown. For example, a group of people were shown a series of playing cards in quick succession, including a few "anomalous cards" like "a red six of spades and a black four of hearts." Participants almost always categorized the anomalous cards with the normal cards: "Without any awareness of trouble, it was immediately fitted to one of the conceptual categories prepared by prior experience. One would not even like to say that the subjects had seen something different than what they identified." When the exposure time was increased (by up to 40 times), most participants picked out the anomalous cards, but 10% never did, much to their distress. One said, "I can't make the suit out, whatever it is. It didn't even look like a card that time. I don't know what color it is now or whether it's a spade or a heart. I'm not even sure now what a spade looks like. My God!" [14]Note 14. These examples come from Thomas Kuhn, The Structure of Scientific Revolutions pp. 62-64.

At first, most people forced their observations to match the categories they had learned. Once the observations were made more glaring, most of them could change their categories after a while, but some never could.

Is there any way to escape the conclusion that everyone is hopelessly deluded because their brains distort their thoughts and perceptions based on the categories learned from past experience? Maybe. The best anyone can do is to keep breaking down learned categories into increasingly specific ones, and to rely on those category boundaries as little as possible.