
Confirmation bias

“When I research a topic, I favour information that supports my hypothesis and ignore information that contradicts it.”

Definition

Confirmation bias is the tendency, often unconscious, to give undue weight to information that confirms a hypothesis, to the detriment of information that contradicts it. This bias can take different forms. Information that fits our hypothesis may be remembered more easily [1] or given more weight than information that contradicts it, and sources that support the hypothesis may come under less critical scrutiny [2]. The bias can arise when the subject touches on emotions, opinions or beliefs, but also in neutral contexts where these factors are not at play.

Example

Proponents of fossil fuels might favour information about job creation in that industry and be critical of information in favour of green energy. When discussing energy sources, they might have only a vague recollection of studies on the impact of oil extraction on climate change, but readily recall the economic arguments in its favour.

Explanation

Confirmation bias is thought to be linked to a heuristic called the positive test strategy [3], which simplifies our processing of information. When testing a hypothesis, trying to examine all the evidence that might contradict it (a negative test strategy) is very costly in terms of effort, if not impossible. With the positive test strategy, we instead look for cases that should fit our hypothesis if it is true, which can both support it and help rule out alternatives. However, when we use the positive test strategy without also imagining alternative hypotheses, we fall into confirmation bias [4].

Consequences

Confirmation bias can give a false impression of objectivity and contribute to overconfidence and to the maintenance of beliefs in the face of contrary evidence. By surrounding themselves with information and people who confirm their opinions, as in echo chambers, some people tend to entrench themselves in increasingly rigid and extreme positions. The result can be a polarization that even criticism grounded in a scientific consensus fails to curb.

Thoughts on how to act in light of this bias

  • Accept that our mind can deceive us, despite our good intentions!

  • Reflect on alternative hypotheses and look for situations that are likely to contradict our beliefs.

  • Establish an information evaluation procedure in advance and refer to it systematically.

How is this bias measured?

The first experimental measure of confirmation bias was the iconic “2-4-6” task [5]. Participants are asked to discover the rule behind a series of numbers. Told that the sequence “2-4-6” conforms to the rule, they must propose further sequences of numbers and ask the experimenter whether each one conforms to the rule or not. Participants tend to assume that the rule is “increasing even numbers”. They therefore keep testing their hypothesis by proposing increasing sequences of even numbers until they believe they have confirmed it. But by only trying to confirm their hypothesis, they do not realize that the series in fact conforms to a simpler rule: “any increasing sequence of numbers”. Had they submitted a potential falsifier of their hypothesis, such as the sequence “7-8-9”, positive feedback from the experimenter would have allowed them to adjust their initial hypothesis and potentially discover the rule in question. We observe a confirmation bias when participants report having found the (wrong) rule without having tested alternatives to their hypothesis.
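To make the logic of the task concrete, here is a minimal sketch in Python (not part of the original entry); the hidden rule, the participant's hypothesis and the specific test sequences below are illustrative assumptions rather than materials from the original study.

# Minimal sketch of the 2-4-6 task: a positive test strategy never
# challenges a hypothesis that is narrower than the experimenter's rule.

def hidden_rule(triple):
    """Experimenter's actual rule: any strictly increasing sequence."""
    a, b, c = triple
    return a < b < c

def my_hypothesis(triple):
    """Participant's hypothesis: increasing even numbers."""
    a, b, c = triple
    return a < b < c and all(n % 2 == 0 for n in triple)

# Positive test strategy: only propose sequences that fit the hypothesis.
positive_tests = [(2, 4, 6), (10, 12, 14), (4, 8, 20), (100, 200, 300)]
assert all(my_hypothesis(t) for t in positive_tests)

for triple in positive_tests:
    # Every answer is "yes", so the hypothesis is never challenged.
    print(triple, "->", "yes" if hidden_rule(triple) else "no")

# A potential falsifier: fits the hidden rule but violates the hypothesis.
# The experimenter still answers "yes", revealing that evenness is irrelevant.
print((7, 8, 9), "->", "yes" if hidden_rule((7, 8, 9)) else "no")

Because every sequence produced by the positive test strategy satisfies both the hypothesis and the hidden rule, the feedback is always positive; only a probe that violates the hypothesis, such as 7-8-9, can show that the hypothesis is too narrow.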

This bias is discussed in the scientific literature: (rating scale image)

This bias has social or individual repercussions: (rating scale image)

This bias is empirically demonstrated: (rating scale image)

References

[1] Frost, Peter, Bridgette Casey, Kaydee Griffin, Luis Raymundo, Christopher Farrell & Ryan Carrigan (2015). The influence of confirmation bias on memory and source monitoring. The Journal of General Psychology, 142(4), 238-252.

[2] Jones, Martin & Robert Sugden (2001). Positive confirmation bias in the acquisition of information. Theory and Decision, 50(1), 59-99.

[3] Klayman, Joshua & Young-won Ha (1987). Confirmation, disconfirmation, and information in hypothesis testing. Psychological Review, 94, 211-228.

[4] Oswald, Margit E. & Stefan Grosjean (2004). Confirmation bias. In Rüdiger F. Pohl (Ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory (pp. 79-96). New York: Psychology Press.

[5] Wason, Peter C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129-140.

Tags

Individual level, Anchoring heuristic, Need for cognitive closure

Author

Janie Brisson, PhD, Postdoctoral researcher, Centre National de la Recherche Scientifique (CNRS), Université de Paris.

Translated from French to English by Susan D. Renaud.

How to cite this entry

Brisson, J. (2020). Confirmation bias, trans. S. D. Renaud. In C. Gratton, E. Gagnon-St-Pierre, & E. Muszynski (Eds). Shortcuts: A handy guide to cognitive biases Vol. 1. Online: www.shortcogs.com

