Hindsight bias

“I knew it was going to happen all along.”


Hindsight bias is the tendency to overestimate our ability to have predicted the outcome of an event once that outcome is known. Once we know the outcome of an uncertain event (such as a political election), we tend to say that we knew the outcome "from the start", even though this would not have been possible [1]. Three forms of hindsight bias have been documented to date: memory distortion, overestimation of the predictability of an event, and the impression that the outcome of the event was predetermined [2]. In most cases, we revise the beliefs we held about the event before its outcome, and subsequently perceive the outcome as more predictable than it actually was [3]. The person "anchors" their belief in their post-outcome perspective and is unable to recall what they believed before the event occurred [4]. We must be careful not to confuse hindsight bias with our normal capacity to learn: hindsight bias appears only when, upon learning the outcome of an event, we forget the belief we held beforehand. In a genuine learning process, we ideally remain aware that, before learning the actual outcome, some of our beliefs were erroneous.


Polls in a political election, a week before voting day, show that two parties, A and B, have equal chances of victory. You are chatting with a friend, who tells you that Party A's chances of winning are equal to those of Party B. A week later, the results of the vote are announced: Party A has won by a considerable margin. The day after the election, you talk with this same friend, who now tells you that it was obvious Party A would win and that she was certain of it the week before the election [5].


Several things seem to be able to lead a person to this bias: for example, nurturing the belief that the outcome of an event had to occur [4], or believing that the world is organized in a structured and determined way. These beliefs in a predetermined world can make us want to predict a clear, definite outcome for events that are in reality only more or less probable, and can lead us to anchor our beliefs about an event on only one of its possible outcomes. The moment we learn that an event has ended in a certain way, we have difficulty considering the other possibilities, since our belief is "anchored" in the outcome that occurred. The desire to look good can also influence us: saying that we already knew the result of an event before it was resolved can make us appear intelligent [1].


Hindsight bias can result in overconfidence. If we mistakenly think that we already knew the outcome of an event before it happened, the learning that can be derived from the event is limited. This can give us inordinate confidence in our ability to predict the outcome of future events [1].

More concretely, this bias has repercussions in a wide variety of social situations, such as politics, the justice system, education, or medical practice. If a result becomes obvious to us after the fact, it can impact our judgments regarding just how obvious the outcome should have been in the first place. For example, after a condition is diagnosed, a doctor might feel guilty for not having predicted what they think should have been predictable, and thus for failing to diagnose a disease early. In the same vein, a patient whose case took much time to diagnose could question the doctor's skills, since the diagnosis now seems obvious to them [4].

Thoughts on how to act in light of this bias

  • We should try to remember the belief we had at the start, and fight against the "anchoring" in the post-outcome belief (this can be difficult given the strength of the bias) [1].

  • It helps to consider the possible alternatives to an event that has occurred, and not think that if an event has occurred, it must necessarily have occurred in this way [5].

  • Expertise in a field could help fight hindsight bias: we are more inclined to consider all the alternatives for the outcome of an event if we are familiar with the subject [5].

  • It can help to discuss the results of an event with friends, both before and after its outcome, and to confirm with them what our beliefs were at the outset.

  • We can adopt a posture of intellectual humility: no one can know or predict everything.

How is this bias measured?

Laboratory experiments to measure this bias usually proceed in three steps. First, participants receive information about an event (for example, an armed conflict between two countries). Participants are then divided into two groups: one receives information about the actual outcome of the event (how the armed conflict was resolved), while the other receives no such information. Finally, both groups are asked to estimate the probability of each of the event's possible outcomes as if they did not know how it ended. In general, the group that knows the real result estimates its probability as significantly higher than does the group that does not [1].
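The three-step design above can be sketched as a small simulation. The specific numbers (a prior estimate centered on 0.5, an upward shift of 0.15 for the outcome-informed group) are illustrative assumptions chosen for this sketch, not values taken from the literature:

```python
import random
import statistics

def simulate_hindsight_experiment(n_per_group=100, bias_shift=0.15, seed=42):
    """Sketch of the three-step hindsight-bias design.

    Step 1: all participants form a prior estimate of an uncertain
    outcome (here, centered on 0.5, i.e., "could go either way").
    Step 2: participants are split into a control group (told nothing)
    and an informed group (told the actual outcome).
    Step 3: both groups estimate the outcome's probability "as if" they
    did not know it; the informed group's estimates drift upward by an
    assumed amount (bias_shift), modeling the hindsight effect.
    """
    rng = random.Random(seed)

    def prior():
        # A noisy prior estimate, clipped to the [0, 1] probability range.
        return min(max(rng.gauss(0.5, 0.1), 0.0), 1.0)

    control = [prior() for _ in range(n_per_group)]
    informed = [min(prior() + bias_shift, 1.0) for _ in range(n_per_group)]
    return statistics.mean(control), statistics.mean(informed)

control_mean, informed_mean = simulate_hindsight_experiment()
```

Under these assumptions, the simulated informed group reports higher probability estimates for the known outcome than the control group, which is exactly the between-group difference the experiments measure.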


[1] Hawkins, Scott A., & Reid Hastie (1990). Hindsight: Biased judgments of past events after the outcomes are known. Psychological Bulletin, 107(3): 311-327.

[2] Blank, Hartmut, Steffen Nestler, Gernot von Collani, & Volkhard Fischer (2008). How many hindsight biases are there? Cognition, 106(3): 1408-1440.

[3] Erdfelder, Edgar, & Axel Buchner (1998). Decomposing the hindsight bias: A multinomial processing tree model for separating recollection and reconstruction in hindsight. Journal of Experimental Psychology: Learning, Memory, and Cognition, 24(2): 387-414.

[4] Fischhoff, Baruch (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3): 288-299.

[5] Roese, Neal J., & Kathleen D. Vohs (2012). Hindsight bias. Perspectives on Psychological Science, 7(5): 411-426.


Individual level, Availability heuristic, Anchoring heuristic, Need for self-esteem, Need for security, Need for cognitive consonance

Related biases

  • Illusion of transparency 

  • Interpretive bias 

  • Synonym: “knew it all along” bias


Fabrice Valcourt, master's student in philosophy at UQAM.

Translated from French to English by Susan D. Renaud.

How to cite this entry

Valcourt, F. (2021). Hindsight bias, trans. S. D. Renaud. In C. Gratton, E. Gagnon-St-Pierre, & E. Muszynski (Eds). Shortcuts: A handy guide to cognitive biases Vol. 4. Online: www.shortcogs.com


© 2020 Shortcuts/Raccourcis. All rights reserved.