
Heuristics and cognitive biases

This section aims to explore in more depth the mechanisms underlying cognitive biases and other phenomena discussed on this website, as well as their links to heuristics (or cognitive shortcuts). We provide an overview of these concepts and of the general theoretical framework that underlies the categorization of biases that is proposed on this site.

Heuristics

What is a heuristic?


Heuristics are often thought of as “rules of thumb” which are used to simplify a complex cognitive task. The word heuristic, of Greek origin, means ‘which serves to discover’ [1], and shares the same root as the word eureka [2]. For Daniel Kahneman, recipient of the Nobel prize in economics, heuristics are cognitive shortcuts which we use when the requirements of a cognitive task are too high. For the German psychologist Gerd Gigerenzer, a heuristic is “a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods” [1]. Here are some examples of very complex tasks whose (often approximate) accomplishment sometimes requires using a shortcut [2]:


  • Estimating the probability that an event will happen (e.g. predicting the results of an election).

  • Validating a hypothesis (e.g. trying to determine whether going to bed early before an exam will systematically lead to a better grade).

  • Evaluating the frequency of an event (e.g. determining the number of COVID-19 cases in Quebec).


In other words, we can conceive of a heuristic as a strategy that allows for a simple answer to a complex question: if a satisfactory answer to a complex question does not come to mind quickly, we replace it with a similar, but simpler, question.

Because we do not know the future, and so many things could come into play, it would be difficult to answer the following question precisely:

(1)

“Will this politician fulfill her electoral
promise in the next 6 months?”

By using a heuristic, we could replace this question with the following:

(2)

“In my opinion, has this politician been
trustworthy so far?”

This we can answer more easily: the answer to the simple question (2) can serve to answer the initial, more complex question (1).
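
To make the mechanism concrete, here is a minimal sketch of this kind of question substitution (what Kahneman calls answering an easier question [2]). The code and the proxy mapping are ours, invented purely for illustration:

```python
# A toy model of question substitution: when a complex question has no
# quick answer, a simpler proxy question is answered in its place.
# The question pair below is the politician example from the text.

PROXY_QUESTIONS = {
    "Will this politician fulfill her electoral promise in the next 6 months?":
        "In my opinion, has this politician been trustworthy so far?",
}

def heuristic_answer(question: str) -> str:
    """Answer the simpler substitute question when one is available."""
    proxy = PROXY_QUESTIONS.get(question, question)
    return f"Answering instead: {proxy!r}"

print(heuristic_answer(
    "Will this politician fulfill her electoral promise in the next 6 months?"
))
```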

Examples of heuristics


While heuristics often allow us to find satisfactory answers very quickly, they can also lead to incorrect judgments and perceptions. When these errors are systematic, they are called cognitive biases.


Among the theoretical models which have been developed in order to understand and describe heuristics and cognitive biases, the one proposed by Daniel Kahneman and Amos Tversky is particularly interesting [3]. They offer a description of the most common heuristics as well as a non-exhaustive list of biases to which these heuristics can lead. Here are some examples of these shortcuts.

Representativeness heuristic

Context: We tend to use this heuristic when we need to identify the link between two elements.

Examples

  • Determining whether an individual belongs to a category (Is this person a farmer or an engineer?)

  • Determining the probability that an action caused an event (Did I fail the exam because I missed some classes?)


Use: When we need to identify the link between two elements, it is often impossible to obtain all the information necessary to form a sure conclusion. So instead, we tend to rely on our impression of similarity (or representativeness) between the two elements to draw conclusions.

For example, rather than exhaustively checking whether an individual belongs to the ‘farmer’ category by verifying that they conform to all the attributes which define a farmer (e.g. owning a farm or working on a farm, tending to livestock or fields), we will simply ask ourselves whether the person is representative of the category in question. In other words, we try to see if the person shares similarities with the group. Likewise, though we cannot be certain that the failure of an exam was caused by missing classes, we can still determine that it is the kind of action which is representative of students who fail their classes, and thus draw conclusions from this shortcut.
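
To illustrate, here is a toy sketch of judging category membership by similarity to a prototype rather than by checking defining attributes. The trait sets and the similarity measure are ours, invented for illustration:

```python
# A toy illustration of the representativeness heuristic: score a person
# by how much they resemble a category prototype, instead of verifying
# the category's defining attributes. All traits here are invented.

FARMER_PROTOTYPE = {"owns a farm", "works outdoors", "tends livestock"}

def representativeness(person_traits: set, prototype: set) -> float:
    """Share of prototype traits that the person visibly matches."""
    return len(person_traits & prototype) / len(prototype)

person = {"works outdoors", "tends livestock", "drives a pickup"}
score = representativeness(person, FARMER_PROTOTYPE)
print(f"Similarity to 'farmer': {score:.0%}")  # 67%: 'probably a farmer'
```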

Potential biases: This heuristic, though necessary for our efficient functioning, can lead to many reasoning errors. Here are examples.


Availability heuristic

Context: We tend to use this heuristic when we need to evaluate the frequency of an event, or the probability that it will happen.


Examples

  • Determining whether an event will happen to us (Winning the lottery).

  • Evaluating the frequency of an event (Determining the number of COVID-19 cases in Quebec).


Use: We cannot always obtain all the necessary information to accurately predict the probability that an event will happen. Unfortunately, we still must regularly make predictions, and we do our best by relying on the examples available to us in our memory.


We don’t need to know the precise statistics relating to our chances of winning the lottery (which hover around 1 in 28,633,528) to know that our odds are slim. We just have to think about the fact that we don’t know anyone in our social circles who has had that luck. Likewise, we know that it is unwise to leave our home without locking the door, because we have several burglary anecdotes in mind, despite the fact that we do not know the precise statistics related to this crime. In other words, when the availability of an event in memory is low, we tend to believe that its probability is negligible; conversely, when its availability in memory is high, we tend to think that its probability is high.
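
As an aside, the odds quoted above can be checked with a little combinatorics. They match a draw of 7 numbers out of 49 with three lines per play; we assume this is the format the figure refers to (it is the format of Lotto Max):

```python
from math import comb

# Number of possible ways to draw 7 winning numbers out of 49.
combinations = comb(49, 7)   # 85,900,584 possible draws
# Assuming each ticket carries three lines of play (as in Lotto Max),
# the per-ticket odds of a jackpot are three times better.
odds_per_ticket = combinations // 3

print(f"{combinations:,} possible draws")
print(f"1 in {odds_per_ticket:,} chance of a jackpot per ticket")  # 28,633,528
```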

Potential biases: This heuristic, though necessary for our efficient functioning, can lead to many reasoning errors. Here are examples.


Adjustment and anchoring heuristic

Context: We tend to use this heuristic when we need to make estimates without having all the information about the quantity to be estimated.


Examples

  • Estimating the population of a country

  • Estimating the age of an individual

  • Estimating when an object was invented


Use: Sometimes we do not have the specific knowledge needed to make an effective estimate, so we rely on other facts (be they relevant or not) to get closer.


If we do not know the population of a country, we can produce an estimate from the population of a country we know which has similar characteristics. For example, if asked to estimate the population of Norway (and we have no clue), we might fall back on our knowledge of its neighbour Sweden, which we know as the most populous country in Scandinavia. That country has more than 10M inhabitants; this number becomes the anchor point. The adjustment heuristic will make us assume that Norway is probably less populated, so we will adjust our answer based on the anchored knowledge. We could answer that Norway has 7M inhabitants (when in reality it only has 5M), which is not too far off the mark.
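
The same reasoning can be written out as a toy model. The 30% downward adjustment below is a hypothetical value, chosen only to reproduce the example; nothing in the heuristic fixes it:

```python
# A toy model of anchoring and adjustment, following the Norway example.

def anchored_estimate(anchor: float, adjustment: float) -> float:
    """Estimate an unknown quantity by adjusting away from a known anchor."""
    return anchor * (1 - adjustment)

anchor = 10_000_000   # Sweden's population, the anchor point
estimate = anchored_estimate(anchor, adjustment=0.30)  # hypothetical 30% cut
actual = 5_000_000    # Norway's population, roughly, as stated in the text

print(f"Estimate: {estimate:,.0f} vs. actual: {actual:,.0f}")
# A classic finding is that such adjustments tend to be insufficient:
# estimates remain biased toward the anchor.
```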

Potential biases: This heuristic, though necessary for our efficient functioning, can lead to many reasoning errors. Here are examples.


Where do heuristics come from?


According to evolutionary theory, our brains’ capacities evolved in response to the challenges faced by humans in prehistory, such as feeding ourselves, recognizing dangers quickly, reproducing, or making allies. The environmental pressure exerted by these necessities, combined with the possibilities offered by human genes, would have induced the development of certain attributes (see list below). These attributes would in turn have favoured the emergence of the heuristics discussed previously [2]. Here are a few examples of evolved attributes which characterize our cognitive functions.

We are constantly generating impressions and formulating beliefs and intentions.

Example: You meet an old friend from high school in the street. You are at first surprised, then excited to see her, and you judge her appearance at a glance, trying to see what kind of person she has become.

The mental operations that populate our brain can happen automatically and quickly.

Example: You see a mathematical formula (e.g. 5 × 4) and automatically think of the result, without having consciously decided to do the operation.

We identify patterns among facts and events.

Example: While looking at clouds, you can’t help but see shapes like animals, faces, etc.

We tend to neglect doubt and ambiguity in favour of certainty.

Example: Your boss abruptly calls you to a meeting, giving you no specific details. In the meantime, you are uncomfortable not knowing the reason for this meeting: you need to be reassured, to be told by your colleagues that your work is without a doubt appreciated.

We tend to want to confirm and maintain our beliefs.

Example: You didn’t like the movie you just saw at the theatre, so you search online for negative reviews of the film in order to refine your explanations of your dislike rather than looking for more nuanced critiques that would lead you to see things differently.

We are constantly monitoring our environment and updating our information about it.

Example: You are in the middle of a hockey game and break away with the puck. In a fraction of a second, you identify the position of the defence and those of your teammates, then manage to make a pass that leads to a goal despite the traffic in front of the net and the speed of the game.

Heuristics, in this context, represent means which were developed using our cognitive capacities, in order to accomplish the complex tasks that we have had to face throughout our evolution. Gerd Gigerenzer put forward the idea of an adaptive toolbox that would include heuristics [1]. This toolbox is said to be adaptive insofar as it allows us to adapt to demanding situations by providing a solution that is at least approximate, and at best adequate. This toolbox which contains heuristics also contains other capabilities and mechanisms such as memory, recognition (which allows us to recognize a face time after time, for instance), and more. As for heuristics, they are tricks that allow us to accomplish complex tasks in a simple manner. Just like humans, other species have heuristics of their own, which vary according to their mental capacities.

Are all our judgments the result of a heuristic?


It is said that our brains contain an adaptive toolbox which includes cognitive processes like memory, but also heuristics, to help us think about the world around us. But to what extent are heuristics used? One may wonder whether all our judgments are the result of these cognitive shortcuts. There is, at the moment, a consensus that not all our judgments are the product of heuristics. It is quite possible to imagine that in a context where neither time nor knowledge is lacking, a deliberate, informed and open-minded judgment can be the result of rational thinking free from shortcuts or biases. For example, making an informed decision by weighing the pros and cons of moving out of one’s house would not, a priori, elicit any particular heuristic. All the same, it remains extremely interesting and relevant, even necessary, to talk about heuristics, which in many cases are among the phenomena that can lead to biased judgments.

An interesting case of a judgment that is not the result of a heuristic, despite the demanding nature of the task, is expertise. Indeed, being an expert in a field can lead to producing intuitive and correct judgments without resorting to an approximate rule (or heuristic). A curious example, well studied by psychologists, involves people who succeed in determining the sex of chicks at lightning speed, producing extremely precise and correct judgments using only a tiny handful of clues. Chess players who are unable to explain how they come up with the best move in a series of exchanges, or wine tasting experts who can guess which type of grape was used with remarkable accuracy, are other examples of effective and correct judgments in the absence of both deliberative thought and shortcuts.

Efficient shortcuts: the good side of heuristics


While Kahneman is mainly interested in the biases which can arise from heuristics, psychologist Gerd Gigerenzer is interested in the more positive aspects of the use of these shortcuts, and focuses on the various adequate or appropriate judgments that are the result of heuristics. He demonstrates that heuristics are an unavoidable, quick and frugal way of giving an answer to a question when the information is lacking or when the requirements of the task are beyond our capabilities [5]. For Gigerenzer, the question is not whether we can trust heuristics, but in which contexts we can trust them. Here is an example proposed by Gigerenzer, based on the recognition heuristic.

You are asked the following question: Which city has the larger population, Detroit or Milwaukee?


Few people know the answer. In such a case, we can turn to a rule based on the recognition heuristic: if you recognize the name of one of the two cities but not that of the other, then infer that the one you recognize has the larger population.
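
This rule is simple enough to write down as a small decision procedure. A minimal sketch (the function name and types are ours):

```python
from typing import Optional

# The recognition heuristic as a decision procedure: if exactly one of
# two options is recognized, infer that it scores higher on the
# criterion (here, population). Otherwise the heuristic stays silent.

def recognition_heuristic(a: str, b: str, recognized: set) -> Optional[str]:
    if (a in recognized) != (b in recognized):
        return a if a in recognized else b
    return None  # both or neither recognized: another strategy is needed

# A German student in the study plausibly recognizes only Detroit:
print(recognition_heuristic("Detroit", "Milwaukee", {"Detroit"}))  # Detroit
```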


American students, being familiar with both city names, could not use the recognition heuristic, and about 40% of them responded: Milwaukee [5]. German students, who for their part recognized Detroit, but not the other city, got the correct answer 100% of the time. Indeed, Detroit had a little less than 700,000 inhabitants in 2018, while Milwaukee had a little less than 600,000 for the same year.

This example demonstrates that a minimal level of knowledge is sometimes more useful than a higher level, and that a heuristic can lead to better performance than deliberating over multiple related pieces of information.


Gigerenzer provides many more examples of heuristics which show that taking such shortcuts can very often lead to a correct answer or a better performance on a specific task. He is, however, less interested in the biases that can result from the use of heuristics.

Cognitive biases

Cognitive biases refer to identifiable and indexable errors in our judgments, errors which are predictable and systematic. They occur when people have to interpret and deal with information from the world around them. No one is completely immune to them, and certain contexts and factors make them particularly prevalent.


In order to study the notion of a cognitive bias, it can be helpful to start by looking at definitions of what a ‘bias’ itself is.

Some general definitions surrounding the notion of bias


There are a multitude of definitions of ‘bias’ in dictionaries and specialized encyclopedias of psychology. Here are a few that can serve to introduce the notion.

(1)

“The action of supporting or opposing a particular person or thing in an unfair way, because of allowing personal opinions to influence your judgment” (Cambridge Dictionary)

(2)

The word ‘bias’ in French can also have the following meaning: “Indirect and skillful way of solving a problem: Looking for a bias to avoid work.” (Larousse Dictionary, our translation)

And here are two definitions of the notion of cognitive bias:

(3)

“Systematic errors… [that] recur predictably in particular circumstances” [2, our translation]

(4)

“cognitive illusion… that… leads to a perception, judgment, or memory that reliably deviates from ‘reality’” [6, our translation]

Using these definitions as a foundation, we will emphasize a few crucial elements which capture the essence of cognitive biases.

Unconscious or implicit judgment


The first two definitions, taken from common dictionaries, do not concern cognitive biases specifically, defining instead biases in general. In the first definition, it is interesting to note that the authors emphasized the intentional dimension of biases. Within cognitive science, such a dimension is not the norm. On the contrary, we prefer to speak of unconscious and involuntary biases [6]. However, the idea that a preferred answer will appear instead of a correct answer is a feature of cognitive biases. When referring to cognitive biases, it is typical to talk about a judgment that has been tainted, or a perception which is the result of an illusion. The fact that biases are unintentional also highlights another aspect of their manifestations: they are very difficult to avoid and to correct.

Bias versus heuristics


The terms bias and heuristic are very closely related, but careful reflection on biases makes clear that it is best to distinguish them. Definition (2) sees a bias as a means or a strategy. In the vocabulary of cognitive psychology, such a means or strategy is more aptly called a heuristic. In this respect, a bias is seen as the unfortunate outcome of a heuristic. We say of a response that it is biased when it includes a dimension which deviates from reality or from a norm, or which is an outright error. Heuristics and biases are therefore not synonyms.

Systematic and predictable error


Definitions (3) and (4) highlight the “systematic” dimension of cognitive biases. This dimension implies being able to predict specifically how a response, judgment or perception will deviate from the norm or from reality. Great precautions and careful controls are put in place in experimental psychology to confirm the existence of a bias. To label something a cognitive bias, it is therefore necessary to be able to explain and predict how it emerges, what causes it, and how it manifests itself in a systematic way, which is to say that the occurrence of the bias does not depend on chance and is not produced arbitrarily. We must be able to predict in what context it will appear, and how.

The take-away

Cognitive biases have an unintentional dimension, automatic or implicit, as well as a systematic one. To speak of a cognitive bias, we must be able to measure it and to predict the conditions which bring it about.

Categories of cognitive biases


Although several models have been proposed to categorize cognitive biases, none of them really accounts for all the systematic errors identified across the various areas of cognition. There are always biases which belong to several of the proposed categories, or to none. For this reason, we have chosen to categorize each bias discussed in the present guide according to several models, in order to obtain a more precise and representative categorization, in line with the various research programs that focus on cognitive biases.



Biases can be categorized according to the heuristics underlying them; this is Daniel Kahneman’s approach:

  1. Availability heuristic

  2. Representativeness heuristic

  3. Anchoring and adjustment heuristic

  4. Emotional heuristic

Another way to categorize biases is to identify the social sphere in which they manifest themselves or have an impact. The categorization according to social level is not exclusive: a given bias can manifest itself at multiple levels.

  1. Individual level

  2. Interpersonal level

  3. Intergroup level

We can also classify biases according to the psychological need that the use of the bias can fulfill. A psychological need refers to an inherent tendency in individuals which is the basis of their internal motivation and necessary for their well-being [7]. Researchers generally agree on the identification of three fundamental needs (social belonging, competence, and autonomy), but here we extend this theory in order to include other needs which are sometimes studied independently to explain phenomena such as heuristics.

  1. Need for cognitive closure [8]

  2. Need for self-esteem

  3. Need for social belonging

  4. Need for security

  5. Need for cognitive consonance

It should be emphasized that these categorizations should be viewed as tools to facilitate discussions about biases, and that they are neither exhaustive, nor exclusive, nor independent of one another. In other words, a bias can respond to an as-yet-undiscovered psychological need, just as it may be the result of more than one need or heuristic.
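
For readers who find it helpful to see the structure spelled out, here is a minimal sketch of how a single bias entry could be tagged under all three schemes at once. The class, its field names, and the example tags are ours, invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List

# One bias entry tagged under all three categorization schemes at once.
# Every field is a list because the schemes are not exclusive: a bias
# can involve several heuristics, social levels, and needs.

@dataclass
class BiasEntry:
    name: str
    heuristics: List[str] = field(default_factory=list)     # Kahneman's scheme
    social_levels: List[str] = field(default_factory=list)  # where it manifests
    needs: List[str] = field(default_factory=list)          # needs it may fulfill

# Hypothetical tags, for illustration only:
example = BiasEntry(
    name="Confirmation bias",
    heuristics=["availability"],
    social_levels=["individual", "intergroup"],
    needs=["cognitive consonance", "cognitive closure"],
)
print(example)
```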

References


[1] Gigerenzer, G., & Gaissmaier, W. (2011). Heuristic decision making. Annual Review of Psychology, 62, 451-482.

[2] Kahneman, D. (2011). Thinking, Fast and Slow. Anchor.

[3] Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.

[4] Moon, A. (2012). Knowing without evidence. Mind, 121(482), 309-331.

[5] Gigerenzer, G. (2007). Gut Feelings: The Intelligence of the Unconscious. Penguin.

[6] Pohl, R. F. (2017). Cognitive Illusions: Intriguing Phenomena in Thinking, Judgment and Memory. Routledge.

[7] Ryan, R. M. (1995). Psychological needs and the facilitation of integrative processes. Journal of Personality, 63(3), 397-427.

[8] Webster, D. M., & Kruglanski, A. W. (1994). Individual differences in need for cognitive closure. Journal of Personality and Social Psychology, 67(6), 1049-1062.

The authors


Cloé Gratton and Émilie Gagnon-St-Pierre are PhD candidates in cognitive psychology at the Université du Québec à Montréal. They are also co-founders of SHORTCUTS.


Translated from French to English by Eric Muszynski.


Cite this entry

Gratton, C., & Gagnon-St-Pierre, E. (2020). Heuristics and cognitive biases, trans. E. Muszynski. Shortcuts: A handy guide to cognitive biases, Vol. 2. Online: www.shortcogs.com

The authors would like to warmly thank Gaëtan Béghin and Eric Muszynski for their thoughtful reflections and extremely relevant comments on previous drafts of this entry.
