
Cognitive biases and workplace dynamics

This section deals with heuristics and cognitive biases in the context of workplace organizations such as businesses, companies, and factories. More specifically, it addresses issues that affect their performance and survival: the work climate may deteriorate, employees may become demobilized and disengaged, or managers may have difficulty demonstrating leadership.


If we want to effectively address these issues, it is essential to know their causes, as well as to fully understand the organization and its members. In this context, organizational consultants, who can come from different fields such as management, human resources (HR), or work and organizational psychology, can establish a diagnosis of the problems identified. As psychologist Clayton P. Alderfer maintains, an appropriate diagnosis makes it possible to select the appropriate actions to deal with a problem and thus to restore performance or quality of life within the organization [1].


However, as in the medical field, making a diagnosis involves interpretation: not only the interpretation of the consultant who must make the diagnosis, but also that of the members of the organization who provide the information on which the diagnosis is based. This is where mental shortcuts, also called heuristics, come into play!


Heuristics can be beneficial because they facilitate and accelerate reasoning, thereby enabling the consultant and members of the organization to make quick decisions and contribute to reaching an effective diagnosis. However, heuristics can also lead to systematic errors, or cognitive biases, which then lead these individuals to draw false conclusions from their analyses. Consultants must therefore reduce the influence of their cognitive biases and those of members of the organization, as the validity of the diagnosis may otherwise be impaired [2]. Indeed, a poor diagnosis could lead to ineffective decisions, changes and actions that would not correct the problem initially targeted or might even aggravate it.


Such decisions or actions, whether or not they are based on a biased diagnosis, have a definite impact not only on the work climate and life in the organization, but also on the life of each person affected.


Thus, this section deals with the main heuristics present during an organizational analysis, namely the heuristics of availability, anchoring, representativeness and emotionality. For each of them, a selection of cognitive biases that frequently influence organizational diagnosis will be presented. Finally, a series of strategies to limit the influence of cognitive biases will be proposed.

The availability heuristic

In general, it is easier to remember events that happen frequently, that are familiar, or that have happened recently than events that are more distant. The ease with which we recall these memories or this information influences our judgment about it: this is the availability heuristic. It can be useful and advantageous when trying to identify and understand a phenomenon occurring in an organization. For example, a consultant with expertise in workplace harassment who has just completed research for a book on the subject will have more knowledge readily available, and will be more effective in diagnosing a harassment problem in a team. However, the availability heuristic can lead to many biases. Here are two that commonly arise in the context of organizational diagnosis.


First, the choice-supportive bias arises when an individual overestimates the benefits of an option after choosing it. A consultant who exhibits this bias might, for example, insist on using the results of a survey that received a very low response rate. Surveys can gather a lot of information in a short time, and statistical analysis of the data collected can lead to objective conclusions, provided enough people have participated. With few participants, however, the data cannot accurately represent the situation of the entire organization, and the resulting conclusions will necessarily be biased or even erroneous. Unfortunately, many consultants mistakenly use such data, thinking "it can serve as a starting point." But if this initial idea points in the wrong direction, can it be of value? Going beyond the choice-supportive bias would involve continuing to collect data using another method (for example, gathering evidence from focus groups) or readministering the survey with better preparation.


The repetition effect manifests itself when an individual's belief in information increases as it is repeated. For example, the first employees volunteering for interviews leading to a diagnosis of their workplace could repeatedly provide similar information about the quality of supervision in the organization. If the employees who make these repetitive comments are alike and hold the same job, they will not be representative of the other groups of employees within the organization. The repeated information will not reflect the perception of all members of the organization, but the repetition effect may make it seem more believable simply because it is repeated. By treating the first perceptions gathered as facts, the consultant could then produce a biased diagnosis.

The anchoring heuristic

The anchoring heuristic refers to the process by which an individual estimates an uncertain value or arrives at a conclusion from an "anchor" point. This initial anchor can be an arbitrary value, a preliminary calculation, or information known in advance. For example, a consultant wanting to estimate the number of employees likely to leave their jobs in a shop after a difficult year (the estimate of an uncertain quantity) could assume that this figure will be very close to that of a similar shop they had previously analyzed. In this case, the number associated with the other shop serves as the anchor. For this heuristic, we explore two biases likely to influence an organizational diagnosis.


Confirmation bias is the tendency to interpret available information so that it aligns with a predetermined hypothesis, while ignoring other possibilities. For example, an employer might believe that employees' poor performance is related to their lack of training. This employer might then simply dismiss a consultant's diagnosis, based on a survey and rigorous observations, showing that the problem is instead one of communication between employees and managers. This is confirmation bias: the employer's initial hypothesis takes precedence over the alternatives because they do not correspond to prior beliefs. The employees' poor performance gives the illusion of confirming the employer's assumption, while information that may contradict it is ignored.


Automation bias is also very common in organizational settings. It consists of placing disproportionate confidence in the information produced by a machine, tool, or piece of software, accepting the proposed results uncritically. Statistical analyses are frequently used in organizational diagnosis; for example, they make it possible to compare work teams and determine which ones perform better than others. Before performing these analyses, however, a whole series of prerequisite checks is needed to ensure that the data are suitable for the intended analysis. The main problem with statistical software is that, regardless of the quality of the data, it performs the calculations and presents conclusions, whether or not the data meet the prerequisites. Junior consultants, or those with limited statistical knowledge, might not know that such prerequisite checks are needed. By trusting the conclusions because they come from recognized software, they could wrongly conclude, due to automation bias, that one of the teams stands out from the others when in fact it does not.
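To make this concrete, here is a minimal sketch in Python using scipy's stats module. The data, team names, and significance threshold are hypothetical and purely illustrative; the point is that the comparison (here a one-way ANOVA) runs no matter what, while the prerequisite checks have to be requested explicitly:

```python
# Sketch: prerequisite checks before comparing teams with a one-way ANOVA.
# Data, team names, and the 0.05 threshold are hypothetical/illustrative.
from scipy import stats

teams = {
    "Team A": [72, 68, 75, 70, 69, 74],
    "Team B": [80, 82, 79, 85, 81, 83],
    "Team C": [60, 95, 58, 99, 61, 97],  # wide spread: assumptions at risk
}

# Check 1: approximate normality within each team (Shapiro-Wilk).
for name, scores in teams.items():
    _, p = stats.shapiro(scores)
    print(f"{name}: Shapiro-Wilk p = {p:.3f}")

# Check 2: homogeneity of variances across teams (Levene's test).
_, p_levene = stats.levene(*teams.values())
print(f"Levene p = {p_levene:.3f}")

# The ANOVA itself runs regardless of whether the checks above pass --
# the software will not warn the consultant if its output is untrustworthy.
f_stat, p_anova = stats.f_oneway(*teams.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
```

If the normality or variance checks fail, the ANOVA's p-value is not trustworthy, no matter how confidently the software prints it.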

The representativeness heuristic

The representativeness heuristic refers to assessing the probability that an object or event belongs to a certain category, based on its similarity or commonality with that category. For example, after quickly skimming the first page of a resume, a consultant might be correct in believing that an individual is a manager, since they studied in a management school and seem to have a lot of experience in this domain. However, many biases are linked to the representativeness heuristic. Here are two, in the context of an organizational diagnosis:


Amos Tversky and Daniel Kahneman explain that sample-size insensitivity is a bias that involves overestimating the representativeness of a small sample relative to its reference population [3]. Take, for example, a consultant who concludes that the working climate in a hospital has deteriorated, based on a survey of 100 employees, regardless of the size of the organization and the total number of employees. If the hospital employs 8,500 people (the reference population), a response sample of 100 would be too small, even though the number 100 seems relatively large. For a sample to be representative, there are several statistical rules that take chance into account, and the sample size should always be adjusted according to the reference population. Failure to take this into account and intuitively trusting what we consider to be a “large” number therefore exposes the consultant to this bias.
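As a rough illustration of the arithmetic involved, here is a sketch using Cochran's formula with a finite population correction. The 95% confidence level and 5% margin of error are assumptions chosen for the example, not values from the scenario above:

```python
# Sketch: required sample size via Cochran's formula with a finite
# population correction. The confidence level (z = 1.96, i.e., 95%) and
# margin of error (5%) are illustrative assumptions.
import math

def required_sample_size(population: int, margin: float = 0.05,
                         z: float = 1.96, p: float = 0.5) -> int:
    """n0 = z^2 * p * (1 - p) / margin^2, then corrected for a finite
    population: n = n0 / (1 + (n0 - 1) / N)."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# For the hospital in the example (8,500 employees), roughly 368
# responses would be needed -- far more than the 100 collected.
print(required_sample_size(8_500))
```

Under these assumptions, 100 respondents out of 8,500 would yield a margin of error closer to 10% than 5%, which is why a "large-seeming" number can still mislead.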


Another error that can result from the representativeness heuristic is the ingroup bias, which leads to a more positive perception of the members of one's own group than of those outside it. In an organizational context, a work and organizational psychologist might overestimate the credibility of data collected by colleagues belonging to the same professional group and underestimate data collected by human resources counselors (a separate professional group). The psychologist might therefore ignore information that would have been relevant to the diagnosis.

The emotional heuristic

Finally, the emotional heuristic arises when our decisions, judgments, and behaviours are unconsciously affected by our emotions. For example, a consultant might feel particularly happy on the day of the first meeting with management, make a good impression with their enthusiasm, and thus obtain the mandate with the organization.


However, our emotions can skew our behaviours and our thoughts. Two biases are notably linked to this heuristic. First, impact bias can arise when an individual overestimates the intensity and duration of the emotions they will experience in the face of an event. A consultant could, for instance, anticipate a mandate negatively and spend many hours preparing for an interview with a manager known to be intimidating. After a gentle interview, the consultant realizes that the hours of agonizing preparation were not justified. Those hours could have been used to develop solutions or corrective measures for employee problems, which would have been more useful for both management and employees.


Finally, the optimism bias can be associated with "magical thinking": individuals minimize the impact of the risks they take by convincing themselves that they will not experience the consequences associated with those risks. For example, the optimism bias might manifest itself in an overly enthusiastic junior consultant independently accepting a mandate to diagnose a complex and protracted conflict in a factory. Identifying the determinants of a serious conflict must be undertaken carefully and requires advanced skills, but the consultant feels this is a learning opportunity and that nothing bad could happen. After an awkwardly directed group interview that heightened tensions, a series of physical assaults and acts of vandalism took place. The overly optimistic consultant therefore made the situation worse for the organization.

Strategies to limit the influence of cognitive biases

Through professional practice and research in fields relevant to diagnostics (e.g., psychology, medicine, engineering), several strategies to minimize or counter biases have been identified. Here are a few:

Knowledge of cognitive biases and reflection on thinking

First, being aware of the existence of biases and of their impact on the diagnostic process allows more rational thinking to be incorporated into intuitive reasoning; it is the first step in minimizing the effects of these biases [4]. Reflection on thinking, for its part, involves deliberately slowing down thought processes, which makes it possible to become aware of potential biases or errors that could affect judgment and decisions. To achieve this, diagnosticians propose two strategies that can be applied to organizational diagnosis [4]: the search for alternative explanations, and the exploration of the potential consequences of having chosen a wrong diagnosis.


First, finding alternatives is an ideal strategy to decrease the occurrence of confirmation bias. It aims to maximize the number of alternatives considered before making a final diagnosis, which avoids becoming attached to a preliminary diagnosis that is too rigid or too simple. To achieve this, the consultant can ask questions such as: "Is there another possible cause or reason?" "Are there other elements to consider?" Consultants should thus not limit themselves to confirming the validity of one hypothesis, but should also seek to exclude the possibility that the problem results from another cause.


Second, exploring the potential consequences of choosing the wrong diagnosis pushes us to consider "worst-case scenarios". It involves imagining a future in which a wrong diagnosis was made and asking questions such as "What went wrong?" and "What could have been fixed?" This method helps to limit blind spots and leads the consultant away from first impressions [4].

Knowledge of organizations and experience

Before making a diagnosis, a competent consultant should be familiar with four areas of knowledge: (1) the types of organizations (e.g., public, private, community); (2) the types of problems commonly found in them (e.g., inefficiency, conflicts); (3) the characteristics of organizations and teams that are doing well (to establish targets to reach); and (4) the typical solutions applied when a problem arises [5].


The more knowledge and experience consultants acquire in these four areas, the better they will be able to consider different alternatives, as mentioned above. Experience also allows them to better understand their own strengths and weaknesses and thus to recognize the signs that they may be influenced by their own biases. All of this, of course, is conditional on consultants being introspective and willing to seek help and feedback from colleagues, especially for complex diagnoses [6]. This nuance matters because experience can be a double-edged sword: it can, among other things, feed anchors and contribute to confirmation bias. For example, just because a consultant knows the education system well does not mean that all schools should be "understood" in the same way!

The diagnostic model

To establish an organizational diagnosis that is as unbiased as possible, consultants can use tools that limit the impact of their own misperceptions and those of members of the organization. Researchers suggest using the funnel model, which lays out the stages of the diagnostic process as well as the precautions to take to limit the introduction of bias [7]. These researchers propose, as a first step, asking open-ended questions of the actors concerned (e.g., employees, managers), in order to maximize the variety of the information obtained [8]. For example, the consultant could ask the employees of an organization what they think should be improved in their workplace, without offering them any possible answers beforehand. The problematic elements thus emerge from the organization, not from the consultant's perceptions. This method of investigation allows consultants to gradually target the most important issues while adopting an open posture that reduces the influence of their own biases, as well as the influence of those biases on respondents. By leaving respondents free to state the issues they perceive, their answers are not steered by targeted questions from the consultant; they can address new or unexpected aspects of their situation and focus on what matters to them.


According to the same researchers [7], combining scientific questionnaires, observations, and testimonies drawn from interviews makes it possible to analyze the situation from different angles and thus to consider a range of variables that might otherwise be overlooked. For example, after administering a questionnaire on employee commitment to work, a consultant might believe that commitment is a promising avenue for alleviating a staff turnover problem. By supplementing the analysis with data collected through open-ended questions, however, the consultant could realize that the most important variable explaining the turnover was in fact poor management practices, not employee engagement.

Standards and regulations

Finally, standards and regulations can also reduce the incidence of cognitive biases. These refer to strategies that encourage consultants to devote more attention, time, or effort to a task [8]. For example, psychologists and certified human resources advisors are regulated according to standards of practice. Establishing ethical rules or standards for consultants' actions exerts pressure on them to pay more attention and be more diligent in their work, and thus reduces the influence of biases in their analyses [8]. Occasional random inspections, or inspections in response to a complaint filed with a professional order, hold the consultant to a code of conduct that is as irreproachable as possible.

References

[1] Alderfer, Clayton P. (2011). The Practice of Organizational Diagnosis: Theory and Methods. Oxford University Press.


[2] Armenakis, Achilles A., Kevin W. Mossholder, & Stanley G. Harris (1990). Diagnostic bias in organizational consultation. Omega 18(6): 563‑72.


[3] Tversky, Amos, & Daniel Kahneman (1974). Judgment under uncertainty: Heuristics and biases. Science 185(4157): 1124‑31.


[4] Trowbridge, Robert L. (2008). Twelve tips for teaching avoidance of diagnostic errors. Medical Teacher 30(5): 496‑500.


[5] McFillen, James M., Deborah A. O’Neil, William K. Balzer, & Glenn H. Varney (2013). Organizational diagnosis: An evidence-based approach. Journal of Change Management 13(2): 223‑46.


[6] Graber, Mark L., Stephanie Kissam, Velma L. Payne, Ashley N. D. Meyer, Asta Sorensen, Nancy Lenfestey, Elizabeth Tant, Kerm Henriksen, Kenneth LaBresh, & Hardeep Singh (2012). Cognitive interventions to reduce diagnostic error: A narrative review. BMJ Quality & Safety 21(7): 535‑57.


[7] Gregory, Brian T., Achilles A. Armenakis, Nathan K. Moates, M. David Albritton, & Stanley G. Harris (2007). Achieving scientific rigor in organizational diagnosis: An application of the diagnostic funnel. Consulting Psychology Journal: Practice and Research 59(2): 79‑90.


[8] Larrick, Richard P. (2004). Debiasing. In Blackwell Handbook of Judgment and Decision Making, edited by Derek J. Koehler & Nigel Harvey, 316‑37. Blackwell Publishing.

Authors

Amy-Lee Normandin*, PhD candidate in work and organizational psychology

Catherine Desautels*, PhD candidate in work and organizational psychology

Gabriella Decoste*, PhD candidate in work and organizational psychology

Fatima Imsirovic*, D.Psy (cand.)

Maxime Paquet*, Ph.D.


*Department of Psychology, section of work and organizational psychology, Université de Montréal.


Translated by Susan D. Renaud.

Acknowledgements

The authors would like to thank Vincent Roberge, François Benoit, Philippe Desmarais and Samuel Gilbert for their contribution to this article.

How to cite this entry

Normandin, A.-L., Desautels, C., Decoste, G., Imsirovic, F., & Paquet, M. (2021). Cognitive biases and workplace dynamics, trans. S. D. Renaud. In C. Gratton, E. Gagnon-St-Pierre, & E. Muszynski (Eds). Shortcuts: A handy guide to cognitive biases Vol. 4. Online: www.shortcogs.com
