
Human thinking is based on acquired, processed, and stored knowledge. Personal beliefs, emotions, and experiences are also stored and incorporated into thought processes. Bias is defined as a preference toward or away from an idea or action. It is a personal reality shaped by one's perception of the available information, including both knowledge and emotional inputs. Cognitive bias is an error in thinking that influences how we make decisions. It can result in a closed-minded or prejudicial approach and may be learned or innate. Biases may lead to distorted perceptions or inaccurate judgment. Both cognitive and implicit biases are innate to human thinking and cannot be eliminated. Consequently, bias is an inevitable and sometimes unwelcome contributor to diagnostic reasoning (Diagnosis 2014;1:23-7).

It is well established that cognitive biases are a significant problem in medical decision-making. Flaws in judgment, rather than lack of knowledge, are central to diagnostic error (BMJ Qual Saf 2013;22:ii58-ii64; Otolaryngol Clin North Am 2019;52:35-46). Arnott describes biases as predictable deviations from rationality (Inf Syst J 2006;16:55-78), and the lack of ongoing improvement from quality and safety initiatives may reflect a failure to focus on the importance of diagnostic errors and the role of cognitive bias in those errors (N Engl J Med 2013;368:2445-8; Healthc Q 2012;15:50-6). The magnitude of the problem is alarmingly high: the diagnostic error rate is estimated at 10%-15% (Otolaryngol Clin North Am 2019;52:35-46). A study of closed claims found that 64% of the outcomes were due to diagnostic error, most of which were thought to result from a failure of judgment (Ann Intern Med 2006;145:488-96). Other studies of diagnostic delay and misdiagnosis also show a high rate of diagnostic error due to cognitive bias (Acad Med 2019;94:187-94).

Despite varied efforts to improve patient safety in medicine, cognitive bias receives relatively little discussion. It can be uncomfortable to admit that we make diagnostic errors, but all of us rely on pattern recognition and heuristics in day-to-day clinical decision-making, and biases are incorporated into that thinking. Heuristics are fast, intuitive, efficient, and low-stress mental shortcuts suited to high-paced, stressful environments such as the OR. The dual-process theory of decision-making describes two mental processes in learning and decision-making. Heuristics are type 1 processes, which are adaptive and allow rapid decision-making. Type 2 processes, often described as metacognition, are analytical, slower, deliberate, and conscious. Type 1 decision-making is used most often and is usually effective, but it is more prone to error than type 2 decision-making. Our daily actions proceed as serial associations, each one triggering the next, side-stepping analytical thinking and allowing errors to propagate in a domino effect. Type 2 processes are slow and require cognitive resources but are safe and dependable. Even so, type 2 thinking can fail when the analytically applied strategies are based on flawed rules (Science 1974;185:1124-31; BMJ Qual Saf 2013;22:ii58-ii64).

Stiegler et al published a pilot study of cognitive errors in anesthesiology. From the hundreds of described cognitive biases, the authors compiled a catalog of the 14 most prevalent in anesthesiology (Br J Anaesth 2012;108:229-35) (Table). They then scored how often each cognitive error was observed in simulated scenarios: premature closure and confirmation bias were observed most often, while framing effect and availability bias were observed least often.

Table: Catalog of Top Cognitive Biases (adapted from Stiegler et al) (Br J Anaesth 2012;108:229-35).

In a study of the surgical literature, Armstrong and colleagues found that the most common types of bias were overconfidence, anchoring, and confirmation bias (Br J Surg 2023;110:645-54).

Knowing that cognitive bias is common and often leads to patient safety concerns should prompt us to seek mitigation techniques. Croskerry advocates raising awareness of biases, using decision-making aids and algorithms, and fostering a culture of open-mindedness and self-reflection within clinical practice (Diagnosis 2014;1:23-7). Royce suggests creating opportunities for open communication, such as diagnostic time-outs for challenging cases, and Singh recommends avoiding the view that diagnostic errors are personal failings, proposing that the term “diagnostic error” be replaced with “missed opportunities in diagnosis” (Acad Med 2019;94:187-94; Jt Comm J Qual Patient Saf 2014;40:99-101).

Steps to mitigate errors due to cognitive bias could include (Cogn Res Princ Implic 2023;8:13):

  1. Raising awareness by defining and discussing the terms heuristics and bias.
  2. Avoiding the negativity associated with these terms, so that they are not taken to imply weakness, poor judgment, or lack of objectivity.
  3. Making cognitive processing more visible by practicing metacognition, reflective practice, and cognitive bias awareness, and by challenging your own biases to become a more critical thinker.
  4. Acknowledging that some unknowns are unknown.
  5. Considering the application of Bayesian reasoning and artificial intelligence (see the sketch following this list).
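
As an illustration of the Bayesian reasoning mentioned in item 5, the short sketch below combines a pre-test probability and a test result into a post-test probability using likelihood ratios. The prevalence, sensitivity, and specificity values are hypothetical and chosen only for demonstration; making this updating explicit is one way to counter the tendency of type 1 thinking to overweight or underweight a single finding.

```python
# Illustrative sketch of Bayesian updating for a diagnostic test.
# All numbers are assumptions for demonstration, not values from the article.

def post_test_probability(pre_test_prob: float, sensitivity: float,
                          specificity: float, test_positive: bool) -> float:
    """Update a pre-test probability with a test result using Bayes' theorem."""
    if test_positive:
        likelihood_ratio = sensitivity / (1 - specificity)   # positive likelihood ratio
    else:
        likelihood_ratio = (1 - sensitivity) / specificity   # negative likelihood ratio
    pre_test_odds = pre_test_prob / (1 - pre_test_prob)
    post_test_odds = pre_test_odds * likelihood_ratio
    return post_test_odds / (1 + post_test_odds)

# Example: a diagnosis suspected in 10% of similar patients, assessed with a
# test assumed to be 90% sensitive and 80% specific.
print(post_test_probability(0.10, 0.90, 0.80, test_positive=True))   # ~0.33
print(post_test_probability(0.10, 0.90, 0.80, test_positive=False))  # ~0.01
```

Even this simple calculation makes explicit how far a single result should, or should not, move a clinician's estimate, which is precisely the step that premature closure and confirmation bias tend to skip.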

We must acknowledge the contribution of bias to our medical decision-making. We want to believe that we have the right answers, yet our colleagues may offer alternative ones. Embrace the opportunity to use metacognitive techniques: think critically, remain open to alternative analyses, and be reflective in your practice of medical decision-making.