Behavioral economics studies the effects of environment, heuristics, social context, and psychological influences on decision-making. Daniel Kahneman won the Nobel Memorial Prize in Economic Sciences for his work applying psychological insights to economic theory. His book “Thinking, Fast and Slow” is an essential primer on the subject, as is Dan Ariely’s “Predictably Irrational” (Thinking, Fast and Slow, 2011; Predictably Irrational: The Hidden Forces That Shape Our Decisions, 2008). How we make decisions, and the environmental factors that influence our choices, may play a critical role in health care and anesthesiology. The challenge is that heuristics aid rapid decision-making, yet they can also lead us down the wrong path (BMJ 2005;330:781-3). Kahneman described two systems of thinking: System 1, a fast, intuitive, and emotional system, and System 2, a slow, more deliberative process. If a bat and a ball together cost $1.10 and the bat costs $1 more than the ball, how much does the ball cost? We will return to this question (Thinking, Fast and Slow, 2011).
Default settings have an outsized influence on our decisions. For example, organ donation rates in some European countries are far higher than in others because of differences in the default question asked when applying for a driver’s license. In some countries, the form says, “Check this box if you want to participate in the organ donation program.” In others, it says, “Check this box if you don’t want to participate in the organ donation program.” In both cases, most drivers accept the default and don’t check the box, with very different consequences. In one anesthesiology department, ventilators defaulted to a PEEP of 0 cm H2O when turned on. Because clinicians often never changed that default, many patients were inappropriately ventilated. Changing the default to 4 cm H2O improved ventilation for most patients. Looking for opportunities to set defaults appropriately is low-hanging fruit for improving patient care.
The representativeness heuristic is common in medicine: we judge a current case by its resemblance to a representative case we have experienced. This can lead us to overestimate the similarities between the two cases and to disregard other relevant information. The fact that something is more representative does not make it more likely, and we often ignore the base rate of a condition. For example, consider a child who is hypotensive and febrile immediately after craniofacial surgery. Although this could represent sepsis, postoperative hemorrhage is more likely, given the higher base rate of bleeding after this type of surgery. April is an introvert, loves to read, and is interested in history. She is kind, helpful, and organized, and she is president of a local book club. Is she more likely to be a doctor or a librarian? If you answered librarian, you ignored the base rate and fell victim to the representativeness heuristic: there are five to 20 times as many physicians as librarians in the United States (asamonitor.pub/2SYdmm2).
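The pull of the base rate can be made concrete with a rough Bayes calculation. The numbers below are illustrative assumptions, not data from the article: a 10:1 physician-to-librarian ratio (within the cited five- to 20-fold range) and hypothetical probabilities that April’s description fits each profession.

```latex
% Illustrative priors (assumed 10:1 ratio of doctors D to librarians L):
P(L) = \tfrac{1}{11}, \qquad P(D) = \tfrac{10}{11}
% Suppose the description fits 30\% of librarians but only 5\% of doctors:
P(\text{desc} \mid L) = 0.30, \qquad P(\text{desc} \mid D) = 0.05
% Bayes' rule:
P(L \mid \text{desc})
  = \frac{0.30 \cdot \tfrac{1}{11}}
         {0.30 \cdot \tfrac{1}{11} + 0.05 \cdot \tfrac{10}{11}}
  \approx 0.37
```

Even with a description six times more typical of librarians, the base rate still makes April more likely to be a doctor.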
Many are familiar with the placebo effect, but few know its “evil cousin,” the nocebo effect. The placebo effect occurs when a positive belief about an inert treatment produces a positive outcome. In contrast, the nocebo effect occurs when a negative belief about a treatment produces a negative outcome. For example, telling patients they may have nausea and vomiting makes it more likely that they will. One study randomized laboring women to one of two scripts before epidural placement: “You are going to feel a big sting and burn in your back now, like a bee sting; this is the worst part of the procedure,” versus “We are going to inject the local anesthetic that will numb the area where we are going to do the epidural/spinal anesthesia, and you will be comfortable during the procedure.” Those who heard the negative language reported more pain, indicating that we should choose our words carefully when describing procedures so as not to worsen the patient’s experience.
Framing can also affect our decisions. In one experiment, subjects were offered ice cream labeled either “10% fat” or “90% fat-free.” The overwhelming majority chose the “90% fat-free” option, even though the fat content of the two is identical. How we frame choices or situations matters.
When engaging in challenging tasks, people tend to lose track of elapsed time (task fixation). This effect has contributed to several aviation disasters and to airway management failures in the OR. It is therefore crucial for other team members to help a fixated clinician recognize when perseverating is causing harm. A related bias is anchoring, which occurs when one relies too heavily on the first piece of information one receives. Salespeople exploit it by anchoring buyers to an unrealistically high price, making a lower, but still above-market, price seem attractive. Anchoring can also distort clinical judgment. For example, suppose mask ventilation of a patient is initially easy and then becomes impossible. Anchoring may cause clinicians to perseverate on restoring the initial condition of easy ventilation when the correct response is to move on to a more definitive solution, such as a surgical airway.
In summary, behavioral economics interventions attempt to make the right thing the easy thing to do. Medicine, by contrast, overemphasizes individual effort and asks clinicians simply to try harder, and this has patient safety implications. In aviation, errors and near-misses are reported confidentially to the Aviation Safety Reporting System (ASRS). Reports can be anonymous, and pilots are granted immunity provided the incident was reported to the ASRS. Anesthesiology has no equivalent system, and consequently we lack valuable near-miss data.
Back to our bat-and-ball problem. If you thought the answer was 10 cents, that was your System 1 speaking. If you slowed down and deliberated, you realized the correct answer is 5 cents, an answer that requires engaging System 2.
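The arithmetic, written out: if the ball costs b, the bat costs b + $1.00, and together they cost $1.10.

```latex
b + (b + 1.00) = 1.10
\;\Rightarrow\; 2b = 0.10
\;\Rightarrow\; b = 0.05
```

Check: a $0.05 ball and a $1.05 bat differ by exactly $1.00 and sum to $1.10, whereas the intuitive 10-cent answer would make the pair cost $1.20.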
How does one combat these behavioral vulnerabilities? Use checklists, listen to other team members, flatten the hierarchy, and become familiar with the many cognitive biases that influence us every day.