Accidents happen, but they affect some professions more than others. Identifying why accidents happen, evaluating their aftermath and finding solutions to prevent them were the focus of the Monday session “Understanding Medical Accidents.”
Michael F. O’Connor, M.D., FCCM, a Professor in the Department of Anesthesia and Critical Care at the University of Chicago, said learning from and preventing medical accidents is difficult.
“In the old days, we used to say that when the writing in charts becomes long and legible, something bad happened. What you do every day is extremely hazardous,” Dr. O’Connor said. “Every time you inject someone with propofol, bad things can happen.”
Dr. O’Connor started the session by looking at the “body count” of medical accidents. Although studies in 1984 and 1992 reported roughly 100,000 deaths a year due to medical accidents, he suggested health care is faring better.
“Are we really doing nothing? According to the 2000 Institute of Medicine Report on Medical Errors, the death rate from medical errors has dropped to 25,000,” he said.
Still, he said, health care remains far behind other domains and continues to embrace obsolete ideas about how accidents happen. Further, health care uses a conflicted method for investigating medical accidents, allows political forces to shape the narrative of what happened and the appropriate response, and has learned little from bad outcomes.
Dr. O’Connor spoke from the perspective of the practitioner, whom he calls “the sharp end” because of his or her expertise, actions, and proximity to errors and their results. He also discussed different accident models, including the latent failure model, which is associated with incidents such as chemotherapy overdoses, drug interactions and pharmacy operations.
“As the sharp end, you are at the bedside generating care,” he said. “Unfortunately, you did not build the O.R., and you didn’t decide on the supplies or the staffing. But all of the success and failure falls to you.”
In the aftermath of an accident, Dr. O’Connor said, “hindsight bias” shapes our understanding of bad outcomes. This begins with pinpointing what happened in the hours after the incident and “preparing the accident story” weeks later. A formal investigation can take months or years; that’s when you find out what really happened, he said. Dr. O’Connor referenced the 1989 Exxon Valdez accident, in which the ship’s captain was accused of being intoxicated. By the end of its investigation, the NTSB report cited a number of other, equally important, causes of the accident.
“Lessons from the latent model are that all accidents have multiple causes,” he said. “No anesthesiologist ‘plans’ to do harm when they start a case. But hindsight bias has an incredibly powerful ability to shape our evaluations and make people at the sharp end look bad. It’s a huge problem.”
Unfortunately, accident prevention is equally complex. Some failures are less expensive to have than to prevent, he said. He pointed to a number of remedies, including understanding as much as possible, being skeptical of and learning from “first stories” following an accident, rejecting blame in favor of training, and empowering practitioners.
“Organizations create situations and environments where accidents are likely to happen and less likely to happen,” he said. “But practitioners are the most resilient in an organization.”