In my last post, we talked about the need for hospitals to become high reliability organizations (HROs). Reducing medical errors to zero takes a huge commitment from both the hospital and the individuals who care for patients. Today, we'll look at a simple technique providers can use to add an extra margin of safety.
Decision Time
This post will focus on disposition, the process of releasing control of the patient. Depending on your discipline, that could mean discharging them or transferring their care to another provider, department or setting. It's a risky time, because for a little while afterward, you're still responsible for any harm the patient experiences.
Here's an example. A 45-year-old man presents to the emergency department (ED) with nonexertional chest pain and shortness of breath. His ECG shows some nonspecific changes, and his troponin test is negative. He's still having some atypical chest pain, so the physician orders a repeat ECG and troponin. The physician reads the second ECG quickly while walking to another patient's room. Afterwards, he checks the labs, sees that the troponin is negative and sends the patient home.
Sadly, the patient collapses and returns to the ED in full cardiac arrest. Peer review uncovers some concerning changes between the first and second ECGs. The doctor missed them because he never compared the two side by side.
Also, the repeat troponin was actually positive. In his rush, the doctor had glanced at the first troponin result, not the second one.
This tragedy was preventable. All of the necessary data was right in front of the physician. What was missing was a built-in safety mechanism to help the (imperfectly human) physician catch his own omissions and biases.
How We Decide
Unfortunately, errors aren't all that rare in medicine. Common examples include:
- Prescribing medication to which a patient is allergic
- Sending a patient home, only to discover later that labs or vital signs were missing or abnormal
- Going down the wrong diagnostic path
These errors often seem obvious in retrospect. During peer review, when all the data is presented side by side in a concise format, sometimes we can't believe we came to the conclusion we did. All the information was there, yet we overlooked key details or focused on the wrong symptoms.
To understand why this happens, let's look briefly at how we make decisions.
Healthcare providers are trained to recognize patterns and associations that either quickly lead us to a diagnosis or direct us down a certain decision-making tree. Certain indicators set off our internal alarms for sepsis, STEMI and stroke. This allows us to quickly determine what tests to order and what initial treatment to provide.
In other words, it helps get things started. The problem is that sometimes our critical thinking stops there.
Pattern recognition is arguably less useful today than it was a decade ago. Patients are more complex now: they're living longer with multiple chronic diseases and taking many more medications. Because of this complexity, the same disease can present in many different ways.
Another limitation of pattern recognition: it has a built-in margin of error that's unacceptable in an HRO. As I stated in my previous post, 99 percent safe is not safe enough in today's healthcare environment. And we'd all probably agree it's not the standard of care we'd want for ourselves and our loved ones.
Finally, pattern recognition can lead us down the wrong diagnostic road. Some common logic errors that interfere with medical decision-making:
- Cognitive bias. This is the human tendency to draw incorrect conclusions based on cognitive factors rather than evidence. (Wow, this really looks like chronic back pain to me.) While the hypothesis may be reasonable and provide a starting point for the evaluation, it's important to continue evaluating the evidence as it becomes available.
- Confirmation bias. The tendency to look specifically for evidence that confirms one's initial hypothesis. (This is probably chronic back pain, and the fact that they just helped a friend move proves it…)
- Anchoring bias. The tendency to fixate on specific features of a presentation too early in the diagnostic process. (Since this patient has a history of chronic back pain plus an instigating factor, I am done thinking.) When we fixate too early, we often miss or ignore other important data.
Building in Safeguards
Fortunately, there's a simple strategy we can use at the point of disposition to review the information, reconcile any differences, reduce bias and make sure the clinical picture is complete.
That strategy is called a cognitive pause, and it's the marriage of checklists and critical thinking. Taking a cognitive pause before dispositioning a patient allows you to review the data, challenge your own diagnosis and consider "can't miss" diagnoses.
Mark Jaben, MD, has written an excellent, detailed post about how to apply the cognitive pause in the ED. Basically, it's a two-part process that can be adapted to just about any setting:
- First, prior to disposition, stop and review all pertinent data:
  - History of present illness. Look for outlier symptoms. Has anything new come to light since the patient came in? Is there any evidence of multiple conditions?
  - Personal medical history. Did you note any high-risk conditions, medications or social history? If so, did you consider them in your evaluation and treatment?
  - Physical examination. Consider pertinent negatives and all positives, even if they don't initially seem important.
  - Vital signs. What were they at presentation? Have they changed?
  - Labs and radiology. Is everything back? Are there abnormal results?
  - Other notes. What additional information have nurses, paramedics and consultants provided? Did they catch anything you missed, or was there a change?
- Then, think and challenge your diagnosis:
  - How could your diagnosis be wrong? Is there anything that doesn't fit?
  - Are there feasible alternative diagnoses? Do you need more data to rule them out?
  - Does this patient have signs and symptoms that could represent a "can't miss" diagnosis like a vascular emergency, spinal catastrophe, malignant infection, chest emergency, internal hernia (after gastric bypass) or genitourinary emergency?
  - Do you need to do anything else?
So, cognitive pause is all about reviewing the data and looking at it in total, as you would in peer review. It seems simplistic, but in my experience, it's very useful for catching details we might otherwise overlook — especially when we're busy.
I recognize that while this sounds great in principle, it's a lot of information to gather and review. I've also found that EHRs aren't particularly helpful in consolidating the necessary information in one place, and it's time-consuming to flip from screen to screen during the process. One way I get around this is by creating my own paper form that I fill in as things develop throughout the patient's stay. It literally takes less than a minute per patient, and when it's time for disposition, I have all the information I need at my fingertips. I do this for every patient, and it has actually prevented me from sending people home with unrepeated, abnormal presenting vital signs or with very high lipase levels that took a long time to come back from the lab.
Parting Thoughts
As tough as it can be to accept, we healthcare providers are human. We make mistakes. It's important that we pay attention to patient safety and risk — and not just because it impacts hospital finances and our personal liability. We should care because it's the right thing to do.
Cognitive pause is a simple technique we can use to mitigate risk and prevent errors. If you think about it, this kind of process comes naturally. Remember the last time your family went on vacation? You probably took a little pause to make sure you had the wallets and passports and that the kids had packed their underwear. As with gas pumps and gun safeties, it's a simple mechanism that can make a huge difference when there's a crucial decision to be made.
[Image credit: "20130306-OC-RBN-3935" by U.S. Department of Agriculture, licensed under CC BY 2.0]