Teaching Doctors How to Think
Clinical decision-making will always be imperfect. But there are ways to make it better.
"He who thinks he knows doesn't know. He who knows that he doesn't know, knows." -- Joseph Campbell
I have made clinical errors, also known as mistakes, at various points in my 40-year career. Fortunately, I don't think there have been many. One of them resulted in a long-settled malpractice suit in which six different neurologists, including me, missed the diagnosis of a rare disease. Presented with the same facts, I admit that I might make the same mistake again. I classify my mistakes into three broad categories:
- Mistakes that didn't cause the patient any harm.
- Mistakes that resulted in serious problems.
- Mistakes I still don't know about, because I either never recognized the error or the patient went someplace else.
All doctors make mistakes, because it is impossible for an individual to be perfect - any endeavor that involves humans will involve errors. The man who has the wrong leg removed in surgery makes the headlines of the six o'clock news, but the larger problem resides in the 10 to 15 percent of cases in which the doctor fails to make the correct diagnosis.
I have always taken pride in the fact that I can trust my clinical judgment, almost always making the right decision at the right time. I sometimes get frustrated watching physicians paralyzed by their indecision. But an article in the New England Journal of Medicine last month has forced me to reconsider my decision-making process. Dr. Pat Croskerry of Dalhousie University in Canada explains that most of our everyday thinking is flawed, and that doctors are no different from the average person. The problem is not a lack of knowledge (15 years of higher education followed by continuing education requirements take care of that end). The problem lies in the manner in which we approach "clinical thinking."
There are two major ways in which we process information: "intuitive (type I)" and "analytic (type II)." The intuitive approach is automatic and happens at an unconscious level. Croskerry describes this as the "Augenblick diagnosis," or that which is made in the blink of an eye. You see it on television all the time. The narcotic-popping curmudgeon Dr. House is a great example. No one can figure out what is wrong with the patient, but, through the blur of his own over-medicated psyche, Dr. House instantly makes the rare diagnosis, and the day, if not the patient, is saved.
There is a real danger in thinking this way: zeroing in on a specific diagnosis or problem and failing to consider other possibilities. The fact is that most physicians who trust their intuition are right most of the time. The vast majority of people coming to my office with a headache will have migraine headaches. My bias will be that you most likely suffer from migraine headaches. I will be right most of the time, but not always - and this automatic, unconscious mode leaves me vulnerable to making a mistake.
The other mode of clinical thinking is type II, the analytic process. This is a conscious, slower, and more deliberate process that is usually more reliable than the intuitive one. Here, we take time to analyze all of the information, order confirmatory tests, consult with colleagues, and consider all of the possibilities. Although reliable, it requires a great deal of resources, such as CT and MRI scans, coronary angiography, and numerous vials of blood. In truth, it is just not practical for every patient. We must trust our clinical, intuitive judgment because we cannot order a nuclear cardiac scan or coronary angiogram on every patient with chest pain. However, I know many physicians who order far too many tests. Some claim it is to protect themselves from a malpractice suit, while others just don't want to miss a diagnosis.
But where is the correct balance between using one's "intuitive" clinical judgment and ordering too many tests under the banner of being an "analytic" diagnostician? It would be inappropriate for me to order an MRI scan of the brain on every patient with a headache. But I also must get off the autopilot of intuitive thinking and reexamine how I evaluate patients. This is more difficult than it sounds, because we are not born critical thinkers who can turn off our unconscious intuitive reactions and analyze a situation in a timely manner. Dr. Croskerry suggests that medical schools and postgraduate medical education need to teach critical thinking as part of their formal curriculum. Doctors must recognize when biases creep into their decision making and learn to move from the intuitive mode to the analytic mode. This is the elusive "sweet spot" of critical thinking.
Even if I find the balance between intuitive and analytic thinking, I will still make an occasional error and miss a more serious problem, though it should happen less frequently. As hard as it is to accept, laypeople have to understand that a bad outcome does not always equal malpractice, and that clinical thinking is an imperfect cognitive process.
Dr. Croskerry believes that "all clinicians should develop the habit of conducting regular and frequent surveillance of their intuitive behavior. To paraphrase Socrates, the unexamined thought is not worth thinking."