NYTimes
July 7, 2007
Op-Ed Contributor
Mental Malpractice
By JEROME GROOPMAN
Boston
ONE kind of new year comes for all of us in January — the one we celebrate with Champagne. But another, more stressful new year begins for doctors in July, when the new interns arrive in our emergency rooms, clinics and wards. Hospital personnel have always joked, “Don’t get sick in July,” since for decades the trainees were loosely supervised.
Today, most hospitals closely watch over interns. But at the start of this new medical year, a significant deficiency remains in the system: the way in which doctors are trained to think.
One of my first experiences with the problem came in 1983, during the first week in July as it happens, when my wife, Pam, also a doctor, and I were traveling to Boston from California with our son Steven, then 9 months old. Steve had developed a low-grade fever, had dark and loose stools and was irritable, refusing to nurse. Stopping in Connecticut to visit my in-laws, we consulted the town pediatrician. The doctor quickly dismissed Pam’s concerns. “You’re overanxious,” he told her. “Doctor-parents are like this.”
By the time we arrived in Boston, the baby was ashen and he was jerking his knees to his chest and wailing in pain. We rushed to the emergency room at Children’s Hospital, where a new surgical resident examined him, ordered X-rays and blood tests and made the correct diagnosis: an intussusception, an intestinal obstruction. It was a hectic night, and the novice doctor was being pulled in many directions. He told us there was no urgency to operate and left us alone with our flailing child.
I had worked one year in a research lab at this hospital and phoned the senior hematologist who had been my mentor. He contacted an attending surgeon, who came to the emergency room and whisked Steve to the operating room. “It was fortunate that we operated when we did,” the surgeon told us later. The intestine was at the point of bursting, spilling its contents into the abdomen, precipitating peritonitis and possibly shock.
Today at Children’s Hospital, you no longer need to know a powerful member of the staff. Every intern’s plan of treatment is validated by an attending doctor as part of the “patient safety movement.” Systematic checks and double checks also have been instituted to guard against logistical mistakes, like mixing up blood samples in the laboratory or labeling “left” as “right” on a limb X-ray.
Still, doctors get their diagnoses wrong 15 to 20 percent of the time, and half of these mistakes result in serious harm or even death. Such checks cannot fix the problem, because the majority of misdiagnoses result from errors in thinking, not logistics.
In analyzing patients’ problems, doctors look for typical signs and symptoms. Often after listening to a patient’s complaints for just 18 seconds, studies show, a doctor will interrupt, having already formulated his or her diagnosis. Too often, shortcuts lead in the wrong direction.
As a young doctor, I had an elderly patient who complained of discomfort under her breastbone. I examined her, performed several tests and quickly concluded that she had indigestion. The antacids I prescribed brought little relief, but my mind was so fixed that her persistent complaints sounded to me like a nail scratching a chalkboard.
Several weeks later, I was paged to the emergency room. The woman was in shock. The discomfort under her breastbone, it turned out, had been caused by a tear in her aorta. After she died, my colleagues commiserated, saying that a torn aorta can be hard to diagnose, that the woman was so old that she probably would not have survived surgery to repair the tear. But that provided cold comfort, and I have never forgotten, nor forgiven myself.
In some hospitals, mistakes are categorized as “E.T.” for errors in technique and “E.J.” for errors in judgment. Errors in technique might involve placing a needle too far into the chest and puncturing a lung or inserting a breathing tube into the esophagus instead of the trachea — mistakes that, with practice, doctors can learn to stop making.
Errors in judgment are not so easily avoided, because we have largely failed to learn anything about how we think. Modern clinical practice has incorporated DNA analysis to illuminate the causes of disease, robotics to facilitate operations in the brain and computers to refine M.R.I. images, but we have paid scant attention to the emerging science of cognitive psychology, which could help us explore how we make decisions.
This science has grown from the work of Amos Tversky and Daniel Kahneman, who some three decades ago began a series of experiments to examine how people make choices when they are uncertain. Economists have used their work to understand why people in the marketplace often make irrational decisions. People invest in a company because their relatives did in the past, for example, or they choose a fund manager simply because he outperformed the market two years in a row.
This growing body of research can illuminate many irrational aspects of medical decision-making, too. The snap judgments that doctors make, for example, can be understood as “anchoring errors”; the first symptoms anchor the doctor’s mind on an incorrect diagnosis. Doctors also fall into a cognitive trap known as “availability,” meaning that we too readily recall our most recent or dramatic clinical experiences and assume they correspond to a new patient’s problem.
We make “affective” errors, too, letting our feelings color our thinking. Such feelings may be drawn from stereotypes — the Connecticut pediatrician casting my wife as overanxious or my viewing my elderly patient as a chronic complainer — or they may be excessively positive. Too much empathy may keep a doctor from performing an uncomfortable procedure that is vital to making the correct diagnosis.
I have started teaching these concepts of cognitive psychology in continuing medical education courses, and recently used my misdiagnosis of the torn aorta to illustrate a common thinking trap. My wife, Pam, has introduced fourth-year medical students at our hospital to the cognitive detours doctors commonly take. But such instruction needs to be widespread. In classes and on hospital rounds, medical schools and hospitals should teach doctors why some diagnoses succeed and why some fail. And as part of the assessment of clinical competency for obtaining a license, doctors should be expected to demonstrate their fluency in the application of cognitive science, as they are required to do in other sciences.
Once we are schooled in the way we think, we will also be better able to answer questions from patients and their families about how we arrive at our diagnoses. And that may make everyone more confident about visiting a clinic or a hospital in July.
Jerome Groopman, a professor of medicine at Harvard, is the author, most recently, of “How Doctors Think.”