On at least three occasions in 2007, surgeons at one Rhode Island hospital operated on the wrong side of their patients' heads. In one case, a resident neurosurgeon inserted a scalpel into the head of an 82-year-old patient. The surgeon noticed the error before reaching the skull and stitched up the wound, but the state health department fined the hospital $50,000.
This sort of error is not terrifically rare. Based on malpractice judgments and out-of-court settlements for things like operating on the wrong side of a patient, or on the wrong patient, or leaving a sponge or other surgical object inside of a patient, researchers at Johns Hopkins estimate that such errors—called "never events" by hospital risk managers—occur not never, but more than 4,000 times in the U.S. every year.
In recent years, the vogue solution for preventing this sort of error has been a seemingly simple one: checklists. The idea is that a mandatory pre-surgery checklist could prevent such mistakes, along with their associated (often massive) costs.
The idea became widely popular in 2009, after surgeon and Harvard professor Atul Gawande published a book called The Checklist Manifesto that, true to its title, implored doctors to use even basic checklists to avoid egregious errors of omission. And, ideally, all errors. It was based on research he published in the New England Journal of Medicine in January of the same year, which found that implementing a system of checklists to ensure basic safety standards is extremely effective. In fact, use of a straightforward 19-point checklist was able to decrease the rate of death in or after surgery by almost half.
Doctors often have many concerns competing for limited space in their immediate attention. Even in a procedure they have performed thousands of times, a surgeon can overlook a basic step like, for instance, double-checking that they are cutting on the correct side of a head. Most checklists also ensure that the necessary instruments, extra blood, and surgical equipment are on hand. They also lay the groundwork for good communication, asking that all members of the surgical team have introduced themselves, and that the surgeon has gone through the critical elements of the procedure with the team prior to cutting.
The extra steps that a checklist requires, Gawande evangelized, can make massive differences in mortality and complication rates.
And so it was believed, and decrees for use of such checklists went out across the land, and were also mandated or strongly encouraged internationally including in the United Kingdom and the Netherlands. Eighty-eight percent of Canadian hospitals now require checklists. The checklist train seemed poised to circle the globe until last week, when the New England Journal of Medicine (the same journal that minted Gawande's checklist meme) published new research that concluded, in a momentous twist of fate, that the checklist movement itself may be an error.
After implementing a checklist system at 101 hospitals in Ontario, Dr. David Urbach and colleagues monitored surgical errors and complications over a three-month period. 106,370 procedures later, the researchers concluded that the checklist implementation "was not associated with significant reductions in operative mortality or complications."
Urbach said the study is a reminder that ensuring patient safety is "not as easy as a checklist."
So then, have we been wrong about the effectiveness of checklists? Should surgeons skip the checklist before an operation? Does using one actually mean fewer deaths, and fewer accidental surgeries on the wrong side of a patient's head or on the wrong limb? Or is it a waste of time?
Gawande is confident that it is not, and he is not swayed by the new research. "I wish the study were better," he told me. "But it's very hard to conclude anything from this study."
For one, it only used three months of data. Implementing a checklist system effectively means changing a culture, Gawande says, which takes time. The Ontario trials also involved no team training and no tracking of how widely the checklist was used. "One thing we know," Gawande told me, "[is that] if you don't use it, it doesn't work. It's like running a drug trial without checking to see if patients actually took the drug."
Many of the Ontario hospitals' staffs said that they did comply with the checklist, but Gawande is skeptical. In a similar U.K. study, he said, hospitals began requiring surgical checklists, and almost all staff said that they had been using them. But when a third party monitored the surgeries, it found that critical elements of the checklist were followed in just 9 percent of cases. As Gawande put it, "resistance by surgical teams was widely evident."
Dr. Lucian Leape of the Harvard School of Public Health wrote in an accompanying editorial, similarly, that fully implementing a checklist system is difficult and takes time. Without measuring compliance from all of the medical staff, the research says next to nothing.
Longer, population-wide implementation of surgical checklists has been shown in large-scale studies to be feasible and effective. A Veterans Affairs program implemented a checklist at 74 hospitals and saw an 18 percent reduction in annual mortality. In 2008, Scotland began using a surgical checklist program, and death rates decreased by an average of 0.06 percent annually for the first three years, ultimately falling below 0.5 percent for the first time in the country's history.
While Leape seems to believe in the value of checklists, he says it's premature to go all the way to requiring their use. "Regulation works best when a practice of unquestioned value has become the norm," Leape wrote. "We are not there yet."
Dr. Thomas Weiser, a surgeon at Stanford, told the Canadian Globe and Mail that his concern about the new study is that it could "arm the naysayers and keep people from implementing the checklist or spreading the checklist in a meaningful way."
If you are a checklist naysayer, think twice before arming yourself.