It is 1976. Brad Stuart is in his third year of medical school at Stanford, doing his first clinical rotation. He is assigned to examine an elderly man with advanced lymphoma. The patient is feeble and near death, his bone marrow eviscerated by cancer. The supervising oncologist has ordered a course of chemotherapy using a very toxic investigational drug. Stuart knows enough to feel certain that the treatment will kill the patient, and he does not believe the patient understands this. Like a buck private challenging a colonel, he appeals the decision, but a panel of doctors declines to intervene. Well, Stuart thinks, if it must be done, I will do it myself. He mixes the drug and administers it. The patient says, “That hurts!” A few days later, the man’s bed is empty. What happened? He bled into his brain and died during the night. Stuart leaves the room with his fists clenched.
To this day, he believes he killed the patient. “I walked out of that room and said, ‘There has got to be a better way than this,’ ” he told me recently. “I was appalled by how we care for—or, more accurately, fail to care about—people who are near the end of life. We literally treat them to death.”
Here is a puzzling fact: From 1970 until 2009, spending on health care in this country rose by an average of more than 9 percent a year, creating fiscal havoc. But in 2009, 2010, and 2011, health-care spending increased by less than 4 percent a year. What explains the slowdown? The recession surely had something to do with it. But several recent studies have found that the recession is not the whole story. One such study, by the Harvard University economists David Cutler and Nikhil Sahni, estimates that “structural changes” in our health-care system account for more than half of the slowdown.