When most scientific studies hit the non-scientific media, people tend to latch on to one particular conclusion and spread it like wildfire. When the results of Oregon's recent experiment with Medicaid expansion came out yesterday, however, each person who wrote about it seemed to take a different approach, highlighting the interesting nuggets (and there are many in this study) that appealed to them the most. Check out some of the headlines used in articles about the study, and see if you can tell what the study actually concluded:
- Medicaid Expansion May Not Improve Health of Poor in U.S. [Bloomberg]
- Study Finds Health Care Use Rises With Expanded Medicaid [New York Times]
- Bad News for Obamacare [Slate]
- Depression rates for uninsured dropped with Medicaid coverage [The Washington Post]
- Oregon Health Study Shows No Significant Health Impacts from Joining Medicaid [The Daily Beast]
Remember, these headlines are all about the same study. However, most people—particularly conservatives who are opposed to the Affordable Care Act—focused on the first part of the authors' concluding statement in The New England Journal of Medicine, which reads: "This randomized, controlled study showed that Medicaid coverage generated no significant improvements in measured physical health outcomes in the first 2 years." (They mostly ignored the second part of the sentence, but we'll get to that a little later.)
In 2008, Oregon held a lottery to select 10,000 people to be added to its Medicaid program. Because the state selected winners by chance rather than by economic or health status (out of 90,000 people who applied), it offered researchers an extremely rare and valuable opportunity: a randomized controlled study of the effects of giving previously uninsured people access to health care. The Oregon Health Study Group signed up 12,000 people, half of whom had won the lottery and received Medicaid and half of whom had not, then followed them for two years to see what happened.
The study ultimately measured a variety of issues, but it's the concrete health outcomes that most intrigued observers, since research in this area is so sparse. As Megan McArdle pointed out in her lengthy breakdown of the study, measuring health insurance outcomes in any serious way is extremely difficult. It's one thing to say that 18,000 people (or 30,000 or 45,000) die every year without health insurance, but there's really no way to know what would have happened to them if they had been insured instead. That's why the random, controlled nature of the Oregon experiment was so crucial. A study like this had only been done once before, and that was a generation ago.
Studying health outcomes for 12,000 people has other challenges, too, since every patient has different needs and illnesses. Noting that one group of patients saw a rise in heart attacks won't tell you much about the effectiveness of early cancer treatments. What the Oregon study did look at was three very specific measures of health: blood pressure, cholesterol, and HbA1c ("a measure of diabetic blood sugar control"). They're simple, quantifiable, and can be managed through regular treatments and medications. One would also predict that regular access to health care can greatly affect these conditions.
The study, conducted by a team of health economists from across the country, monitored people for two years after they were added to Oregon's Medicaid rolls and found "no statistically significant effect" on those three specific blood measures. For many, then, the conclusion was obvious: giving uninsured people Medicaid doesn't magically make them healthier.
In other words, Obamacare was just subjected to a randomized control experiment — the gold standard! — and failed.— James Pethokoukis (@JimPethokoukis) May 1, 2013
Since a major part of Obamacare's plan to get everyone insured involves expanding Medicaid, some consider this an indictment of the whole program. The study "throws a stop sign in front of" it, according to the Cato Institute. McArdle implies that if the study had come out last year, as it was supposed to, it might have changed the whole 2012 election. It also gives fuel to people who already think Medicaid is a gigantic folly that's worse than no insurance at all. But blood tests aren't the only measure of health, and improved health was not the only justification given for passing Obamacare.
Another huge argument for universal health coverage is cost—if not for the government, then at least for the customers. The Oregon study showed a huge decrease in "catastrophic expenses," meaning medical bills that patients were unable to pay. Almost no one who has Medicaid goes bankrupt because of medical bills. Debt and out-of-pocket spending went down, people skipped out on fewer bills, and hospitals and doctors were paid what they were owed more often. Even though regular preventive services didn't decrease emergency room visits or hospital admissions, the people who did take advantage of their increased access were able to cover their costs. Even if there was no change in outcomes, that seems like a win for both patients and providers (since it is also likely to help control costs in the long run).
In addition, patients self-reported that they were happier and in better health than they had ever been, even if they hadn't been to the doctor yet. When they did go, "observed rates of depression" decreased 30 percent, with almost no increase in the number of people taking medication to treat depression. It's almost as if the mere fact of having health coverage acts as a placebo, making people feel healthier.
Remember the second part of that concluding statement mentioned earlier? After the "no significant improvement" clause, it added that Medicaid "did increase use of health care services, raise rates of diabetes detection and management, lower rates of depression, and reduce financial strain." That alone seems worth the effort, but critics of Obamacare still point to the "opportunity cost" of paying for people to have insurance. Fewer depressed and bankrupt people is nice, but is it worth such a massive expansion of the federal government? Maybe we should just give poor people cash and let them buy their own insurance. (The study doesn't even attempt to measure the difference between Medicaid and other insurance alternatives.) As McArdle puts it, "maybe what we were always expecting was a $1 trillion program to treat depression. Well, that's not how I remember it." Whether Obamacare is the best way to solve the problem is a much more complicated and tricky debate, but a study with such mixed results, no matter how well done, certainly shouldn't be the final answer against it.
Also, the health care gains in the study were found to be statistically insignificant—but not nonexistent. The study did show improvement in all three blood measures, but with only a few thousand people followed over just two years, the researchers could not be confident enough that those improvements weren't due to chance to declare a health care victory. (In other words, the study is large enough to be useful, but not large enough to definitively solve the case.)
There's also some debate about what the baseline for those improvements was. Was the blood pressure of patients already fairly normal, making it harder to show significant improvement? Would the results have been better if the study had run longer, or if the patients chosen had been sicker? Try to make sense of this Twitter argument if you want to see all the caveats and interpretations the data opens up.
And what happens if those same "insignificant" improvements are multiplied across the entire population of the U.S.? McArdle writes, by way of weighing the cost-benefit analysis, that "by putting 6,400 people onto Medicaid, we may have prevented as many as three strokes every five years, or as few as none." Okay, let's go conservative and say it was one prevented stroke for 6,400 people. Now put 48 million people on Medicaid. That's 7,500 strokes that don't happen. Now wait a decade and see how many lives are different. What would that be worth to taxpayers? That's really tough to say, but we know what it would be worth to the people who didn't end up debilitated by a stroke.
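The back-of-envelope extrapolation above can be checked in a few lines. This is only an illustration of the article's arithmetic, using McArdle's hedged figure (somewhere between zero and three prevented strokes per 6,400 enrollees over five years) and picking the conservative middle value of one; none of these numbers are precise estimates.

```python
# Sketch of the article's extrapolation. Assumption: one prevented
# stroke per 6,400 Medicaid enrollees over five years, the conservative
# middle of McArdle's zero-to-three range quoted above.
STUDY_GROUP_SIZE = 6_400             # enrollees in McArdle's figure
STROKES_PREVENTED_PER_GROUP = 1      # assumed conservative value
NATIONAL_ENROLLMENT = 48_000_000     # hypothetical nationwide expansion

groups = NATIONAL_ENROLLMENT / STUDY_GROUP_SIZE
national_strokes_prevented = int(groups * STROKES_PREVENTED_PER_GROUP)
print(national_strokes_prevented)  # 7500
```

Note how sensitive the result is to the assumed rate: at McArdle's upper bound of three strokes per group, the same arithmetic yields 22,500 prevented strokes; at her lower bound of zero, it yields none.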
And that's just one of the possible bad outcomes that can befall people. There are plenty of other chronic diseases and adverse outcomes that the study couldn't quantify. (And didn't really try to.)
For laymen, the headline of the Oregon study should be "possibly positive but inconclusive," not "had no effect."— Kevin Drum (@kdrum) May 2, 2013
There's no doubt the study should give Obamacare backers pause, if for no other reason than to ask whether the care Medicaid provides needs improvement. But in the end there simply isn't enough evidence to indict the whole program or destroy the idea that giving more people health insurance is good. To be fair, it also should not be used to prove that Obamacare is necessary, as some liberals might like to do. Even if some of the data is encouraging, the flaws in the study cut both ways.
Even the study's lead author, Harvard professor Katherine Baicker, admits that both sides will find ammunition in the conclusions. Baicker says the study "puts to rest the idea that Medicaid doesn’t help beneficiaries," then adds that "It also puts to rest the notion that expanding Medicaid will improve access to health-care by so much that you reduce chronic disease and save money." If both those things are true, it sounds like the study isn't putting much of anything to rest.
When someone tells you the Oregon Medicaid study proves the program doesn't work, ask them if they're giving up their health insurance.— Justin Wolfers (@justinwolfers) May 2, 2013
Unfortunately, we may never get the hard evidence needed to put a more definitive cap on the argument. In 2010, Oregon found the money to expand Medicaid even further, finally giving coverage to the people in the control group who were originally denied coverage. That effectively ended the experiment after just two years.
This article is from the archive of our partner The Wire.