One of the arguments against letting Medicare bargain down payment rates to below average cost is that providers--especially hospitals--simply shift those costs to someone else.
Proponents of this theory tend to overstate the case, implying a nearly 1-for-1 transfer. The economics are rather more complicated than that. It is indisputably true that Medicare, on average, pays less than the average cost of caring for its patients; it is practically a point of pride with the program. Conservatives often argue that this must mean someone else is picking up the slack. But this is not quite true. The average cost of caring for patients is not the marginal cost of caring for them.
Say a hospital has fixed costs of $1 million and sees 10,000 patients a year. That means each patient has to contribute $100 towards fixed costs, plus the entire marginal cost of caring for their individual case. Now imagine that you add 1,000 new patients, but because they all use a powerful health plan that drives a very hard bargain--call it "CareMed"--they only contribute $50 towards fixed costs. Does that mean the other patients are getting shortchanged?
On the contrary! The new patients contribute $50,000 towards fixed costs, cutting every existing patient's share from $100 to $95. It's more complicated in the real world, of course, but it's still a useful model to think about. So how much you think Medicare is causing cost-shifting depends in part on whether you think Medicare patients are consuming a lot more services than they otherwise would. Obviously, private patients would spend even less if Medicare covered a higher percentage of the average cost of treating its patients (it's in the mid-90% range). But that doesn't mean you're actually worse off than you would be in a world without Medicare and its devilish bargains. Besides, you'd just have to give that money back in tax dollars.
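The toy model above can be sketched in a few lines of Python. The numbers and the "CareMed" plan are the hypotheticals from the text, and the function name is mine:

```python
# Toy model of fixed-cost allocation, assuming (as the text does) that
# the hospital must recover all of its fixed costs from patients.

def fixed_cost_share(fixed_costs, existing_patients,
                     new_patients=0, new_contribution=0.0):
    """Fixed-cost contribution required from each existing patient,
    after new patients have chipped in whatever their plan pays."""
    remaining = fixed_costs - new_patients * new_contribution
    return remaining / existing_patients

# Baseline: $1 million in fixed costs spread over 10,000 patients.
print(fixed_cost_share(1_000_000, 10_000))            # -> 100.0 each

# Add 1,000 "CareMed" patients paying only $50 toward fixed costs.
print(fixed_cost_share(1_000_000, 10_000, 1_000, 50))  # -> 95.0 each
```

The point of the sketch is just that the hard-bargaining patients lower, rather than raise, everyone else's fixed-cost burden--so long as they pay anything above marginal cost at all.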
Nonetheless, it's clear that there is some cost shifting taking place. But it's hard to figure out how much. Vivian Wu has apparently taken a stab at the question by analyzing what happened after the Balanced Budget Act of 1997, and concludes that yes, cost shifting occurs, but at a rate lower than industry claims:
Wu's main result is that on average prices paid to hospitals by private payers increase by 21 cents in response to each dollar reduction in public revenue. By way of comparison, this 21% rate of cost shift is about half of the lowest estimates produced by industry studies and is far below their common assumptions of 50% to 100%.
....The policy implications are clear. Wu doesn't state them, but I will. Within the range of variation studied by Wu, with respect to hospital payments, overall health costs can be reduced by 79 cents per dollar of Medicare payment reduction, the other 21 cents being shifted to the private sector. However, the more competitive the hospital market, the less the cost shift. For some hospitals in some markets Wu found cost shifting rates as low as 5%. Therefore, sound public policy would encourage greater competition among providers (wherever possible) in tandem with reductions in public payments. Doing both concurrently would reduce public health expenditures with minimal impact on private payments.
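Under the (strong) assumption that the shift rate stays constant, the quoted arithmetic is a one-line subtraction. A minimal sketch, with a function name of my own invention:

```python
# Cost-shift arithmetic from the quoted passage, assuming a constant
# shift rate per dollar cut -- Wu's estimate, not a law of nature.

def net_savings(medicare_cut, shift_rate=0.21):
    """System-wide savings from a Medicare payment cut, after private
    payers absorb shift_rate of every dollar cut."""
    shifted_to_private = medicare_cut * shift_rate
    return medicare_cut - shifted_to_private

print(net_savings(1.00))        # -> 0.79 saved per dollar cut
print(net_savings(1.00, 0.05))  # -> 0.95 in the most competitive markets
```

Whether that 21-cent rate actually holds outside the range Wu studied is exactly what the rest of this post disputes.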
Kevin Drum is very happy to hear this:
In other words, Lieberman is, at most, 21% right. There's a little bit of cost shifting, but the vast majority of payment reduction actually goes toward reducing payments. Hospitals might not like that, but why shouldn't the rest of us?
I am wont to prefer academic studies to any industry-funded study, and the BBA does provide a nice natural experiment. Nonetheless, this is emphatically not the right conclusion to draw.
For one thing, mindless trend extrapolation is bad, bad economics. The fact that we only saw cost-shifting of 21 cents on the dollar ten years ago does not mean that we can continue getting those sorts of results ad infinitum. That's why it's pretty easy to get a 5% discount in a jewelry store, and pretty hard to get an 80% discount. The larger the percentage of Medicare recipients in the patient mix, the higher the percentage of costs that hospitals will have to shift, because it will become harder and harder to make matching cuts.
Moreover, 1997 is a particularly bad year to look at, because the 1990s were a highly abnormal decade for healthcare prices. That was the decade of the managed care revolution, when insurance companies started using primary care physicians as gatekeepers to aggressively control costs. Problem: people hated it. People, in their position as voters, got state legislators to undo many of the changes. People, in their position as employees, lobbied their employers for more generous plans. People got what they wanted: faster health care cost inflation.
The Balanced Budget Act's provisions functioned very well in the context of abnormally low cost growth. When health care costs started rising again, this all fell apart, which is why we now have the annual ritual of repealing "automatic" cuts to Medicare physician reimbursement rates. Cost shifting is harder in an era of strict utilization controls than in one where other purchasers aren't cutting back.
The corollary is that the 79 cents on every dollar we saved back then were not just taken out of hospital profits; they came out of services. There's some evidence that hospital capital investment suffered as a result of the BBA, amid a generalized aging of hospital plant that seems to have been at least somewhat associated with the cost pressure on providers. Maybe you think that new imaging machines or bigger hospital rooms aren't worth what we pay for them. But their utility is not zero, which means that losing them is still a cost. Those 79 cents do not simply materialize out of thin air. We cannot coin free money by putting everyone in the country on Medicare reimbursement rates.
I'm willing to assume that the rate of cost shifting is well below what the industry claims. But I'd also say that 21 cents is a floor, not a ceiling.