Who Defines 'Medical Necessity'?
Many states require insurance to pay for seeing chiropractors. A few cover marriage counseling, and a handful reimburse for massage therapy. Increasingly, health experts rely on the political system to decide what should be reimbursable by insurers, Medicare, and Medicaid. The results haven't been promising, least of all for controlling the expanding definition of care.

Lounging on the beach one afternoon, my wife suggested that health insurers be required to reimburse women for breast reshaping after childbirth. After all, we treat delivery as a medical procedure and recommend breast-feeding for the health of infants. Why shouldn't fixing a "side effect" of this necessary biological activity—sagging breasts—also be deserving of insurance coverage? Reconstructive surgery after other medical procedures is reimbursed.
In my gut, something tells me breast reshaping isn't really healthcare. But why? We already reimburse for a broad variety of cosmetic procedures, usually to fix a congenital deformity, an injury, or the effects of a disease. And no one disputes that breast reconstruction after a mastectomy is a reimbursable procedure, even though its primary function is cosmetic.
We have a vague definition of medical necessity in the back of our minds: if the mastectomy was necessary, doesn't that mean the reconstruction is, too? Should we pay for prosthetic limbs only if they are functional, or are cosmetic attributes alone worthy of reimbursement? If cosmetic surgery helps a woman develop greater self-esteem or avoid postpartum blues, wouldn't it serve the same purpose as an antidepressant? And following that logic, shouldn't it be reimbursed just like a prescription?
Increasingly, health experts rely on the political system to answer the difficult question of what should be reimbursable by insurers, Medicare, and Medicaid, but the results haven't been promising in terms of consistency or principle, not to mention control over the expanding definition of care. The fifty states have imposed more than 2,000 mandates on health insurers—requirements to reimburse certain procedures—and the regulations required by the Affordable Care Act will add further mandates at the national level.
Many of these mandates cover treatments that used to be thought of as cosmetic, optional, or at the very least not medically necessary. In 2008, ten states required coverage for hair prostheses; thirteen for in vitro fertilization. Thirty-one mandated contraceptive reimbursement. Forty-six required reimbursement for the services of chiropractors; fourteen for marriage counselors; and four for massage therapists. Arizona mandated coverage of athletic training. The issue isn't whether any or all of these treatments are good or useful; the question is whether all of us should be required to help pay for those who want them.
Deepak Chopra has said that insurers should cover meditation classes: "If insurance companies paid for lifestyle-management classes, they would save huge sums of money." Almost every request for a new mandate claims it will save money, yet the amount we spend on care keeps rising. But Chopra's comment illustrates the fundamental principle we now apply to judging whether something should be reimbursed: not "Is it worth the money?" but "Is it good for us?"
The traditional understanding of healthcare is that people get sick and medicine provides a cure. Today, that order is often reversed. With society's willingness to pay for ever more care—a willingness demonstrated by the 45-year increase in our spending from $42 billion to more than $2.5 trillion—much of the innovation in healthcare is now about the simultaneous search for new treatments and new conditions that require those treatments. It's not that these new conditions are somehow fake illnesses. Rather, illness is increasingly recognized, and often only named, when a treatment becomes available.
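A quick back-of-the-envelope calculation suggests how steep that climb has been. Taking the $42 billion and $2.5 trillion endpoints at face value, and making no adjustment for inflation or population growth, the implied average growth rate is

$$\left(\frac{\$2{,}500\text{ billion}}{\$42\text{ billion}}\right)^{1/45} \approx 59.5^{1/45} \approx 1.095,$$

or roughly 9.5 percent a year, compounded over four and a half decades.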
Erectile-dysfunction (E.D.) medications have all the trappings of healthcare. They require prescriptions written by licensed physicians. They look like any other type of medicine, packaged in the iconic plastic prescription bottles. Medicare (and sometimes Medicaid) and many private insurers will cover E.D. drugs; Viagra and its competitors can legitimately be expensed against tax-advantaged flexible spending and health savings accounts.
Viagra is a classic example of why we seem to need more healthcare even as we get healthier. Before the treatment was available, most male impotence was seen as a consequence of age. Don’t get me wrong: Improving the sex lives of older males is a clear social good. But when we first decided to subsidize all healthcare expenses, would we have considered this problem a health issue?
As we've expanded our willingness to pay for care—through private actions and government support—healthcare as an industry has met the challenge. It's proved able to absorb our trillions in additional dollars by charging higher prices, convincing us that more expensive options provide better results, and expanding our definitions of "need." In other words, healthcare has done what any industry does to increase its market and revenue base in the face of rising consumer demand.
But healthcare as an industry isn't quite like consumer products or automobiles or food. Sure, Procter & Gamble, Ford, and General Mills try to grow by raising prices, introducing new and improved versions of existing products, and extending their product lines. But they must do so in a constant give-and-take with the consumer, overcoming natural resistance to spending more money. What makes healthcare unique is the absence of that consumer from the equation. So healthcare companies can raise prices, introduce "better" products, and expand the definition of what your health requires without the typical consumer resistance, and without needing to prove that a new product is worth its higher price.
The most important strategy of the healthcare industry has been to endlessly increase our demand for healthcare. To maintain its access to the most generous of customers—private insurers, Medicare, and Medicaid—the healthcare industry must convince us that its services fulfill genuine needs, not merely the wants that all other goods and services satisfy. And once a treatment is considered a need, how can those customers possibly argue it isn't worth paying for?
The precise definition of a chronic disease varies, but these ailments are usually identified as long-lasting, noncontagious, and resistant to cure. Chronic diseases range from the primarily annoying (hay fever) to the frequently debilitating (arthritis, diabetes, schizophrenia) to the potentially fatal (heart disease, cancer).
Increasingly, these ailments are referred to as chronic "conditions" rather than "diseases," especially in discussions of how quickly they are spreading. Roughly half of American adults have been diagnosed with at least one chronic condition; a 2007 report suggests that the number of diagnoses will increase by 42 percent over the next fifteen years. Some of this rapid growth of diagnosed chronic conditions relates to the aging of our population: the average American over 65 has at least two identified conditions. But even among younger people—in fact, in every measured age group—chronic conditions are flourishing.
On the surface, this epidemic of chronic conditions makes little sense. We smoke less, drink less, and work in less physically stressful jobs. Seniors are entering retirement healthier than any previous generation. How is it possible that we're so much healthier yet have so many more chronic conditions?
I suspect one hint lies in the greater use of the word "condition." Increasingly, we've come to describe a broad range of symptomless abnormalities and markers of potential disease as chronic conditions, especially if there is a treatment to manage them. As with the introduction of E.D. as a diagnosis, a growing ability to treat—specifically, to manage health issues without curing them—drives an expanding definition of illness.
For example, hypertension is widely recognized as one of the most serious chronic conditions, often leading to heart disease and strokes. It was first recognized in the nineteenth century, but effective treatment is more recent. The spread of blood pressure testing and the introduction of new drugs have allowed hypertension to be managed medically, undoubtedly contributing to the sizable declines in heart attacks and strokes over the past fifty years. But our ability to treat high blood pressure medically has led to an expansion of its diagnosis: nearly a third of American adults today have diagnosable hypertension.
According to a recent article in The Wall Street Journal, doctors have now identified an issue distinct from hypertension that they call prehypertension—a combination of elevated blood pressure and lifestyle risk factors that may lead directly to heart disease and strokes. Blood pressure in this range falls short of the usual threshold for diagnosing hypertension, so prehypertension is regarded as a separate condition. Doctors are still debating the optimal treatment for prehypertension: the traditionally recommended lifestyle changes (dieting, exercise, alcohol moderation, reduced salt intake) or the drugs usually prescribed for high blood pressure. The article notes that while many doctors prefer lifestyle changes because they have longer-term health benefits for patients, a 2006 study showed clear benefits of medication for prehypertension.
I myself am a typical example of this. Like an estimated one in four Americans over 45, I take a statin daily to control my cholesterol; an elevated level of LDL (or what's commonly known as "bad") cholesterol is often classified as a chronic condition. My cardiologist told me that changes in my diet—more fish, fruits, and vegetables; less ice cream, burgers, and doughnuts—might alone be sufficient to lower the LDL level, but he felt that the drug was a safe alternative. So I take the statin, and it works (at least in the sense of lowering my cholesterol; whether it will do anything for my risk of heart disease remains an open question).
Before the invention of statins, someone in my position would have needed to change his diet to reduce the risk of heart disease. The invention of statins provided not only a medical alternative but an alternative that others—through insurance—will help me pay for. My fellow insured Americans would not have been willing to subsidize more fish and vegetables in my grocery cart, but insurance requires them to share the cost of my pharmaceutical alternative.
It's worth thinking about my personal decision for a moment. When I chose a statin, society subsidized that choice at the point of purchase through the tax advantages afforded to insurance and to my health savings account. If I were a Medicare or Medicaid beneficiary, society's subsidy would have been more direct. Since we all ultimately pay for these subsidies, we should all care about the choices our policies encourage.
Many of us suffering from chronic conditions have very serious medical issues for which treatment has been essential to survival. And most of us may benefit from treatment: chronic-condition care isn't a fraud, and for some, treatment now could prevent larger costs down the road. But the very explosion of these diagnoses during a time of vastly improving health should challenge our concept of healthcare "need."
I don't believe in conspiracies, but it's worth pointing out that chronic conditions may provide the perfect business model for the healthcare industry. Healthcare providers now have the equivalent of cheap razors with expensive blades: you'll need to buy refills forever. Like a game of healthcare Whac-A-Mole, the better we get at care, the more conditions we will identify. The more illnesses we can cure, the more in need of treatment we will be. Chronic conditions will continue to replace curable ones until we all require medical management of one form or another.
No one was conspiring to broaden healthcare's reach—the industry has branched out, expanding its product line from the urgent treatment of illness to the much broader management of well-being and comfort. Redefining healthcare was a gradual and disaggregated process, the rational response of thousands of actors—from doctors and hospitals to drug companies and device manufacturers—to the unique incentives presented in our system of healthcare. And for the most part, we still don't see it.
This post is adapted from David Goldhill's Catastrophic Care: Why Everything We Think We Know About Health Care Is Wrong.