One side effect of “the end of babies”—or, less dramatically, the steady decline in fertility rates around the world—is that today’s parents spend more time and money on the few kids they do have.
In the United States, per-child spending doubled from the 1970s to the 2000s, according to a 2013 paper by Sabino Kornrich of the University of Sydney and Frank Furstenberg of the University of Pennsylvania. Parents spent more on education, toys, and games. But nothing grew faster than per-child spending on child care, which increased by a factor of 21—or approximately 2,000 percent—in those 40 years.
Although wrapping your head around 2,000 percent growth might be difficult, the underlying cause isn’t so mysterious. As more women entered the labor force in the late 20th century, the work of caring for infants moved from the unpaid world of stay-at-home parents to the world of salaried labor. The 1970s and ’80s—the two decades when the female labor-force participation rate grew the fastest—also saw the greatest acceleration in child-care spending, according to Kornrich and Furstenberg. Raising young children is work—and it always has been work—but the rise of dual-earner households has forced more families to recognize this work with their wallets.
But child-care spending is unlike other spending. By some measures, its price is rising faster than that of almost any other consumer good or service the government tracks. The Census Bureau has found that child-care expenditures rose more than 40 percent from 1990 to 2011, during a period when middle-class wages stagnated. Since the 1990s, child-care costs have grown twice as fast as overall inflation. In California, the cost of a typical day-care center is now equal to almost half of the median income of a single mother.
Pick whatever source and statistic you like, because they all point to the same conclusion: Child care in America has become ludicrously expensive. The average cost of a full-time child-care program in the U.S. is now $16,000 a year—and more, in some states, than tuition at a flagship university.
What the hell is going on? And what should we do about it?
There are three broad reasons American child care now costs the same as buying a brand-new Hyundai Elantra every year.
First, although child-care workers aren’t expensive on an hourly basis—their median hourly wage is less than that of non-farm-animal caretakers and janitors—labor is the biggest line item for child-care facilities. Unlike, say, car companies, they can’t cut spending by moving labor to poorer countries or by replacing human workers with machines. Like health care and education, child care requires lots of domestic salaries, which means that its costs will consistently rise faster than overall inflation.
Second, the industry is highly regulated—perhaps reasonably so, given the vulnerability of the clientele—and regulation is a key driver of child-care costs. As Jordan Weissmann has reported in The Atlantic, states with strict labor laws tend to have the most expensive facilities. In Massachusetts, which requires one caregiver for every three infants, the average annual cost is more than $16,000. In Mississippi, which allows a one-to-five ratio, the cost is less than $5,000. Thanks to high turnover rates—a result of those low wages—companies have to constantly train new workers to meet regulatory standards. Other costs include insurance to cover damage to the property and worker injuries, as well as legal fees to deal with inevitable parent lawsuits.
Finally, there’s the real estate. The most expensive child-care facilities tend to be situated near high-income neighborhoods or in commercial districts, where the rents are high. And they can’t downsize in a pinch, because most states require them to have ample square footage for each kid.
The state of American child care might be defensible if it were expensive and high-quality—or if it were crummy but cheap.
Instead, the U.S. has the worst of both worlds: Cadillac prices for an Edsel product. The typical family paying for any child care spends about 10 percent of their income on it, far more than in most similarly rich countries. But American day care is a shambles. “The overall quality is wildly uneven and barely monitored, and at the lower end, it’s Dickensian,” the health-care writer Jonathan Cohn wrote in 2013. A 2007 review by the National Institute of Child Health and Human Development found that only one in 10 facilities offered “high-quality” care.
As the need for day-care options becomes more severe, some private employers, such as Patagonia, Apple, and Google, are stepping in to offer day-care centers for employees or to pay for “backup child care” if an employee’s first option falls through. New early-childhood startups such as Vivvi offer employer-sponsored child care. And Wonderschool, an “Airbnb for daycare,” helps neighborhoods launch child-care centers in people’s homes.
While it’s admirable for companies to fill the day-care vacuum, the absence of a national solution is an indictment of American policy. Neuroscientists and psychologists have established that the first five years of a child’s life are crucial for the development of logic and language skills. Early education has profound effects on both these cognitive skills and “noncognitive” skills, such as grit, teamwork, and emotional health. But these academic findings haven’t translated into policy, at least not in the U.S. Several European nations, such as France and Denmark, spend three to five times more than America on their young children’s care and education.
There is a deep disconnect in the way the U.S. conceives of its obligation to children. Most Americans accept—even demand—the public subsidy of education from the moment kids turn 5 and enter kindergarten to the day they graduate from a state university or community college. But from birth to the fifth birthday, children are on their own—or, more precisely, their parents are. This arrangement is plainly weird: Parents must bear the highest burdens of child-rearing when they are younger, typically poorer, and less established in their careers.
In the politics-and-policy world, some are starting to argue that the U.S. desperately needs a comprehensive, research-based approach to caring for young Americans before they turn 5—a First Five Years policy. For example, the People’s Policy Project, a left-wing think tank, has proposed a bundle of early-childhood policies that includes free health care, a child allowance of $300 a month, and a free spot in a public child-care center. (Parents could also receive a direct home-child-care benefit, if they preferred.) Several Democratic presidential candidates have also embraced elements of a First Five Years policy. Elizabeth Warren, for instance, has proposed to spend nearly $2 trillion on a national child-care system.
One simple reason Washington should play a bigger role in child care is that the benefits of early-childhood care and education are so large—and accrue over such a long period of time—that the only institution big enough to capture the upside is the federal government. In 2015, the Council of Economic Advisers wrote that every $1 spent on early-childhood education results in roughly $8.60 of societal benefits, “about half of which comes from increased earnings for children when they grow up.” Similarly, a 2019 Harvard study of dozens of U.S. policies concluded that “direct investments in low-income children’s health and education” have historically had the biggest bang per buck.
There are two broad criticisms of federally sponsored child care. The cultural critique is that by stepping in to play the role of mom and dad, the state would weaken bonds between parents and their children. The rejoinder here is easy: America’s infants are already suffering the effects of insufficient care. Most of the achievement gap between black and white American students is in place by kindergarten. Meanwhile, dozens of studies of preschool programs since the 1960s have shown that early-childhood education can cut the black-white kindergarten achievement gap in half.
The more policy-focused critique is that establishing a national system to carefully watch nearly 10 million tots under the age of 5 would be a logistical hellscape. How would federal, state, and local governments hire millions of caretakers in an economy with 3.5 percent unemployment? Where would they live? “Increased immigration,” you might answer, “and in new affordable housing.” But building a high-quality national caretaking workforce will take years, and shoddy national day care might be worse than the alternative.
An analysis of Quebec’s effort to expand access to cheap child care, for example, found mixed results. Its programs succeeded in raising the labor-force participation rate of mothers without breaking the bank for taxpayers. But young Canadians who were eligible for the program experienced, as teenagers, “a significant worsening in self-reported health and in life satisfaction” relative to Canadians from other provinces. So, did the Quebec child-care experiment “work”? Yes, for parents and public financing. Perhaps not for the kids.
Despite these challenges, the case for an expanded federal role in child care is strong. Spending on young children is more like infrastructure than Social Security. It’s not just a check or a transfer motivated by mere decency, but rather a savvy investment that returns its cost in the form of taxes and social benefits. The deep irony of the high cost of U.S. child care is that the very thing that is bankrupting parents today should represent, to the federal government, a grand-slam investment in the country’s future. Can U.S. families afford to adequately care for their own children? is a great question. But there’s an even better one: Can the U.S. afford not to?