Behind the news cycle, however, there’s a deeper issue than what Walmart or McDonald’s pay their workers today. Americans are once again wrestling with what they fundamentally want from the social contract—the basic bargain most of us can expect from the economy throughout our lives.
A generation ago, the country’s social contract was premised on higher wages and reliable benefits, provided chiefly by employers. In recent decades, we’ve moved to a system where low wages are supposed to be made bearable by low consumer prices and a hodgepodge of government assistance programs. But as dissatisfaction with this arrangement has grown, it is time to look back at how we got here and imagine what the next stage of the social contract might be.
The story of the modern social contract can be divided into two parts, with the first beginning in the aftermath of the Great Depression. The New Deal era of the 1930s through the 1970s was largely defined by high and rising wages, which were pushed up by strong unions, limited global competition, low energy and commodity prices, and more stringent regulations on businesses. At the same time, the ability to automate and innovate in the dominant manufacturing sector made it possible to offer workers high pay while keeping prices on consumer goods low.
But the social contract didn’t just encompass paychecks. During the mid-century boom, many employers—led by industrial giants like General Motors and General Electric—acted as “welfare capitalists,” taking primary responsibility for providing benefits like pensions and health coverage to workers and their families. Part of the motivation was cultural: Before the notion of shareholder capitalism took root in the 1980s, companies viewed it as part of their mission to act in the interests of all of their stakeholders, including workers and their communities, rather than in the interests of investors alone. But companies also favored the arrangement because providing benefits to workers directly gave them some leverage against labor unions. Ultimately, the welfare-capitalist social contract became the norm.
Starting in the 1980s, however, the social contract underwent a profound change. Deregulation of industry, growing global competition, and the rising cost and volatility of raw materials all led companies to move away from the New Deal-era consensus. In its place grew what we term the “low-wage social contract,” which has dominated ever since.
After the New Deal, a Worse Deal
The low-wage social contract seeks to balance poor private sector pay with cheap consumer goods, low taxes, and government subsidies that boost after-tax incomes. What does this mean in practice? Cheap imports from countries like China are one big part of it, as are policies like the Earned Income Tax Credit and Child Tax Credit that allow Washington to supplement low-income workers’ pay through the tax code.
Proponents of the low-wage social contract on both the left and the right have argued that the combination of inexpensive goods and low taxes should give consumers more spending power than they would have in a high-wage, high-price economy. In a famous paper entitled “Wal-Mart: A Progressive Success Story,” Jason Furman, now Chairman of the President’s Council of Economic Advisers, argued that the low-wage model actually made low-income consumers better off overall.
For many, though, the bargain has clearly failed. It is true that tax credits and cheap goods have boosted the standard of living for otherwise impoverished workers. Yet, according to the Census Bureau’s Supplemental Poverty Measure, which accounts for wage subsidies as well as expenses like taxes and out-of-pocket medical costs, almost 10 percent of the working population still lives in poverty. This includes roughly 5 million Americans who work full-time, year-round.
A key reason for this is that the low-wage social contract does little to help families in the areas where they need help most. Clothing, food, and other items found at Wal-Mart may be cheap for low-wage workers. But other necessary services—health care, daycare, eldercare, and college—have simultaneously become less affordable and more essential, as most mothers now work outside the home and the wage premium for a college degree remains high. In 1960, the average family spent about $12,000 in inflation-adjusted dollars on childcare, education, and healthcare over the course of 17 years raising a child. Four decades later, the average family spends almost $63,000 per child. Medical out-of-pocket expenses now push more people below the poverty line than tax credits can lift above it.