The only way to close the budget deficit is to close the jobs deficit
It's State of the Union season, which means it's time for the usual suspects to tell President Obama to "go big" on the deficit. Never mind that jobs, not the deficit, top voters' list of priorities, or that austerity has failed everywhere it's been tried recently (including here). It's always a good time to lament the lack of bipartisan golf-playing and call for a grand bargain.
But what exactly makes a bargain grand in Washington? It's not just a matter of trading spending cuts for higher taxes. If it were, the combination of the sequester and the fiscal-cliff tax deal would count. No, it has to be a specific kind of spending cut: a cut to social insurance. That's what Obama has offered with chained CPI, which cuts Social Security and raises taxes by using a lower measure of inflation to calculate benefits and brackets, but Republicans and centrist pundits don't think that's enough. They want Obama to increase the Medicare eligibility age from 65 to 67 too. Now, this sounds like the kind of "painful choice" that will put us on the path to fiscal sustainability, but it's not. The Congressional Budget Office figures it would save only about $150 billion over a decade, while, as Matthew Yglesias of Slate points out, costing patients twice that much. (If every state implements Obamacare's Medicaid expansion, it might not be regressive, just wasteful.)
In other words, it's inefficient savings that wouldn't even save all that much.
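To make the chained-CPI mechanics concrete, here is a back-of-the-envelope sketch. The 2.0 and 1.7 percent adjustment rates are illustrative assumptions, not figures from this piece; chained CPI is commonly estimated to run roughly 0.3 percentage points below the standard CPI:

```python
# Hypothetical cost-of-living adjustment rates. The ~0.3-point gap is the
# commonly cited difference between chained CPI and the standard CPI.
standard_cpi = 0.020
chained_cpi = 0.017

years = 20
benefit_standard = (1 + standard_cpi) ** years
benefit_chained = (1 + chained_cpi) ** years
cut = 1 - benefit_chained / benefit_standard
print(f"After {years} years of chained CPI, benefits are {cut:.1%} lower")
```

Small annual differences compound: under these assumptions, a 0.3-point-lower adjustment leaves a retiree's check roughly 6 percent smaller after two decades, which is why switching to chained CPI counts as a benefit cut.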
But wait. What are we even talking about? Are we worried about today's deficit or tomorrow's deficit? Today's deficit is about unemployment, full stop. Tomorrow's deficit is about rising healthcare costs amidst an aging society. These problems have nothing to do with each other. Which one are we trying to solve right now?
Okay, wait again. I can hear you saying, "But the deficit is about too much spending, not too much unemployment." And that brings us to the most important chart about the deficit you'll ever see. As Joe Weisenthal of Business Insider points out with the graph below of (inverted) surplus-or-deficit-as-a-share-of-GDP and unemployment, there's historically been a pretty strong correlation between them. Whether unemployment spikes or recedes, deficits follow.
(Note: The blue line shows the surplus-or-deficit-as-a-share-of-GDP, inverted, and the red line shows the unemployment rate.)
Unemployment isn't just a human disaster. It's a fiscal one too. Higher unemployment means lower tax revenue and higher spending on safety-net programs like food stamps -- that is, bigger deficits. And that means bringing down unemployment is the only way to bring down the deficit. Trying to slash the deficit during a depression -- in other words, a liquidity trap -- will only make unemployment worse, and hence leave the deficit little, if at all, better (and perhaps worse). This is hardly a novel insight. As Mike Konczal of the Roosevelt Institute discovered, John Maynard Keynes made this point all the way back in 1933, when he said policymakers just need to "look after unemployment, and the Budget will look after itself."
In other words, unemployment hawks are the real deficit hawks.
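The fiscal mechanics can be sketched with a toy calculation. Every number below is an illustrative assumption, not a figure from the article; the 0.5-percent-of-GDP sensitivity is a common rule of thumb for how much each point of unemployment widens the deficit:

```python
# Toy model: how much of the deficit is just the slump?
gdp = 16_000                 # assumed U.S. GDP, in billions of dollars
full_employment_rate = 5.5   # assumed "normal" unemployment rate, percent
actual_rate = 8.0            # assumed actual unemployment rate, percent

# Rule-of-thumb assumption: each point of excess unemployment widens the
# deficit by ~0.5% of GDP (lower revenue plus higher safety-net spending).
sensitivity = 0.5

cyclical_deficit = gdp * (sensitivity / 100) * (actual_rate - full_employment_rate)
print(f"Cyclical deficit: ~${cyclical_deficit:.0f} billion")
```

Under these assumed numbers, roughly $200 billion of the deficit disappears on its own if unemployment simply returns to normal -- the sense in which looking after unemployment lets the budget look after itself.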
There's an irony here. The people who care about the deficit in the long run want to increase it in the short run. (As Brad DeLong asks, who said Keynes didn't care about the long run?) More infrastructure spending, more payroll tax cuts, and more debt writedowns and refinancings are the best ways to put people back to work now that the Fed is doing about as much as it's going to do (though it should do more). All of those things mean bigger deficits today, but bigger deficits today are worth a recovery tomorrow.
And no, austerity would not be some kind of magical elixir -- a stimulant, if you will -- for "confidence". With interest rates stuck at zero, austerity has only hurt growth wherever it's been tried the past few years. The evidence on this from Europe is quite clear, but here's some more, from our side of the pond: Atif Mian of Princeton and Amir Sufi of the University of Chicago recently looked at state-level data in the U.S., and found that too little aggregate demand, not too much uncertainty, is what's holding the economy back today. In other words, businesses are worried about where their customers are going to come from, not about where their taxes are going to go. Trying to cut our way to confidence won't help when confidence isn't the problem. It will only make our real problem -- too little demand -- worse.
That doesn't mean we shouldn't worry about long-term healthcare costs. It's just not clear how much we should worry. As former OMB director Peter Orszag points out, national healthcare inflation slowed to 3.8 percent in 2012 after increasing by more than 10 percent annually for much of the preceding decade. Now, as Annie Lowrey of The New York Times explains, it's something of a mystery what is going on here -- is this slowdown just due to the Great Recession, or is it something else? -- but the takeaway is that we have a bit more time than we thought to figure out how to keep bending the cost curve.
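A quick way to see why the slowdown matters is doubling time: how long until health spending doubles at each growth rate? A sketch, treating the 10 percent and 3.8 percent figures as steady annual growth rates, which is a simplification:

```python
import math

def doubling_time(rate):
    """Years for spending to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + rate)

fast = doubling_time(0.10)   # the >10%-a-year era
slow = doubling_time(0.038)  # 2012's 3.8% pace
print(f"At 10%: doubles in {fast:.1f} years; at 3.8%: {slow:.1f} years")
```

At the old pace, health spending doubled roughly every 7 years; at the new pace, closer to every 19 -- the extra breathing room the cost-curve debate now has.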
There are three, hardly mutually exclusive, endgames when it comes to containing healthcare costs: (1) the cost-controls in Obamacare, like IPAB, work; (2) the government uses Medicare's bargaining power to negotiate better prices from doctors and drug-makers; or (3) the government voucherizes Medicare, and hopes competition keeps prices down. This last option sounds great -- who doesn't like competition? -- but, as economist Kenneth Arrow famously argued, the healthcare market doesn't work like other markets. "Consumers" -- that is, patients -- don't exactly have the expertise to shop around for the best deal on, say, heart surgery. Nor do they decide what to pay for. Insurers do that. There's little empirical reason to expect big savings out of increased competition -- with plenty of potential downside if the vouchers don't turn out to be generous enough.
This is the debate over Medicare's future, not whether to increase the eligibility age. But it's tomorrow's debate. Today's debate is about what we can do to put people back to work. The former is hard enough without making it a precondition for solving the latter.
Jobs are the only thing that will make the state of our deficit better.
Three Atlantic staffers discuss “Home,” the second episode of the sixth season.
Every week for the sixth season of Game of Thrones, Christopher Orr, Spencer Kornhaber, and Lenika Cruz will be discussing new episodes of the HBO drama. Because no screeners are being made available to critics in advance this year, we'll be posting our thoughts in installments.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
For some, abandoning expensive urban centers would be a huge financial relief.
Neal Gabler has been a formative writer for me: His Winchell: Gossip, Power, and the Culture of Celebrity was one of the books that led me to think about leaving scholarship behind and write nonfiction instead, and Walt Disney: The Triumph of the American Imagination was the first book I reviewed as a freelance writer. To me, he exemplifies the best mix of intensive archival research and narrative kick.
So reading his recent essay, "The Secret Shame of Middle-Class Americans," was a gut punch: First, I learned about a role model of mine whose talent, in my opinion, should preclude him from financial woes. And, then, I was socked by narcissistic outrage: I, too, struggle with money! I, too, am a failing middle-class American! I, too, am a writer of nonfiction who should be better compensated!
Ted Cruz and John Kasich need big wins in the remaining primaries for a shot at the nomination—too bad most people have already made up their minds.
What a week to be a presidential candidate. Ted Cruz picked former rival Carly Fiorina to be his running mate. John Kasich horse-traded Indiana for two other states. And Donald Trump actually made a policy speech.
Bold moves, fellas! After all, this is usually when campaigns enter naptime; by May, both parties normally have a presumptive nominee and are busying themselves with chair arrangements at the conventions. But as 2016 has already shown, this is An Election Like None Other. If Cruz and Kasich want a chance at winning the nomination, they have to seriously wow voters in these last handful of primaries. They’re clearly taking their best shots.
But here’s the thing: Bold, decisive campaigning only works if there are enough voters left who actually care. There’s still wiggle room in Indiana, where polls say somewhere around 6 percent of voters are undecided. But if I were one of the candidates, I’d be worried about a different figure: the percentage of people who made up their minds more than a month before the election. That number looks decidedly worse.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Across the span of their lives, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, one in 9,395 people die in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. At life-long scales, one in 120 Americans die in an accident.
A pastor and a rabbi talk about kids, poop, and tearing down the patriarchy in institutional religion.
The Bible is a man’s book. It was mostly written by men, for men, and about men. The people who then interpreted the text have also been predominantly male.
No wonder there’s not much theology preoccupied with weird-colored poop and the best way to weather tantrums. Throughout history, childcare has largely been considered women’s work—and, by extension, not theologically serious.
Danya Ruttenberg—a Conservative rabbi whose book about parenting came out in April—disagrees. So does Bromleigh McCleneghan, a Chicago-area pastor and the author of a 2012 book about parenting and a forthcoming book about Christians and sex. Both women have made their careers in writing and ministry. But they’re also both moms, and they believe the work they do as parents doesn’t have to remain separate from the work they do as theologians.
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
The president’s unique approach to the White House Correspondents’ Dinner will surely be missed.
No U.S. President has been a better comedian than Barack Obama. It’s really that simple.
Now that doesn’t mean that some modern-day presidents couldn’t tell a joke. John F. Kennedy, Ronald Reagan, and Bill Clinton excelled at it. But Obama has transformed the way presidents use comedy—not just engaging in self-deprecation or playfully teasing his rivals, but turning his barbed wit on his opponents.
He puts that approach on display every year at the White House Correspondents’ Dinner. This annual tradition, which began in 1921 when 50 journalists (all men) gathered in Washington, D.C., has become a showcase for each president’s comedy chops. Some presidents have been bad, some have been good. Obama has been the best. He’s truly the killer comedian in chief.