The only way to close the budget deficit is to close the jobs deficit
It's State of the Union season, which means it's time for the usual suspects to tell President Obama to "go big" on the deficit. Never mind that jobs, not the deficit, top voters' list of priorities, or that austerity has failed everywhere it's been tried recently (including here). It's always a good time to lament the lack of bipartisan golf-playing and call for a grand bargain.
But what exactly makes a bargain grand in Washington? It's not just a matter of trading spending cuts for higher taxes. If it were, the combination of the sequester and the fiscal cliff tax deal would count. No, it has to be a specific kind of spending cut. It has to be a cut to social insurance. That's what Obama has offered with chained CPI, which cuts Social Security and raises taxes by using a lower measure of inflation to calculate benefits and brackets, but Republicans and centrist pundits don't think that's enough. They want Obama to increase the Medicare eligibility age from 65 to 67 too. Now, this sounds like the kind of "painful choice" that will put us on the path to fiscal sustainability, but it's not. The Congressional Budget Office figures it will only save about $150 billion over a decade, while, as Matthew Yglesias of Slate points out, costing patients twice that much. (If every state implements Obamacare's Medicaid expansion, it might not be regressive, just wasteful.)
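To see why a "lower measure of inflation" amounts to a benefit cut, it helps to watch the gap compound. The sketch below is purely illustrative: the starting benefit is hypothetical, and the quarter-point gap between the standard and chained cost-of-living adjustments is a commonly cited rough estimate, not an official figure.

```python
# Illustrative sketch: a slightly lower annual inflation adjustment
# compounds into a meaningfully smaller benefit over time.
# All numbers here are hypothetical, chosen for demonstration only.

def project_benefit(initial, annual_cola, years):
    """Compound an annual cost-of-living adjustment (COLA)."""
    benefit = initial
    for _ in range(years):
        benefit *= 1 + annual_cola
    return benefit

initial = 15_000  # hypothetical annual Social Security benefit

# Assumed COLAs: 2.5% under the standard measure, 2.25% under chained CPI
# (roughly the quarter-point gap analysts often cite).
standard = project_benefit(initial, 0.025, 20)
chained = project_benefit(initial, 0.0225, 20)

print(f"After 20 years: ${standard:,.0f} (standard) vs ${chained:,.0f} (chained), "
      f"a {(standard - chained) / standard:.1%} cut")
```

The point is the shape of the mechanism, not the exact numbers: because each year's adjustment builds on the last, a small difference in the measured inflation rate grows into a several-percent benefit cut for long-retired beneficiaries.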
In other words, it's inefficient savings that wouldn't even save all that much.
But wait. What are we even talking about? Are we worried about today's deficit or tomorrow's deficit? Today's deficit is about unemployment, full stop. Tomorrow's deficit is about rising healthcare costs amidst an aging society. These problems have nothing to do with each other. Which one are we trying to solve right now?
Okay, wait again. I can hear you saying, "But the deficit is about too much spending, not too much unemployment." And that brings us to the most important chart about the deficit you'll ever see. As Joe Weisenthal of Business Insider points out with the graph below of (inverted) surplus-or-deficit-as-a-share-of-GDP and unemployment, there's historically been a pretty tight correlation between them. Whether unemployment spikes or recedes, deficits follow.
(Note: The blue line shows the surplus-or-deficit-as-share-of-GDP inverted, and the red line shows the unemployment rate).
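The relationship the chart shows can be made concrete with a simple correlation calculation. The figures below are made up for demonstration, not actual federal budget or BLS data, but they mimic the pattern in the graph: when unemployment rises, the deficit's share of GDP rises with it.

```python
# Illustrative sketch of the chart's relationship: deficit-as-share-of-GDP
# tends to move with the unemployment rate. The series below are
# hypothetical numbers, not real budget or labor statistics.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical annual figures (positive = deficit, as % of GDP)
deficit_share = [2.0, 3.5, 9.8, 8.7, 8.4, 6.8]
unemployment = [4.6, 5.8, 9.3, 9.6, 8.9, 8.1]

print(f"correlation: {pearson(deficit_share, unemployment):.2f}")
```

A coefficient near 1 on series like these is the statistical version of the article's claim: the two lines track each other closely, which is why the deficit "follows" unemployment rather than moving independently of it.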
Unemployment isn't just a human disaster. It's a fiscal one too. Higher unemployment means lower tax revenue, and higher spending on safety net programs like food stamps -- that is, bigger deficits. And that means bringing down unemployment is the only way to bring down the deficit. Trying to slash the deficit during a depression -- in other words, a liquidity trap -- will only make unemployment worse, and hence leave the deficit little, if at all, better (and perhaps worse). This is hardly a novel insight. As Mike Konczal of the Roosevelt Institute discovered, John Maynard Keynes said as much all the way back in 1933, when he said policymakers just need to "look after unemployment, and the Budget will look after itself."
In other words, unemployment hawks are the real deficit hawks.
There's an irony here. The people who care about the deficit in the long run want to increase it in the short run. (As Brad DeLong asks, who said Keynes didn't care about the long run?) More infrastructure spending, more payroll tax cuts, and more debt writedowns and refinancings are the best ways to put people back to work now that the Fed is doing about as much as it's going to do (though it should do more). All of those things mean bigger deficits today, but bigger deficits today are worth a recovery tomorrow.
And no, austerity would not be some kind of magical elixir -- a stimulant, if you will -- for "confidence". With interest rates stuck at zero, austerity has only hurt growth wherever it's been tried the past few years. The evidence on this from Europe is quite clear, but here's some more, from our side of the pond: Atif Mian of Princeton and Amir Sufi of the University of Chicago recently looked at state-level data in the U.S., and found that too little aggregate demand, not too much uncertainty, is what's holding the economy back today. In other words, businesses are worried about where their customers are going to come from, not where their taxes are going to go. Trying to cut our way to confidence won't help when that isn't the problem. It will only make our real problem -- too little demand -- worse.
That doesn't mean we shouldn't worry about long-term healthcare costs. It's just not clear how much we should worry about them. As former OMB director Peter Orszag points out, national healthcare inflation slowed to 3.8 percent in 2012 after increasing by more than 10 percent annually for much of the preceding decade. Now, as Annie Lowrey of The New York Times explains, it's something of a mystery what is going on here -- is this slowdown just due to the Great Recession, or is it something else? -- but the takeaway is that we have a bit more time than we thought to figure out how to keep bending the cost curve.
There are three, hardly mutually exclusive, endgames when it comes to containing healthcare costs: (1) the cost controls in Obamacare, like IPAB, work; (2) the government uses Medicare's bargaining power to negotiate better prices from doctors and drug-makers; or (3) the government voucherizes Medicare and hopes competition keeps prices down. This last option sounds great -- who doesn't like competition? -- but, as economist Kenneth Arrow famously argued, the healthcare market doesn't work like other markets. "Consumers" -- that is, patients -- don't exactly have the expertise to shop around for the best deal on, say, heart surgery. Nor do they decide what to pay for. Insurers do that. There's little empirical reason to expect big savings out of increased competition -- with plenty of potential downside if the vouchers don't turn out to be generous enough.
This is the debate over Medicare's future, not whether to raise the eligibility age. But it's tomorrow's debate. Today's debate is about what we can do to put people back to work. The former is hard enough without making it a precondition for solving the latter.
Jobs are the only thing that will make the state of our deficit better.
The kerfuffle over Kim Kardashian's drug-promoting Instagram selfie is nothing new: As long as the FDA has existed, it's had to figure out how to regulate drug advertisements in new forms of communication technology.
Last month, celebrity-news and health-policy bloggers had a rare moment of overlap after the Food and Drug Administration issued a warning letter to the pharmaceutical company Duchesnay, which manufactures Diclegis, a prescription-only anti-nausea pill. At stake: a single selfie with a pill bottle.
The image that attracted the censure of the FDA was an Instagram post published on July 20 by Kim Kardashian. The image featured her upper torso, right hand, and face, with a bottle of Diclegis prominently displayed in her grasp. “OMG,” the caption began:
Have you heard about this? As you guys know my #morningsickness has been pretty bad. I tried changing things about my lifestyle and my diet, but nothing helped, so I talked to my doctor. He prescribed me Diclegis, I felt better, and most importantly it’s been studied and there is no increased risk to the baby.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
Demographic data shows that a Briton’s education level may be the strongest indication of how he or she voted.
Britain has voted to leave the European Union. The news surprised many people, including the British, who have learned that while brushing off early statistical warnings is tempting, it doesn’t make it any easier when those warnings turn out to be right. Give yourselves a break, I say: Polls are fickle, anecdote is limited, and prevailing wisdom is sometimes impossible to shake. (Though these remorseful Brexit voters don’t have an excuse.)
There’s a silver lining for statistics, however. With the close of Britain’s referendum, political analysts now have a concrete dataset to examine: the actual vote totals in the United Kingdom. This data, when matched with regional demographic information from the U.K. Census, gives insight into who actually voted to leave or remain.
Patrick Griffin, his chief congressional affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military-grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, partly because it expired in 2004 and partly because the provision was embedded in the larger crime bill.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
Thoughts on the first episode of ESPN’s five-part documentary
Every fall Sunday, when I was a kid, half an hour before the pre-game shows and an hour before the games themselves, I would tune in to the latest offering from NFL Films. This was the pre-pre-game show—an assembly of short films derived from the massive archive of professional football. Steve Sabol, whose father founded NFL Films, would preside. He’d offer an introduction and then throw it to John Facenda or Jefferson Kaye, who would narrate the career highlights of players like Gale Sayers, Earl Campbell, or Dick “Night Train” Lane.
“Highlights” understates what NFL Films was actually doing. The shorts were drawn from some of the most beautifully shot footage in all of sports. It wasn’t unheard of for NFL Films to go high concept—this piece on football and ballet, with cameos from Allen Ginsberg and George Will, may be the definitive example. Great football plays would be injected not with the normal hurrahs, but with poetry. When Facenda, for instance, wanted to introduce a spectacular touchdown run by Marcus Allen, he did so in the omniscient third person: “On came Marcus Allen—running with the night.”
The U.K.’s vote to leave the European Union betrays a failure of empathy and imagination among its leaders. Will America’s political establishment fare any better?
If there is a regnant consensus among the men and women who steer the Western world, it is this: The globe is flattening. Borders are crumbling. Identities are fluid. Commerce and communications form the warp and woof, weaving nations into the tight fabric of a global economy. People are free to pursue opportunity, enriching their new homes culturally and economically. There may be painful dislocations along the way, but the benefits of globalization heavily outweigh its costs. And those who cannot see this, those who would resist it, those who would undo it—they are ignorant of their own interests, bigoted, xenophobic, and backward.
So entrenched is this consensus that, for decades, in most Western democracies, few mainstream political parties have thought to challenge it. They have left it to the politicians on the margins of the left and the right to give voice to such sentiments—and voicing such sentiments relegated politicians to the margins of political life.