Austerians have had their worst week since the last time GDP numbers came out for a country that's tried austerity.
But this time is, well, different. It's not "just" that southern Europe is stuck in a depression and Britain is stuck in a no-growth trap. It's that the very intellectual foundations of austerity are unraveling. In other words, economists are finding out that austerity doesn't work in practice or in theory.
What a difference an Excel coding error makes.
Austerity has been a policy in search of a justification ever since it began in 2010. Back then, policymakers decided it was time for policy to go back to "normal" even though the economy hadn't, because deficits just felt too big. The only thing they needed was a theory telling them why what they were doing made sense. Of course, this wasn't easy when unemployment was still high, and interest rates couldn't go any lower. Alberto Alesina and Silvia Ardagna took the first stab at it, arguing that reducing deficits would increase confidence and growth in the short run. But this had the defect of being demonstrably untrue (in addition to being based on a naïve reading of the data). Countries that tried to aggressively cut their deficits amidst their slumps didn't recover; they fell into even deeper slumps.
Enter Carmen Reinhart and Ken Rogoff. They gave austerity a new raison d'être by shifting the debate from the short run to the long run. Reinhart and Rogoff acknowledged that austerity would hurt today, but argued it would help tomorrow, if it kept governments from racking up debt of 90 percent of GDP, the point at which growth supposedly slows dramatically. Now, this result was never more than just a correlation, and slow growth more likely causes high debt than the reverse, but that didn't stop policymakers from imputing totemic significance to it. That is, it became a "fact" that everybody who mattered knew was true.
Except it wasn't. Reinhart and Rogoff goofed. They accidentally excluded some data in one case, and used some wrong data in another; the former because of an Excel snafu. If you correct for these very basic errors, their correlation gets even weaker, and the growth tipping point at 90 percent of GDP disappears. In other words, there's no there there anymore.
Austerity is back to being a policy without a justification. Not only that, but, as Paul Krugman points out, Reinhart and Rogoff's spreadsheet misadventure has been a kind of the-austerians-have-no-clothes moment. It's been enough to turn even some rather unusual suspects against cutting deficits now. For one, Stanford professor John Taylor claims l'affaire Excel is why the G20, the birthplace of the global austerity movement in 2010, was more muted on fiscal targets recently.
The discovery of errors in the Reinhart-Rogoff paper on the growth-debt nexus is already impacting policy. A participant in last Friday's G20 meetings told me that the error was a factor in the decision to omit specific deficit or debt-to-GDP targets in the G20 communique.
Bill Gross, the bond-fund manager, agrees:

The UK and almost all of Europe have erred in terms of believing that austerity, fiscal austerity in the short term, is the way to produce real growth. It is not. You've got to spend money. Bond investors want growth much like equity investors, and to the extent that too much austerity leads to recession or stagnation then credit spreads widen out -- even if a country can print its own currency and write its own checks. In the long term it is important to be fiscal and austere. It is important to have a relatively average or low rate of debt to GDP. The question in terms of the long term and the short term is how quickly to do it.
Growth vigilantes are the new bond vigilantes. Gross thinks the boom, not the slump, is the time for austerity -- which sounds an awful lot like you-know-who.
The austerity fever has even broken in Europe. At least a bit. Now, eurocrats can't say that austerity has been anything other than the best of all economic policies, but they can loosen the fiscal noose. And that's what they might be doing, by giving countries more time and latitude to hit their deficit targets. Here's how European Commission president José Manuel Barroso framed the issue on Monday:
While [austerity] is fundamentally right, I think it has reached its limits in many aspects. A policy to be successful not only has to be properly designed. It has to have the minimum of political and social support.
That's not much, but it's still much better than the growth-through-austerity plan Eurogroup president Jeroen Dijsselbloem was peddling on ... Saturday.
Now, Reinhart and Rogoff's Excel imbroglio hasn't exactly set off a new Keynesian moment. Governments aren't going to suddenly take advantage of zero interest rates to start spending more to put people back to work. Stimulus is still a four-letter word. Indeed, the euro zone, Britain, and, to a lesser extent, the United States, are still focused on reducing deficits above all else. But there's a greater recognition that trying to cut deficits isn't enough to cut debt burdens. You need growth too. In other words, people are remembering that there's a denominator in the debt-to-GDP ratio.
But austerity doesn't just have a math problem. It has an image problem too. Just a week ago, Reinhart and Rogoff's work was the one commandment of austerity: Thou shalt not run up debt in excess of 90 percent of GDP. Wisdom didn't get more conventional. Why did this matter? Well, as Keynes famously observed, it's better for reputation to fail conventionally than to succeed unconventionally. In other words, elites were happy to pursue obviously failed policies as long as they were the right failed policies.
But now austerity doesn't look so conventional. It looks like the punchline of a bad joke about Excel destroying the global economy. Maybe, just maybe, that will be enough to free us from some defunct economics.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
The results of the referendum are, in theory, not legally binding.
Lest we think the Euroskepticism displayed this week by British voters is new, let me present a scene from the BBC’s Yes, Minister, a comedy about the U.K. civil service’s relationship with a minister. The series ran from 1980 to ’84 (and, yes, it was funny), at a time when the European Union was a mere glint in its founders’ eyes.
The Europe being referred to in the scene is the European Economic Community (EEC), an eventually 12-member bloc established in 1957 to bring about greater economic integration among its members.
In many ways, the seeds of the U.K.’s Thursday referendum on its membership in the European Union were sown soon after the country joined the now-defunct EEC in 1973. Then, as now, the ruling Conservative Party and opposition Labour, along with the rest of the country, were deeply divided over the issue. In the run-up to the general election the following year, Labour promised in its manifesto to put the U.K.’s EEC membership to a public referendum. Labour eventually came to power and Parliament passed the Referendum Act in 1975, fulfilling that campaign promise. The vote was held on June 5, 1975, and the result was what the political establishment had hoped for: an overwhelming 67 percent of voters supported the country’s EEC membership.
The city is riding high after the NBA Finals. But with the GOP convention looming, residents are bracing for disappointment.
Cleveland’s in a weird mood.
My son and I attended the Indians game on Father’s Day, the afternoon before game seven of the NBA Finals—which, in retrospect, now seems like it should be blockbustered simply as The Afternoon Before—when the Cavaliers would take on the Golden State Warriors and bring the city its first major-league sports championship in 52 years.
I am 52 years old. I’ve lived in Northeast Ohio all my life. I know what Cleveland feels like. And it’s not this.
In the ballpark that day, 25,269 of us sat watching a pitcher’s duel, and the place was palpably subdued. The announcer and digitized big-screen signage made no acknowledgement of the city’s excitement over the Cavaliers. There were no chants of “Let’s Go Cavs,” no special seventh-inning-stretch cheer for the Indians’ basketball brothers, who play next door in the Quicken Loans Arena, which in a few weeks will host the Republican National Convention.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The Republican candidate is deeply unpopular, and his Democratic rival is promoting her own version of American nationalism.
American commentators have spent the weekend pondering the similarities between Britain's vote to leave the European Union and America's impending vote on whether to take leave of its senses by electing Donald Trump. The similarities have been well-rehearsed: The supporters of Brexit, like the supporters of Trump, are older, non-college-educated, non-urban, distrustful of elites, xenophobic, and nostalgic. Moreover, many British commentators discounted polls showing that Brexit might win, just as many American commentators, myself very much included, discounted polls showing that Trump might win the Republican nomination. Brexit may even result in the installation this fall of a new British prime minister, Boris Johnson, who is entertaining, self-promoting, vaguely racist, doughy, and orange. It's all too familiar.
A hotly contested, supposedly ancient manuscript suggests Christ was married. But believing its origin story—a real-life Da Vinci Code, involving a Harvard professor, a onetime Florida pornographer, and an escape from East Germany—requires a big leap of faith.
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.
Why professors, librarians, and politicians are shunning the liberal arts in the name of STEM
I have been going to academic conferences since I was about 12 years old. Not that I am any sort of prodigy—both of my parents are, or were at one point, academics, so I was casually brought along for the ride. I spent the bulk of my time at these conferences in hotel lobbies, transfixed by my Game Boy, waiting for my mother to be done and for it to be dinnertime. As with many things that I was made to do as a child, however, I eventually came to see academic conferences as an integral part of my adult life.
So it was that, last year, I found myself hanging out at the hotel bar at the annual conference of the Modern Language Association, despite the fact that I am not directly involved with academia in any meaningful way. As I sipped my old fashioned, I listened to a conversation between several aging literature professors about the “digital humanities,” which, as far as I could tell, was a needlessly jargonized term for computers in libraries and writing on the Internet. The digital humanities were very “in” at MLA that year. They had the potential, said a white-haired man in a tweed jacket, to modernize and reinvigorate humanistic scholarship, something that all involved seemed to agree was necessary. The bespectacled scholars nodded their heads with solemn understanding, speaking in hushed tones about how they wouldn’t be making any new tenure-track hires that year.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
Patrick Griffin, his chief congressional affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military-grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, partly because it expired in 2004 and partly because the provision was embedded in the larger crime bill.