The fiscally conservative case to borrow and spend -- and feel good about it
What if borrowing money made you so much richer over the long term that it paid for itself? It's not crazy. Millions of families make such a decision every year when they take on debt to pay for school. Indeed, investing in yourself is a bet that often pays off. But can the same be true for an entire country?
Brad DeLong and Larry Summers say yes. In a provocative new paper, they argue that when the economy is depressed like today, government spending can be a free lunch. It can pay for itself.
It's a fairly simple story. With interest rates at zero, the normal rules do not apply. Government spending can put people back to work and prevent the long-term unemployed from becoming unemployable. This last point is critical. If people are out of work for too long, they lose skills, which makes employers less likely to hire them, which makes them lose even more skills, and so on. Even when the economy fully recovers, these workers will stay on the sidelines.

It's not just these workers who suffer from being out of work. We all do. High unemployment is a symptom of a collapse in investment. If we don't make needed investments now, that will put a brake on growth down the line. Together, economists call these twin menaces hysteresis. And if it sets in, it reduces how much we can do and make in the future.

If spending now can forestall hysteresis, then this spending might be self-financing. In other words, spending now might "cost" us less than not acting.
This doesn't mean that government spending is magic. Often, it's anything but. But this is a special case. DeLong and Summers identify three factors that determine whether fiscal stimulus will pay for itself: 1) how much hysteresis hurts future output, 2) the inflation-adjusted interest rate, and 3) the size of the fiscal multiplier. Let's consider these in turn.
THE MONSTER OF HYSTERESIS
Economists know a lot about a lot of things. Hysteresis is not one of them.
Indeed, it's not clear whether long-term unemployment and investment shortfalls really do damage potential growth over the really long term. Maybe hysteresis "only" wounds us for the next 20 years, but not the next 40. Unfortunately, there's reason to fear that this is optimistic. A recent paper by Steven Davis and Till von Wachter finds that workers who are laid off during recessions -- who presumably take longer to find a new job -- take worse hits to their lifetime earnings than do workers who are laid off during good times. Lasting unemployment has lasting consequences. That should terrify our policymakers.
The chart below, from DeLong and Summers, shows the unemployment rate versus the percentage of working-age people who are actually working. Any divergence between the two shows how many people have given up on trying to find a job after being out of work for too long. The recent numbers paint a frightening picture.
While quantifying just how much this will hurt our long-term productive capacity is a matter of guesswork, DeLong and Summers show that it doesn't have to be much to justify doing something now -- provided that rock bottom interest rates super-charge fiscal stimulus.
DeLong and Summers argue that real rates -- that is, adjusted for inflation -- don't have to be that low to make more spending a good deal. They calculate that real rates of anywhere between three and seven percent make fiscal stimulus worthwhile. Inflation-adjusted rates are negative now. But low rates don't only make borrowing cheaper. They might also make government spending more effective.
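The three factors -- hysteresis, real rates, and the multiplier -- fit together in a break-even calculation. Here is a stylized sketch of that logic in Python; the formula is a rough paraphrase of the condition in the DeLong-Summers paper, and the numbers plugged in are purely illustrative, not the paper's calibration:

```python
def stimulus_self_financing(mu, eta, tau, r, g):
    """Stylized DeLong-Summers break-even test.

    mu:  fiscal multiplier (extra output per dollar of stimulus)
    eta: hysteresis share (fraction of today's lost output that
         would have become a permanent loss)
    tau: tax share of output
    r:   real interest rate on government debt
    g:   trend growth rate of the economy

    Each dollar of stimulus raises net debt by (1 - mu * tau), since
    higher output claws back some revenue. Stimulus pays for itself
    when the annual tax revenue from avoided hysteresis (tau * eta * mu)
    at least covers the real debt service on that net debt.
    """
    annual_debt_service = (r - g) * (1 - mu * tau)
    annual_hysteresis_revenue = tau * eta * mu
    return annual_hysteresis_revenue >= annual_debt_service

# Illustrative values: multiplier 1.0, hysteresis share 5%,
# tax share 33%, real rate equal to trend growth at 2.5%.
print(stimulus_self_financing(mu=1.0, eta=0.05, tau=0.33, r=0.025, g=0.025))
```

Note what the sketch implies: when real rates are at or below the growth rate, as they are now, even a small hysteresis effect tips the balance toward acting.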
STIMULUS THAT WORKS: A BLACK SWAN, NOT A UNICORN
Government spending usually doesn't increase growth. Or, as economists put it, "the fiscal multiplier is usually close to zero." The multiplier just refers to how much total spending a dollar of government spending generates. For instance, if the government spends $1 billion and GDP goes up by $1.5 billion, then the multiplier would be 1.5. In normal times, the multiplier is close to zero, because the Federal Reserve offsets any additional spending. The Fed has its inflation target, and if more government spending pushes up inflation, then the Fed neutralizes it by raising interest rates. But with short-term rates hugging zero and inflation falling below target, this calculus might change. The Fed might allow the multiplier to be greater than one. And that would certainly make more spending a very good deal.
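The multiplier arithmetic above takes only a couple of lines to check:

```python
def fiscal_multiplier(delta_gdp, delta_spending):
    """The change in total output per dollar of government spending."""
    return delta_gdp / delta_spending

# The example above: $1 billion of spending, $1.5 billion of GDP growth.
print(fiscal_multiplier(delta_gdp=1.5e9, delta_spending=1.0e9))  # 1.5
```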
There are two broad objections to the notion that the fiscal multiplier might be quite high right now. First, just because short-term interest rates are at zero doesn't mean the Fed is out of ammunition. The Fed can still buy long-term bonds -- aka quantitative easing -- or tell markets that it will keep short-term rates low for an extended period. These things matter. If fiscal stimulus precludes the Fed from doing more monetary stimulus, then the apparent multiplier will be misleading. Second, it's hard to find many historical examples of a high fiscal multiplier. Critics like to point out that even during World War II -- when interest rates were also negligible -- the multiplier was no better than during normal times. So, after all of this, does this mean that government spending isn't worth it?
Not so fast. Just because the Fed can use unconventional policy doesn't mean that fiscal stimulus is a waste. Much of the Fed's current strategy involves making (quasi) promises to keep rates low for a long time -- till late 2014, to be exact. It's a very watered-down version of what Paul Krugman called "credibly promising to be irresponsible." The problem, though, is credibility. Markets might not believe the Fed. Actually, they don't. And that means that spending wouldn't be canceled out nearly as much right now. As for past instances of a high multiplier, World War II actually does offer solid evidence. You just need to know when to look. While we were actively fighting in the war, the government imposed private-sector rationing. So it's hardly surprising that government spending didn't spur on private spending when the private sector was forbidden from spending. But here's an oft-forgotten fact: we started spending on the war long before we entered it -- to help arm Great Britain. Those were our "arsenal of democracy" days. More importantly, there was no rationing from 1939 to 1941. Over this period, Robert Gordon and Robert Krenn found that the multiplier was as high as 1.8. That's really, really good.
The CliffsNotes version of all of this is that a fiscal multiplier greater than one is not a unicorn. It's more like a black swan. It exists. It's just rare. And this looks like one of those rare times. Combine that with our historically low rates, and now seems like a great time to make some investments in ourselves. Putting the long-term unemployed back to work is an investment in their human capital. Refurbishing roads and bridges is an investment in the physical infrastructure we need to keep competing globally. Both make us better off in the long run, and both could conceivably pay for themselves. Of course, none of the above means that the Fed can't or shouldn't try to do more. It's more of a practical appraisal of what the Fed will -- and won't -- do.
Usually, comparing the government's budget to a family's budget is a bad idea. Governments can borrow for far longer and on far better terms. And, counterfeiters aside, families can't print money. But in this case it's a worthwhile comparison. A family struggling to make ends meet wouldn't be wise to save money by pulling their kids out of college if they can afford the tuition. Similarly, governments running massive deficits during a depression wouldn't be wise to embrace austerity if markets will lend to them on favorable terms. In both cases, the long-term damage outweighs any short-term benefit.
Which is to say: When people offer you free money, don't say no.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
Updated on December 2 at 7:49 p.m.
It’s hardly remembered now, having been overshadowed a few months later by September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed, and the American plane was forced to land; its crew was detained for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call to the president of Taiwan, Tsai Ing-wen. It’s a sharp breach with protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
A hotly contested, supposedly ancient manuscript suggests Christ was married. But believing its origin story—a real-life Da Vinci Code, involving a Harvard professor, a onetime Florida pornographer, and an escape from East Germany—requires a big leap of faith.
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.
The Daily Show host was measured, respectful, and challenging in his 26-minute conversation with TheBlaze pundit Tomi Lahren.
Tomi Lahren, the 24-year-old host of Tomi on the conservative cable network TheBlaze, feels like a pundit created by a computer algorithm, someone who primarily exists to say something provocative enough to jump to the top of a Facebook feed. She’s called the Black Lives Matter movement “the new KKK,” partly blamed the 2015 Chattanooga shootings on President Obama’s “Muslim sensitivity,” and declared Colin Kaepernick a “whiny, indulgent, attention-seeking cry-baby.” At a time when such charged political rhetoric feels increasingly like the norm, Lahren stands at one end of a widening gulf—which made her appearance on The Daily Show with Trevor Noah Wednesday night all the more fascinating.
In his first year at The Daily Show, Noah has struggled to distinguish himself in an outrage-driven late-night universe. He has sometimes seemed too flip about the failures of the country’s news media, something his predecessor Jon Stewart made a perennial target. Noah’s 26-minute conversation with Lahren, though, posted in its entirety online, set the kind of tone that Stewart frequently called for throughout his tenure. The segment never turned into a screaming match, but it also avoided platitudes and small talk. Lahren was unapologetic about her online bombast and leaned into arguments that drew gasps and boos from Noah’s audience, but the host remained steadfastly evenhanded throughout. If Noah was looking for a specific episode that would help him break out in his crowded field, he may have finally found it.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
A single dose of magic mushrooms can make people with severe anxiety and depression better for months, according to a landmark pair of new studies.
The doom hung like an anvil over her head. In 2012, a few years after Carol Vincent was diagnosed with non-Hodgkin lymphoma, she was waiting to see whether her cancer would progress enough to require chemotherapy or radiation. The disease had already done a number on her, inflating lymph nodes on her chin, collarbones, and groin. She battled her symptoms while running her own marketing business. To top it all off, she was going through menopause.
“Life is just pointless stress, and then you die,” she thought. “All I’m doing is sitting here waiting for all this shit to happen.”
One day at an intersection, when she found herself mulling whether it would be so bad to get hit by a car, she realized her mental health was almost as depleted as her physical state.
Hallucinogens may help people break free of destructive thoughts and addiction. Can a “mystical experience” be had legally?
TOWSON, Maryland—Kathleen Conneally had smoked since she was 12, but one day in the spring of 2013, that changed in an instant. Conneally arrived at a lab in Baltimore that looked more like a cozy living room, with a cream-colored couch and paintings of mountains on the walls. She took a pill from a golden goblet and popped it in her mouth. Under the watch of a pair of trained guides, she began to see wild colors, shapes, and ideas. She began, for lack of a better term, to trip.
Conneally was a participant in an addiction study conducted by researchers at Johns Hopkins University, who wanted to determine whether the relentless pull of nicotine could be weakened by another drug: psilocybin—the active compound in magic mushrooms.
At the time of this writing, the Powerball jackpot is up to $1.5 billion. The cash grand prize is estimated at $930 million.
In a Powerball draw, five white balls are drawn from a drum with 69 balls and one red ball is drawn from a drum with 26 balls. If you match all six numbers, you win the jackpot. If you match only some of the numbers, you win a smaller fixed prize.
At $2 for each ticket, then, it would be possible to buy every possible ticket for $584,402,676. As a journalist, I don’t have that much money sitting around, but either a consortium of a few million Americans or a large and wealthy institution like a bank could conceivably assemble that level of cash.
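The ticket count behind that figure is straightforward combinatorics: the number of ways to choose 5 white balls from 69, times the 26 possible red balls. A quick check in Python:

```python
from math import comb

# Five white balls drawn from 69 (order doesn't matter),
# times one red ball drawn from 26.
combinations = comb(69, 5) * 26
print(combinations)       # 292201338 possible tickets
print(2 * combinations)   # 584402676 dollars to buy every ticket at $2 each
```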
A few weeks ago, I was trying to call Cuba. I got an error message—which, okay, international telephone codes are long and my fingers are clumsy—but the phone oddly started dialing again before I could hang up. A voice answered. It had a British accent and it was reading: “...the moon was shining brightly. The Martians had taken away the excavating-machine…”
Apparently, I had somehow called into an audiobook of The War of the Worlds. Suspicious of my clumsy fingers, I double-checked the number. It was correct (weird), but I tried the number again, figuring that at worst, I’d learn what happened after the Martians took away the excavating machine. This time, I got the initial error message and the call disconnected. No Martians.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Editor’s note: An earlier version of this story presented an economic modeling assumption—the .01 chance of human extinction per year—as a vetted scholarly estimate. Following a correction from the Global Priorities Project, the text below has been updated.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. A new report from the U.K.-based Global Challenges Foundation urges us to take them seriously.
The nonprofit began its annual report on “global catastrophic risk” with a startling provocation: If figures often used to compute human extinction risk are correct, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
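To see how a comparison like that is computed, here is a purely illustrative sketch. Both probabilities below are placeholder assumptions for the sake of the arithmetic, not figures from the report (as the editor's note above explains, the commonly cited extinction inputs are contested): a constant annual extinction probability compounds over a lifetime, and the result is set against an assumed lifetime car-crash risk.

```python
# Placeholder assumptions, chosen only to illustrate the arithmetic.
ANNUAL_EXTINCTION_PROB = 0.001    # assumed 0.1% chance per year
LIFESPAN_YEARS = 80               # assumed lifespan
LIFETIME_CAR_CRASH_PROB = 0.009   # assumed roughly 1-in-110 lifetime odds

# Probability of living through at least one extinction event:
# one minus the probability that no such event happens in any year.
lifetime_extinction_prob = 1 - (1 - ANNUAL_EXTINCTION_PROB) ** LIFESPAN_YEARS

print(lifetime_extinction_prob)                            # about 0.077
print(lifetime_extinction_prob / LIFETIME_CAR_CRASH_PROB)  # the headline ratio
```

Under these made-up inputs the ratio comes out above five; the real dispute, as the correction notes, is over which annual probability is defensible.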