The fiscally conservative case to borrow and spend -- and feel good about it
What if borrowing money made you so much richer over the long term that it paid for itself? It's not crazy. Millions of families make such a decision every year when they take on debt to pay for school. Indeed, investing in yourself is a bet that often pays off. But can the same be true for an entire country?
Brad DeLong and Larry Summers say yes. In a provocative new paper, they argue that when the economy is depressed, as it is today, government spending can be a free lunch. It can pay for itself.
It's a fairly simple story. With interest rates at zero, the normal rules do not apply. Government spending can put people back to work and prevent the long-term unemployed from becoming unemployable. This last point is critical. If people are out of work for too long, they lose skills, which makes employers less likely to hire them, which makes them lose even more skills, and so on. Even when the economy fully recovers, these workers will stay on the sidelines.

And it's not just these workers who suffer from being out of work. We all do. High unemployment is a symptom of a collapse in investment, and if we don't make needed investments now, that will put a brake on growth down the line. Economists call these twin menaces -- eroding skills and foregone investment -- hysteresis. If hysteresis sets in, it reduces how much we can do and make in the future. But if spending now can forestall hysteresis, then that spending might be self-financing. In other words, spending now might "cost" us less than not acting.
This doesn't mean that government spending is magic. Often, it's anything but. But this is a special case. DeLong and Summers identify three factors that determine whether fiscal stimulus will pay for itself: 1) how much hysteresis hurts future output, 2) the inflation-adjusted interest rate, and 3) the size of the fiscal multiplier. Let's consider these in turn.
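To see how the three factors fit together, here is a stylized version of the paper's arithmetic -- a sketch with symbols of my own choosing, not necessarily the paper's notation. Let μ be the multiplier, η the share of today's lost output that becomes a permanent hit to future potential, τ the government's tax take as a share of GDP, r the real borrowing rate, and g the economy's real growth rate.

```latex
% A sketch: mu = multiplier, eta = hysteresis shadow, tau = tax share of GDP,
% r = real borrowing rate, g = real growth rate (my labels, not the paper's).
% Spend \Delta G: output rises by mu*\Delta G today, recouping tau*mu*\Delta G
% in taxes, so the net new debt is (1 - mu*tau)*\Delta G. Carrying that debt in
% a growing economy costs (r - g)(1 - mu*tau)*\Delta G per year, while averted
% hysteresis adds eta*mu*\Delta G to output, and tau*eta*mu*\Delta G to revenue,
% every year thereafter. Stimulus pays for itself when the future revenue gain
% covers the carrying cost:
\[
  \tau \eta \mu \;\ge\; (r - g)\,(1 - \mu \tau)
\]
```

Notice what happens when real rates are negative, as they are now: so long as μτ is less than one, the right-hand side goes negative, and even a sliver of averted hysteresis tips spending into self-financing territory.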
THE MONSTER OF HYSTERESIS
Economists know a lot about a lot of things. Hysteresis is not one of them.
Indeed, it's not clear whether long-term unemployment and investment shortfalls really do damage potential growth over the really long term. Maybe hysteresis "only" wounds us for the next 20 years, but not the next 40. Unfortunately, there's reason to fear that this is optimistic. A recent paper by Steven Davis and Till von Wachter finds that workers who are laid off during recessions -- who presumably take longer to find a new job -- take worse hits to their lifetime earnings than workers who are laid off during good times. Lasting unemployment has lasting consequences. That should terrify our policymakers.
The chart below, from DeLong and Summers, shows the unemployment rate versus the percentage of working-age people who are actually working. Any divergence between the two shows how many people have given up on trying to find a job after being out of work for too long. The recent numbers paint a frightening picture.
While quantifying just how much this will hurt our long-term productive capacity is a matter of guesswork, DeLong and Summers show that it doesn't have to be much to justify doing something now -- provided that rock-bottom interest rates super-charge fiscal stimulus.
DeLong and Summers argue that real rates -- that is, rates adjusted for inflation -- don't have to be that low to make more spending a good deal. They calculate that fiscal stimulus remains worthwhile at real rates anywhere between three and seven percent. Inflation-adjusted rates are negative now. But low rates don't only make borrowing cheaper. They might also make government spending more effective.
STIMULUS THAT WORKS: A BLACK SWAN, NOT A UNICORN
Government spending usually doesn't increase growth. Or, as economists put it, "the fiscal multiplier is usually close to zero." The multiplier just refers to how much total spending a dollar of government spending generates. For instance, if the government spends $1 billion and GDP goes up by $1.5 billion, then the multiplier is 1.5. In normal times, the multiplier is close to zero because the Federal Reserve offsets any additional spending: the Fed has its inflation target, and if more government spending pushes up inflation, the Fed neutralizes it by raising interest rates. But with short-term rates hugging zero and inflation falling below target, this calculus might change. The Fed might allow the multiplier to rise above one. And that would certainly make more spending a very good deal.
There are two broad objections to the notion that the fiscal multiplier might be quite high right now. First, just because short-term interest rates are at zero doesn't mean the Fed is out of ammunition. The Fed can still buy long-term bonds -- aka quantitative easing -- or tell markets that it will keep short-term rates low for an extended period. These things matter. If fiscal stimulus precludes the Fed from doing more monetary stimulus, then the apparent multiplier will be misleading. Second, it's hard to find many historical examples of a high fiscal multiplier. Critics like to point out that even during World War II -- when interest rates were also negligible -- the multiplier was no better than during normal times. So, after all of this, does this mean that government spending isn't worth it?
Not so fast. Just because the Fed can use unconventional policy doesn't mean that fiscal stimulus is a waste. Much of the Fed's current strategy involves making (quasi) promises to keep rates low for a long time -- till late 2014, to be exact. It's a very watered-down version of what Paul Krugman called "credibly promising to be irresponsible". The problem, though, is credibility. Markets might not believe the Fed. Actually, they don't. And that means that spending wouldn't be canceled out nearly as much right now. As for past instances of a high multiplier, World War II actually does offer solid evidence. You just need to know when to look. While we were actively fighting in the war, the government imposed private sector rationing. So it's hardly surprising that government spending didn't spur on private spending when the private sector was forbidden from spending. But here's an oft-forgotten fact: we started spending on the war long before we entered the war -- to help arm Great Britain. Those were our "arsenal of democracy" days. More importantly, there was no rationing from 1939-41. Over this period, Robert Gordon and Robert Krenn found that the multiplier was as high as 1.8. That's really, really good.
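Plugging the Gordon-Krenn multiplier into the stylized condition sketched earlier makes the point concrete. The parameter values below are illustrative assumptions on my part (a tax share of roughly a third, a small hysteresis shadow of five percent, trend growth of 2.5 percent), not figures from any of the papers:

```python
# Back-of-the-envelope check of the stylized self-financing condition above.
# All parameter values are illustrative assumptions, not figures from the paper.

def self_financing(mu, eta, tau, r, g):
    """True if stimulus pays for itself: tau*eta*mu >= (r - g) * (1 - mu*tau)."""
    future_revenue_gain = tau * eta * mu           # extra revenue per year from averted hysteresis
    debt_carrying_cost = (r - g) * (1 - mu * tau)  # annual cost of carrying the extra debt
    return future_revenue_gain >= debt_carrying_cost

# Depressed economy: a Gordon-Krenn-style multiplier of 1.8, negative real
# rates, an assumed 5% hysteresis shadow, and a tax share of roughly a third.
print(self_financing(mu=1.8, eta=0.05, tau=0.33, r=-0.01, g=0.025))  # True

# Normal times: the Fed offsets spending (multiplier ~0), real rates positive.
print(self_financing(mu=0.0, eta=0.0, tau=0.33, r=0.05, g=0.025))    # False
```

In the depressed scenario, the hurdle on the right-hand side is actually negative -- with negative real rates, investors are effectively paying the government to borrow -- so the condition holds with room to spare. In the normal-times scenario, with the Fed offsetting and real rates positive, it fails.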
The CliffsNotes version of all of this is that a fiscal multiplier greater than one is not a unicorn. It's more like a black swan. It exists. It's just rare. And this looks like one of those rare times. Taken together with our historically low rates, now seems like a great time to make some investments in ourselves. Putting the long-term unemployed back to work is an investment in their human capital. Refurbishing roads and bridges is an investment in the physical infrastructure we need to keep competing globally. Both make us better off in the long run, and could conceivably pay for themselves. Of course, none of the above means that the Fed can't or shouldn't try to do more. It's more of a practical appraisal about what the Fed will -- and won't -- do.
Usually comparing the government's budget to a family's budget is a bad idea. Governments can borrow for far longer and on far better terms. And, counterfeiters aside, families can't print money. But in this case it's a worthwhile comparison. A family struggling to make ends meet wouldn't be wise to save money by pulling their kids out of college if they can afford tuition. Similarly, governments running massive deficits during a depression wouldn't be wise to embrace austerity if markets will lend to them on favorable terms. In both cases, the long-term damage outweighs any short-term benefit.
Which is to say: When people offer you free money, don't say no.
Defining common cultural literacy for an increasingly diverse nation.
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
A new book by the evolutionary biologist Jerry Coyne tackles arguments that the two institutions are compatible.
In May 1988, a 13-year-old girl named Ashley King was admitted to Phoenix Children’s Hospital by court order. She had a tumor on her leg—an osteogenic sarcoma—that, writes Jerry Coyne in his book Faith Versus Fact, was “larger than a basketball,” and was causing her leg to decay while her body started to shut down. Ashley’s Christian Scientist parents, however, refused to give doctors permission to amputate, and instead moved their daughter to a Christian Science sanatorium, where, in accordance with the tenets of their faith, “there was no medical care, not even pain medication.” Ashley’s mother and father arranged a collective pray-in to help her recover—to no avail. Three weeks later, she died.
Former Senator Jim Webb is the fifth Democrat to enter the race—and by far the most conservative one.
In a different era’s Democratic Party, Jim Webb might be a serious contender for the presidential nomination. He’s a war hero and former Navy secretary, but he has been an outspoken opponent of recent military interventions. He’s a former senator from Virginia, a purple state. He has a strong populist streak, could appeal to working-class white voters, and might even have crossover appeal from his days as a member of the Reagan administration.
In today’s leftward-drifting Democratic Party, however, it’s hard to see Webb—who declared his candidacy Thursday—getting very far. As surprising as Bernie Sanders’s rise in the polls has been, he looks more like the Democratic base than Webb does. The Virginian is progressive on a few major issues, including the military and campaign spending, but he’s closer to the center, or even the right, on others: He’s against affirmative action, supports gun rights, and is a defender of coal. During the George W. Bush administration, Democrats loved to have him as a foil to the White House. It’s hard to imagine the national electorate will cotton to him in the same way. Webb’s statement essentially saying he had no problem with the Confederate battle flag flying in places like the grounds of the South Carolina capitol may have been the final straw. (At 69, he’s also older than Hillary Clinton, whose age has been a topic of debate, though still younger than Bernie Sanders or Joe Biden.)
The Fourth of July—a time we Americans set aside to celebrate our independence and mark the war we waged to achieve it, along with the battles that followed. There was the War of 1812, the War of 1833, the First Ohio-Virginia War, the Three States' War, the First Black Insurrection, the Great War, the Second Black Insurrection, the Atlantic War, the Florida Intervention.
Confused? These are actually conflicts invented for the novel The Disunited States of America by Harry Turtledove, a prolific (and sometimes-pseudonymous) author of alternate histories with a Ph.D. in Byzantine history. The book is set in the 2090s in an alternate United States that is far from united. In fact, the states, having failed to ratify a constitution following the American Revolution, are separate countries that oscillate between cooperating and warring with one another, as in Europe.
The executive producer of Masterpiece says Jane Austen works a lot better on screen than Hemingway does.
For 44 years, PBS’s Masterpiece franchise has brought high-end British TV programs to American audiences. While the ultra-successful Downton Abbey comes from an original screenplay, many of Masterpiece’s shows over the years have been adapted from great works of literature. And the vast majority of those great works of literature, unsurprisingly, have been British.
But every so often, an American novel—like James Agee’s A Death in the Family or Willa Cather’s The Song of the Lark—has been turned into a Masterpiece. On Friday at the Aspen Ideas Festival, Rebecca Eaton, the longtime executive producer of Masterpiece, said she wished that the program had tackled more U.S. authors over the years. “The reasons that we haven’t are twofold,” she said. “One is money, the second is money. And the third is money. Also, the dark nature of American literature, which is something to think about for a moment.”
The meaning of the Confederate flag is best discerned in the words of those who bore it.
This afternoon, in announcing her support for removing the Confederate flag from the capitol grounds, South Carolina Governor Nikki Haley asserted that killer Dylann Roof had “a sick and twisted view of the flag” which did not reflect “the people in our state who respect and in many ways revere it.” If the governor meant that very few of the flag’s supporters believe in mass murder, she is surely right. But on the question of whose view of the Confederate flag is more twisted, she is almost certainly wrong.
Roof’s belief that black life had no purpose beyond subjugation is “sick and twisted” in the exact same manner as the beliefs of those who created the Confederate flag were “sick and twisted.” The Confederate flag is directly tied to the Confederate cause, and the Confederate cause was white supremacy. This claim is not the result of revisionism. It does not require reading between the lines. It is the plain meaning of the words of those who bore the Confederate flag across history. These words must never be forgotten. Over the next few months, the word “heritage” will be repeatedly invoked. It would be derelict not to examine the exact contents of that heritage.
How a re-creation of its most famous battle helped erase the meaning of the Civil War.
"No person should die without seeing this cyclorama," declared a Boston man in 1885. "It's a duty they owe to their country." Paul Philippoteaux's lifelike depiction of the Battle of Gettysburg was much more than a painting. It re-created the battlefield with such painstaking fidelity, and created an illusion so enveloping, that many visitors felt as if they were actually there.
For all its verisimilitude, though, the painting failed to capture the deeper truths of the Civil War. It showed the two armies in lavish detail, but not the clash of ideals that impelled them onto the battlefield. Its stunning rendition of a battle utterly divorced from context appealed to a nation as eager to remember the valor of those who fought as it was to forget the purpose of their fight. Its version of the conflict proved so alluring, in fact, that it changed the way America remembered the Civil War.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.