Whether it's the 1930s or the 2010s, depressions are the only casualties in a currency war
I don't know how it compares to peeing in your bed, as one anonymous senior Fed official put it, but a currency war is one of the surest ways to end a global slump. Despite what you may have heard, it was a big part of what stopped the vicious circle of the Great Depression.
Currency wars are the best type of wars. Nobody dies, and everybody can recover, as long as everybody plays along. Here's how it works. One country devalues its currency -- in other words, prints money -- which, in a time of weak global demand, puts pressure on other countries to do the same, lest they lose out on trade. Then another country devalues, and so on, in a cascade of looser money. It's the invisible hand pushing for expansionary monetary policy when it's needed most.
But there are a few caveats. For one, a currency war only makes sense during a global depression, when short-term interest rates are mostly stuck at zero. It's about boosting monetary stimulus when conventional methods are out of ammo. For another, devaluing forever (à la China) is not a sustainable growth strategy. It might make sense for developing nations to subsidize export industries early on, but, eventually, this will only cause imbalances to build up, while robbing the domestic population of purchasing power. And finally, there's a risk that a currency war could turn into a trade war. In other words, countries will retaliate against expansionary monetary policy not with expansionary monetary policy of their own, but with tariffs. Presumably that's what our silver-tongued senior Fed official was getting at with this head-scratcher of a quote:
Devaluing a currency is like peeing in bed. It feels good at first, but pretty soon it becomes a real mess.
This fear of a currency war begetting a trade war is certainly serious, but it's made to sound more serious thanks to some bad history. Here's the erroneous story you might have heard (especially now that Japan's talk of more aggressive easing has revived fears of a currency war):
After the Great Crash of 1929, countries abandoned the gold standard and devalued their currencies in a beggar-thy-neighbor race to the bottom. This currency war turned into a trade war, with countries eventually resorting to tariffs and counter-tariffs as they tried to grab hold of an ever-shrinking pie of demand. The consequent collapse in world trade is what made the Great Depression so great, and set the stage for the trade war to turn into an actual one.
Scary stuff. But not quite true. The reality is that the trade war started before the currency war, and the latter jump-started recovery wherever it was tried. The infamous Smoot-Hawley tariff in the U.S., the first salvo in the trade war to come, was actually passed in June 1930, more than a full year before any country devalued its currency. It wasn't until September 1931 that Britain abandoned the gold standard ... and that's when things get a bit complicated. It's hard to accuse Britain of "competitive" devaluation here, because it had no choice but to devalue; it had simply run out of gold. Nonetheless, other countries responded to Britain's increased competitiveness by raising their trade barriers; in this case, the currency war, such as it was, did exacerbate the ongoing trade war, as Gavyn Davies of the Financial Times points out.
But then a funny thing happened. The punishment for Britain's economic weakness was a recovery. Ditching gold gave Britain (and everybody else who did so) the freedom to pursue more aggressive monetary and fiscal policies than the "rules of the game" of the gold standard had allowed.* As you can see in the chart below (via Brad DeLong) from Barry Eichengreen's magisterial work on the depression, Golden Fetters, recovery followed devaluation everywhere. There was no reward for financial orthodoxy in the 1930s. The countries that stayed with the gold standard the longest, the so-called Gold Bloc of France, Belgium, and Poland, were the last to begin growing again. In other words, the currency war didn't deepen the depression; it ended it.
And that brings us to one last, stupid question. How did beggar-thy-neighbor policies kickstart growth even after world trade had already collapsed? In other words, how did stealing a trade advantage help so much when there wasn't much trade to steal? Well, it's not entirely, or even mostly, about stealing trade. Indeed, as Scott Sumner points out, the U.S. trade balance actually worsened in 1933 after FDR took us off gold, even as the economy quickly reversed its death-spiral and began a virtuous cycle. It's easiest to frame devaluation as grabbing demand from abroad, but it's really about increasing demand at home. Devaluation means printing money, and more money during a liquidity trap means more demand, period. It also allows more stimulus spending than a fixed-exchange rate system (like the gold standard) would. The next time you hear someone lamenting the "destructive devaluations that followed the Great Depression," remember to ask them -- what was so destructive about ending the most destructive depression in modern history?
The only thing we have to fear is fear of currency wars itself. Depressions are the only casualties in these kinds of conflicts.
* There were two exceptions. The gold standard did not constrain looser monetary policy in the U.S. and France in the early years of the depression, as both had more than enough gold to back more credit growth, but chose instead to sterilize their gold inflows out of fear of nonexistent inflation in the face of actual deflation. This stockpiling drained everybody else of gold, and consequently made staying on the gold standard impossible for them. Even the U.S. and France eventually had to abandon it to reverse years of deflation.
With Donald Trump its presumptive nominee after his win in the Indiana primary, the GOP will never be the same.
NEW YORK—Where were you the night Donald Trump killed the Republican Party as we knew it? Trump was right where he belonged: in the gilt-draped skyscraper with his name on it, Trump Tower in Manhattan, basking in the glory of his final, definitive victory.
“I have to tell you, I’ve competed all my life,” Trump said, his golden face somber, his gravity-defying pouf of hair seeming to hover above his brow. “All my life I’ve been in different competitions—in sports, or in business, or now, for 10 months, in politics. I have met some of the most incredible competitors that I’ve ever competed against right here in the Republican Party.”
The combined might of the Republican Party’s best and brightest—16 of them at the outset—proved, in the end, helpless against Trump’s unorthodox, muscular appeal to the party’s voting base. With his sweeping, 16-point victory in Tuesday’s Indiana primary, and the surrender of his major remaining rival, Ted Cruz, Trump was pronounced the presumptive nominee by the chair of the Republican National Committee. The primary was over—but for the GOP, the reckoning was only beginning.
Given her general election opponent, she has a historic opportunity to unite a grand, cross-party coalition.
The Republicans have made their choice. Now the Democrats’ likely nominee faces a dilemma of her own: Run as a centrist and try to pile up a huge majority—at the risk of enraging Sanders voters? Or continue the left turn she’s executed through these primaries and preserve Democratic Party unity—at the risk of pushing Trump-averse Republicans back to The Donald as the lesser evil?
The imminent Trump nomination threatens to rip the Republican Party into three parts. Trump repels both the most conservative Republicans and the most moderate: both socially conservative regular church attenders and pro-Kasich affluent suburbanites, especially women. The most conservative Republicans won’t ever vote for Hillary Clinton, of course. But they might be induced to stay home—if Clinton does not scare them into rallying to Trump. The most moderate Republicans might well vote across party lines—if Clinton can convince them that she’s the more responsible steward and manager.
The odds of defeating the billionaire depend in part on whether Americans who oppose him do what’s effective—or what feels emotionally satisfying.
Tens of millions of Americans want to deny Donald Trump the presidency. How best to do it? Many who oppose the billionaire will be tempted to echo Bret Stephens: “If by now you don’t find Donald Trump appalling,” the Wall Street Journal columnist told the Republican frontrunner’s supporters, “you’re appalling.”
Some will be tempted to respond like anti-Trump protesters in Costa Mesa, California. Violent elements in that crowd threw rocks at a passing pickup truck, smashed the window of a police cruiser, and bloodied at least one Trump supporter. Others in the crowd waved Mexican flags. “I knew this was going to happen,” a 19-year-old told the L.A. Times. “It was going to be a riot. He deserves what he gets.”
By handcuffing a new series to its online-only service, the network is trying to catch the next wave of the television industry.
What’s the easiest way to tell that we’re in the midst of a television programming revolution? Just look at what the networks, the dinosaurs of the industry, are doing to keep up. On Tuesday, CBS detailed its plans for its prospective Netflix competitor “CBS All Access,” a monthly subscription-based online service that will use a new Star Trek show to try to reel in viewers. But where Netflix’s strategy is to become a vast repository of original content, dumping whole seasons of original shows at a time for people to sample at their leisure, CBS is trying to hold onto the weekly model that has defined broadcast strategy for decades. That compromise is currently untested, but it could be the future of the medium.
What jargon says about armies, and the societies they serve
JERUSALEM—“We have two flowers and one oleander. We need a thistle.” When I was an infantryman listening to the Israeli military frequencies nearly two decades ago, it was (and still is) possible to hear sentences like these, the bewildering cousins of sentences familiar to anyone following America’s present-day wars. “Vegas is in a TIC,” says a U.S. infantryman in Afghanistan in Sebastian Junger’s book War. What does it all mean?
Anyone seeking to understand the world needs to understand soldiers, but the language of soldiers tends to be bizarre and opaque, an apt symbol for the impossibility of communicating their experiences to people safe at home. The language isn’t nonsense—it means something to the soldiers, of course, but it also has something to say about the army and society to which they belong, and about the shared experience of military service anywhere. The soldiers’ vernacular must provide words for things that civilians don’t need to describe, like grades of officers and kinds of weapons. But it has deeper purposes too.
A new study suggests teens who vow to be sexually abstinent until marriage—and then break that vow—are more likely to wind up pregnant than those who never took the pledge to begin with.
Teen birth and pregnancy rates have been in a free fall, and there are a few commonly held explanations why. One is that more teens are using the morning-after pill and long-acting reversible contraceptives, or LARCs. The economy might have played a role, since the decline in teen births accelerated during the recession. Finally, only 44 percent of unmarried teen girls now say they’ve had sex, down from 51 percent in 1988.
Teens are having less sex, and that’s good news for pregnancy- and STD-prevention. But paradoxically, while it’s good for teens not to have sex, new research suggests it might be bad for them to promise not to.
As of 2002, about one in eight teens, or 12 percent, pledged to be sexually abstinent until marriage. Some studies have found that taking virginity pledges does indeed lead teens to delay sex and have fewer overall sex partners. But since just 3 percent of Americans wait until marriage to have sex, the majority of these “pledge takers” become “pledge breakers,” as Anthony Paik, a sociologist at the University of Massachusetts–Amherst, explains in his new study, which was published in the Journal of Marriage and Family.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
A new study shows that we burn many more daily calories than other apes.
Evolution works on a strict energy budget. Each adaptation burns through a certain number of calories, and each individual can only acquire so many calories in the course of a day. You can’t have flapping wings and a huge body and venom and fast legs and a big brain. If you want to expand some departments, you need to make cuts in others. That’s why, for example, animals that reproduce faster tend to die earlier. They divert energy towards making new bodies, and away from maintaining their own.
But humans, on the face of it, are exceptional. Compared to other apes, we reproduce more often (or, at least, those of us in traditional societies do) and our babies are bigger when they’re born and we live longer. And, as if to show off, our brains are much larger, and these huge organs sap some 20 percent of our total energy.
Rampant drug use in Austin, Indiana—coupled with unemployment and poor living conditions—brought on a public-health crisis that some are calling a “syndemic.”
Jessica and Darren McIntosh were too busy to see me when I arrived at their house one Sunday morning. When I returned later, I learned what they’d been busy with: arguing with a family member, also an addict, about a single pill of prescription painkiller she’d lost, and injecting meth to get by in its absence. Jessica, 30, and Darren, 24, were children when they started using drugs. Darren smoked his first joint when he was 12 and quickly moved on to snorting pills. “By the time I was 13, I was a full-blown pill addict, and I have been ever since,” he said. By age 14, he’d quit school. When I asked where his caregivers were when he started using drugs, he laughed. “They’re the ones that was giving them to me,” he alleged. “They’re pill addicts, too.”
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.