Whether it's the 1930s or the 2010s, depressions are the only casualties in a currency war
I don't know how it compares to peeing in your bed, as one anonymous senior Fed official put it, but a currency war is one of the surest ways to end a global slump. Despite what you may have heard, it was a big part of what stopped the vicious circle of the Great Depression.
Currency wars are the best type of wars. Nobody dies, and everybody can recover, as long as everybody plays along. Here's how it works. One country devalues its currency -- in other words, prints money -- which, in a time of weak global demand, puts pressure on other countries to do the same, lest they lose out on trade. Then another country devalues, and so on, in a cascade of looser money. It's the invisible hand pushing for expansionary monetary policy when it's needed most.
But there are a few caveats. For one, a currency war only makes sense during a global depression when short-term interest rates are mostly stuck at zero. It's about boosting monetary stimulus when conventional methods are out of ammo. For another, devaluing forever (à la China) is not a sustainable growth strategy. It might make sense for developing nations to subsidize export industries early on, but, eventually, this will only cause imbalances to build up, while robbing the domestic population of purchasing power. And finally, there's a risk that a currency war could turn into a trade war. In other words, countries will retaliate to expansionary monetary policy not with expansionary monetary policy of their own, but with tariffs. Presumably that's what our silver-tongued senior Fed official was getting at with this head-scratcher of a quote:
Devaluing a currency is like peeing in bed. It feels good at first, but pretty soon it becomes a real mess.
This fear of a currency war begetting a trade war is certainly serious, but it's made to sound more serious thanks to some bad history. Here's the erroneous story you might have heard (especially now that Japan's talk of more aggressive easing has revived fears of a currency war):
After the Great Crash of 1929, countries abandoned the gold standard and devalued their currencies in a beggar-thy-neighbor battle to the bottom. This currency war turned into a trade war, with countries eventually resorting to tariffs and counter-tariffs, as they tried to grab hold of an ever-shrinking pie of demand. The consequent collapse in world trade is what made the Great Depression so great, and set the stage for the trade war to turn into an actual one.
Scary stuff. But not quite true. The reality is the trade war started before the currency war, and the latter jump-started recovery wherever it was tried. The infamous Smoot-Hawley tariff in the U.S., the first salvo in the trade war to come, was actually passed in June 1930, more than a full year before any country devalued its currency. It wasn't until September 1931 that Britain abandoned the gold standard ... and that's where things get a bit complicated. It's hard to accuse Britain of "competitive" devaluation here, because it had no choice but to devalue; it had simply run out of gold. Nonetheless, other countries responded to Britain's increased competitiveness by increasing their trade barriers; in this case, the currency war, such as it was, did exacerbate the ongoing trade war, as Gavyn Davies of the Financial Times points out.
But then a funny thing happened. The punishment for Britain's economic weakness was a recovery. Ditching gold gave Britain (and everybody else who did so) the freedom to pursue more aggressive monetary and fiscal policies than the "rules of the game" of the gold standard had allowed.* As you can see in the chart below (via Brad DeLong) from Barry Eichengreen's magisterial work on the depression, Golden Fetters, recovery followed devaluation everywhere. There was no reward for financial orthodoxy in the 1930s. The countries that stayed with the gold standard the longest, the so-called Gold Bloc of France, Belgium, and Poland, were the last to begin growing again. In other words, the currency war didn't deepen the depression; it ended it.
And that brings us to one last, stupid question. How did beggar-thy-neighbor policies kickstart growth even after world trade had already collapsed? In other words, how did stealing a trade advantage help so much when there wasn't much trade to steal? Well, it's not entirely, or even mostly, about stealing trade. Indeed, as Scott Sumner points out, the U.S. trade balance actually worsened in 1933 after FDR took us off gold, even as the economy quickly reversed its death-spiral and began a virtuous cycle. It's easiest to frame devaluation as grabbing demand from abroad, but it's really about increasing demand at home. Devaluation means printing money, and more money during a liquidity trap means more demand, period. It also allows more stimulus spending than a fixed-exchange rate system (like the gold standard) would. The next time you hear someone lamenting the "destructive devaluations that followed the Great Depression," remember to ask them -- what was so destructive about ending the most destructive depression in modern history?
The only thing we have to fear is fear of currency wars itself. Depressions are the only casualties in this kind of conflict.
* There were two exceptions. The gold standard did not constrain looser monetary policy in the U.S. and France in the early years of the depression, as both had more than enough gold to back more credit growth, but chose instead to sterilize their gold inflows out of fear of nonexistent inflation in the face of actual deflation. This stockpiling drained everybody else of gold, and consequently made staying on the gold standard impossible. Even the U.S. and France had to eventually abandon it to reverse years of deflation.
A Hillary Clinton presidential victory promises to usher in a new age of public misogyny.
Get ready for the era of The Bitch.
If Hillary Clinton wins the White House in November, it will be a historic moment, the smashing of the preeminent glass ceiling in American public life. A mere 240 years after this nation’s founding, a woman will occupy its top office. America’s daughters will at last have living, breathing, pantsuit-wearing proof that they too can grow up to be president.
A Clinton victory also promises to usher in four-to-eight years of the kind of down-and-dirty public misogyny you might expect from a stag party at Roger Ailes’s house.
You know it’s coming. As hyperpartisanship, grievance politics, and garden-variety rage shift from America’s first black commander-in-chief onto its first female one, so too will the focus of political bigotry. Some of it will be driven by genuine gender grievance or discomfort among some at being led by a woman. But in plenty of other cases, slamming Hillary as a bitch, a c**t (Thanks, Scott Baio!), or a menopausal nut-job (an enduringly popular theme on Twitter) will simply be an easy-peasy shortcut for dismissing her and delegitimizing her presidency.
The San Francisco quarterback has been attacked for refusing to stand for the Star Spangled Banner—and for daring to criticize the system in which he thrived.
It was in early childhood when W.E.B. Du Bois––scholar, activist, and black radical––first noticed The Veil that separated him from his white classmates in the mostly white town of Great Barrington, Massachusetts. He and his classmates were exchanging “visiting cards,” invitations to visit one another’s homes, when a white girl refused his.
“Then it dawned upon me with a certain suddenness that I was different from the others; or like, mayhap, in heart and life and longing, but shut out from their world by a vast veil. I had thereafter no desire to tear down that veil, to creep through; I held all beyond it in common contempt, and lived above it in a region of blue sky and great wandering shadows,” Du Bois wrote in his acclaimed essay collection, The Souls of Black Folk. “That sky was bluest when I could beat my mates at examination-time, or beat them at a foot-race, or even beat their stringy heads.”
The talk-radio host claims that he never took Donald Trump seriously on immigration. He neglected to tell his immigration-obsessed listeners.
For almost a decade, I’ve been angrily documenting the way that many right-wing talk-radio hosts betray the rank-and-file conservatives who trust them for information. My late grandmother was one of those people. She deserved better than she got. With huge platforms and massive audiences, successful hosts ought to take more care than the average person to be truthful and avoid misinforming listeners. Yet they are egregiously careless on some days and willfully misleading on others.
And that matters, as we’ll come to see.
Rush Limbaugh is easily the most consequential of these hosts. He has an audience of millions. And over the years, parts of the conservative movement that ought to know better, like the Claremont Institute, have treated him like an honorable conservative intellectual rather than an intellectually dishonest entertainer. The full cost of doing so became evident this year, when a faction of populists shaped by years of talk radio, Fox News, and Breitbart.com picked Donald Trump to lead the Republican Party, a choice that makes a Hillary Clinton victory likely and is a catastrophe for movement conservatism regardless of who wins.
Which is a different way of asking: Can a bot commit libel?
Facebook set a new land-speed record for situational irony this week, as it fired the people who kept up its “Trending Topics” feature and replaced them with an algorithm on Friday, only to find the algorithm promoting completely fake news on Sunday.
Rarely in recent tech history has a downsizing decision come back to bite the company so publicly and so quickly.
Practices meant to protect marginalized communities can also ostracize those who disagree with them.
Last week, the University of Chicago’s dean of students sent a welcome letter to freshmen decrying trigger warnings and safe spaces—ways for students to be warned about and opt out of exposure to potentially challenging material. While some supported the school’s actions, arguing that these practices threaten free speech and the purpose of higher education, the note also led to widespread outrage, and understandably so. Considered in isolation, trigger warnings may seem straightforwardly good. Basic human decency means professors like myself should be aware of students’ traumatic experiences, and give them a heads up about course content—photographs of dead bodies, extended accounts of abuse, disordered eating, self-harm—that might trigger an anxiety attack and foreclose intellectual engagement. Similarly, it may seem silly to object to the creation of safe spaces on campus, where members of marginalized groups can count on meeting supportive conversation partners who empathize with their life experiences, and where they feel free to be themselves without the threat of judgment or censure.
Like a little white Lazarus with red eyes, the paralyzed mouse was walking again.
A few days earlier, the mouse had been sprawled on an operating table while two Chinese graduate students peered through a microscope and operated on its spine. With a tiny pair of scissors, they removed the top half of a fingernail-thin vertebra, exposing a gleaming patch of spinal-cord tissue. It looked like a Rothko, a clean ivory rectangle bisected by a red line. Cautiously—the mouse occasionally twitched—they snipped the red line (an artery) and tied it off. Then one student reached for a $1,000 scalpel with a diamond blade so thin that it was transparent. With a quick slice of the spinal cord, the mouse’s back legs were rendered forever useless.
His latest ugly truth came during a Bloomberg TV interview last Friday, when he said George W. Bush deserves responsibility for the fact that “the World Trade Center came down during his time.” Politicians and journalists erupted in indignation. Jeb Bush called Trump’s comments “pathetic.” Ben Carson dubbed them “ridiculous.”
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
A new anatomical understanding of how movement controls the body’s stress response system
Elite tennis players have an uncanny ability to clear their heads after making errors. They constantly move on and start fresh for the next point. They can’t afford to dwell on mistakes.
Peter Strick is not a professional tennis player. He’s a distinguished professor and chair of the department of neurobiology at the University of Pittsburgh Brain Institute. He’s the sort of person to dwell on mistakes, however small.
“My kids would tell me, ‘Dad, you ought to take up Pilates. Do some yoga,’” he said. “But I’d say, as far as I’m concerned, there's no scientific evidence that this is going to help me.”
Still, the meticulous skeptic espoused more of a tennis approach to dealing with stressful situations: Just teach yourself to move on. Of course there is evidence that ties practicing yoga to good health, but not the sort that convinced Strick. Studies show correlations between the two, but he needed a physiological mechanism to explain the relationship. Vague conjecture that yoga “decreases stress” wasn’t sufficient. How? Simply by distracting the mind?
To become a citizen of the United States, naturalizing immigrants must take a test. Many native-born Americans would fail this test. Indeed, most of us have never really thought about what it means to be a citizen. One radical idea from the immigration debate is the repeal of birthright citizenship—guaranteed by the Fourteenth Amendment—to prevent so-called anchor babies. Odious and constitutionally dubious as this proposal may be, it does prompt a thought experiment: What if citizenship were not, in fact, guaranteed by birth? What if everyone had to earn it upon turning 18, and renew it every 10 years, by taking an exam? What might that exam look like?