Whether it's the 1930s or the 2010s, depressions are the only casualties in a currency war
I don't know how it compares to peeing in your bed, as one anonymous senior Fed official put it, but a currency war is one of the surest ways to end a global slump. Despite what you may have heard, it was a big part of what stopped the vicious circle of the Great Depression.
Currency wars are the best type of wars. Nobody dies, and everybody can recover, as long as everybody plays along. Here's how it works. One country devalues its currency -- in other words, prints money -- which, in a time of weak global demand, puts pressure on other countries to do the same, lest they lose out on trade. Then another country devalues, and so on, in a cascade of looser money. It's the invisible hand pushing for expansionary monetary policy when it's needed most.
But there are a few caveats. For one, a currency war only makes sense during a global depression when short-term interest rates are mostly stuck at zero. It's about boosting monetary stimulus when conventional methods are out of ammo. For another, devaluing forever (à la China) is not a sustainable growth strategy. It might make sense for developing nations to subsidize export industries early on, but, eventually, this will only cause imbalances to build up, while robbing the domestic population of purchasing power. And finally, there's a risk that a currency war could turn into a trade war. In other words, countries will respond to expansionary monetary policy not with expansionary monetary policy of their own, but with tariffs. Presumably that's what our silver-tongued senior Fed official was getting at with this head-scratcher of a quote:
Devaluing a currency is like peeing in bed. It feels good at first, but pretty soon it becomes a real mess.
This fear of a currency war begetting a trade war is certainly serious, but it's made to sound more serious thanks to some bad history. Here's the erroneous story you might have heard (especially now that Japan's talk of more aggressive easing has revived fears of a currency war):
After the Great Crash of 1929, countries abandoned the gold standard and devalued their currencies in a beggar-thy-neighbor battle to the bottom. This currency war turned into a trade war, with countries eventually resorting to tariffs and counter-tariffs as they tried to grab hold of an ever-shrinking pie of demand. The consequent collapse in world trade is what made the Great Depression so great, and set the stage for the trade war to turn into an actual war.
Scary stuff. But not quite true. The reality is the trade war started before the currency war, and the latter jump-started recovery wherever it was tried. The infamous Smoot-Hawley tariff in the U.S., the first salvo in the trade war to come, was actually passed in June 1930, more than a full year before any country devalued its currency. It wasn't until September 1931 that Britain abandoned the gold standard ... and that's when things get a bit complicated. It's hard to accuse Britain of "competitive" devaluation here, because it had no choice but devaluation; it had simply run out of gold. Nonetheless, other countries responded to Britain's increased competitiveness by increasing their trade barriers; in this case, the currency war, such as it was, did exacerbate the ongoing trade war, as Gavyn Davies of the Financial Times points out.
But then a funny thing happened. The punishment for Britain's economic weakness was a recovery. Ditching gold gave Britain (and everybody else who did so) the freedom to pursue more aggressive monetary and fiscal policies than the "rules of the game" of the gold standard had allowed.* As a chart from Barry Eichengreen's magisterial work on the depression, Golden Fetters (reproduced by Brad DeLong), shows, recovery followed devaluation everywhere. There was no reward for financial orthodoxy in the 1930s. The countries that stayed with the gold standard the longest, the so-called Gold Bloc of France, Belgium, and Poland, were the last to begin growing again. In other words, the currency war didn't deepen the depression; it ended it.
And that brings us to one last, stupid question. How did beggar-thy-neighbor policies kickstart growth even after world trade had already collapsed? In other words, how did stealing a trade advantage help so much when there wasn't much trade to steal? Well, it's not entirely, or even mostly, about stealing trade. Indeed, as Scott Sumner points out, the U.S. trade balance actually worsened in 1933 after FDR took us off gold, even as the economy quickly reversed its death-spiral and began a virtuous cycle. It's easiest to frame devaluation as grabbing demand from abroad, but it's really about increasing demand at home. Devaluation means printing money, and more money during a liquidity trap means more demand, period. It also allows more stimulus spending than a fixed-exchange rate system (like the gold standard) would. The next time you hear someone lamenting the "destructive devaluations that followed the Great Depression," remember to ask them -- what was so destructive about ending the most destructive depression in modern history?
The only thing we have to fear is fear of currency wars itself. Depressions are the only casualties in this kind of conflict.
* There were two exceptions. The gold standard did not constrain looser monetary policy in the U.S. and France in the early years of the depression, as both had more than enough gold to back more credit growth, but they chose instead to sterilize their gold inflows out of fear of nonexistent inflation in the face of actual deflation. This stockpiling drained gold from everybody else, making it impossible for them to stay on the gold standard. Even the U.S. and France eventually had to abandon it to reverse years of deflation.
For those who didn't go to prestigious schools, don't come from money, and aren't interested in sports and booze, it's nearly impossible to gain access to the best-paying jobs.
As income inequality in the U.S. strikes historic highs, many people are starting to feel that the American dream is either dead or out of reach. Only 64 percent of Americans still believe that it’s possible to go from rags to riches, and, in another poll, 63 percent said they did not believe their children would be better off than they were. These days, the idea that anyone who works hard can become wealthy is at best a tough sell.
As with the Nancy Drew series, almost all of the thrillers in the popular teenage franchise were produced by ghostwriters, thanks to a business model that proved to be prescient.
In the opening pages of a recent installment of the children’s book series The Hardy Boys, black smoke drifts through the ruined suburb of Bayport. The town's residents, dressed in tatters and smeared with ash, stumble past the local pharmacy and diner. Shards of glass litter the sidewalk. “Unreal,” says the mystery-solving teenager Joe Hardy—and he's right. Joe and his brother Frank are on a film set, and the people staggering through the scene are actors dressed as zombies. But as is always the case with Hardy Boys books, something still isn’t quite right: This time, malfunctioning sets nearly kill several actors, and the brothers find themselves in the middle of yet another mystery.
There are two types of people in the world: those with hundreds of unread messages, and those who can’t relax until their inboxes are cleared out.
For some, it’s a spider. For others, it’s an unexpected run-in with an ex. But for me, discomfort is a dot with a number in it: 1,328 unread-message notifications? I just can’t fathom how anyone lives like that.
How is it that some people remain calm as unread messages trickle into their inboxes and then roost there unattended, while others can’t sit still knowing that bolded-black emails and red-dotted Slack messages await them? I may operate toward the extreme end of compulsive notification-eliminators, but surveys suggest I’m not alone: One 2012 study found that 70 percent of work emails were attended to within six seconds of their arrival.
This has led me to a theory that there are two types of emailers in the world: those who can comfortably ignore unread notifications, and those who feel the need to take action immediately.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
The plight of non-tenured professors is widely known, but what about the impact they have on the students they’re hired to instruct?
Imagine meeting your English professor by the trunk of her car for office hours, where she doles out information like a taco vendor in a food truck. Or getting an e-mail error message when you write your former biology professor asking for a recommendation because she is no longer employed at the same college. Or attending an afternoon lecture in which your anthropology professor seems a little distracted because he doesn’t have enough money for bus fare. This is an increasingly widespread reality of college education.
Many students—and parents who foot the bills—may assume that all college professors are adequately compensated professionals with a distinct arrangement in which they have a job for life. In actuality, those are just tenured professors, who represent less than a quarter of all college faculty. Odds are that students will be taught by professors with less job security and lower pay than those tenured employees, an arrangement that, research shows, results in diminished services for students.
Soccer’s international governing body has long been suspected of mass corruption, but a 47-count U.S. indictment is one of the first real steps to accountability.
Imagine this: A shadowy multinational syndicate, sprawling across national borders but keeping its business quiet. Founded in the early 20th century, it has survived a tumultuous hundred years, gradually expanding its power. It cuts deals with national governments and corporations alike, and has a hand in a range of businesses. Some are legitimate; others are suspected of being little more than protection rackets or vehicles for kickbacks. Nepotism is rampant. Even though it’s been widely rumored to be a criminal enterprise for years, it has used its clout to cow the justice system into leaving it alone. It has branches spread across the globe, arranged in an elaborate hierarchical system. Its top official, reviled and feared, demands complete fealty and is sometimes referred to as the godfather.
In most states, where euthanasia is illegal, physicians can offer only hints and euphemisms for patients to interpret.
SAN FRANCISCO—Physician-assisted suicide is illegal in all but five states. But that doesn’t mean it doesn’t happen in the rest. Sick patients sometimes ask for help in hastening their deaths, and some doctors will hint, vaguely, how to do it.
This leads to bizarre, veiled conversations between medical professionals and overwhelmed families. Doctors and nurses want to help but also want to avoid prosecution, so they speak carefully, parsing their words. Family members, in the midst of one of the most confusing and emotional times of their lives, are left to interpret euphemisms.
That’s what still frustrates Hope Arnold. She says throughout the 10 months her husband J.D. Falk was being treated for stomach cancer in 2011, no one would talk straight with them.
People have probably heard the phrase “the shit hits the fan” in reference to something gone awry at work or in life. In either setting, when the shit does hit the fan, people tend to look to the most competent person in the room to take over.
And too bad for that person. A new paper by a team of researchers from Duke University, the University of Georgia, and the University of Colorado looks not only at how extremely competent people are treated by their co-workers and peers, but also at how those people feel when, at crucial moments, everyone turns to them. The researchers find that responsible employees are not terribly pleased about this dynamic either.
Kalaupapa, Hawaii, is a former leprosy colony that’s still home to several of the people who were exiled there through the 1960s. Once they all pass away, the federal government wants to open up the isolated peninsula to tourism. But at what cost?
Not so long ago, people in Hawaii who were diagnosed with leprosy were exiled to an isolated peninsula attached to one of the tiniest and least-populated islands. Details on the history of the colony for leprosy patients, known as Kalaupapa, are murky: Fewer than 1,000 of the tombstones that span the village’s various cemeteries are marked, many of them having succumbed to weather damage or invasive vegetation. A few have been nearly devoured by trees. But records suggest that at least 8,000 individuals were forcibly removed from their families and relocated to Kalaupapa over a century starting in the 1860s. Almost all of them were Native Hawaiian.
Sixteen of those patients, ages 73 to 92, are still alive. They include six who remain in Kalaupapa voluntarily as full-time residents, even though the quarantine was lifted in 1969—a decade after Hawaii became a state and more than two decades after drugs were developed to treat leprosy, today known as Hansen’s disease. The experience of being exiled was traumatic, as was the heartbreak of abandonment, for both the patients themselves and their family members. Kalaupapa is secluded from the rest of Molokai—an island with zero traffic lights that takes pride in its rural seclusion—by towering, treacherous sea cliffs, and accessing it to this day remains difficult. Tourists typically arrive via mule. So why didn’t every remaining patient embrace the new freedom? Why didn’t everyone reconnect with loved ones and revel in the conveniences of civilization? Many of Kalaupapa’s patients forged paradoxical bonds with their isolated world. Many couldn’t bear to leave it. It was “the counterintuitive twinning of loneliness and community,” wrote The New York Times in 2008. “All that dying and all of that living.”