Whether it's the 1930s or the 2010s, depressions are the only casualties in a currency war
I don't know how it compares to peeing in your bed, as one anonymous senior Fed official put it, but a currency war is one of the surest ways to end a global slump. Despite what you may have heard, it was a big part of what stopped the vicious circle of the Great Depression.
Currency wars are the best type of wars. Nobody dies, and everybody can recover, as long as everybody plays along. Here's how it works. One country devalues its currency -- in other words, prints money -- which, in a time of weak global demand, puts pressure on other countries to do the same, lest they lose out on trade. Then another country devalues, and so on, in a cascade of looser money. It's the invisible hand pushing for expansionary monetary policy when it's needed most.
But there are a few caveats. For one, a currency war only makes sense during a global depression, when short-term interest rates are mostly stuck at zero. It's about boosting monetary stimulus when conventional methods are out of ammo. For another, devaluing forever (à la China) is not a sustainable growth strategy. It might make sense for developing nations to subsidize export industries early on, but, eventually, this will only cause imbalances to build up, while robbing the domestic population of purchasing power. And finally, there's a risk that a currency war could turn into a trade war. In other words, countries will retaliate to expansionary monetary policy not with expansionary monetary policy of their own, but with tariffs. Presumably that's what our silver-tongued senior Fed official was getting at with this head-scratcher of a quote:
Devaluing a currency is like peeing in bed. It feels good at first, but pretty soon it becomes a real mess.
This fear of a currency war begetting a trade war is certainly serious, but it's made to sound more serious thanks to some bad history. Here's the erroneous story you might have heard (especially now that Japan's talk of more aggressive easing has revived fears of a currency war):
After the Great Crash of 1929, countries abandoned the gold standard and devalued their currencies in a beggar-thy-neighbor battle to the bottom. This currency war turned into a trade war, with countries eventually resorting to tariffs and counter-tariffs as they tried to grab hold of an ever-shrinking pie of demand. The consequent collapse in world trade is what made the Great Depression so great, and set the stage for the trade war to turn into an actual one.
Scary stuff. But not quite true. The reality is the trade war started before the currency war, and the latter jump-started recovery wherever it was tried. The infamous Smoot-Hawley tariff in the U.S., the first salvo in the trade war to come, was actually passed in June 1930, more than a full year before any country devalued its currency. It wasn't until September 1931 that Britain abandoned the gold standard ... and that's when things get a bit complicated. It's hard to accuse Britain of "competitive" devaluation here, because it had no choice but devaluation; it had simply run out of gold. Nonetheless, other countries responded to Britain's increased competitiveness by increasing their trade barriers; in this case, the currency war, such as it was, did exacerbate the ongoing trade war, as Gavyn Davies of the Financial Times points out.
But then a funny thing happened. The punishment for Britain's economic weakness was a recovery. Ditching gold gave Britain (and everybody else who did so) the freedom to pursue more aggressive monetary and fiscal policies than the "rules of the game" of the gold standard had allowed.* As you can see in the chart below (via Brad DeLong) from Barry Eichengreen's magisterial work on the depression, Golden Fetters, recovery followed devaluation everywhere. There was no reward for financial orthodoxy in the 1930s. The countries that stayed with the gold standard the longest, the so-called Gold Bloc of France, Belgium, and Poland, were the last to begin growing again. In other words, the currency war didn't deepen the depression; it ended it.
And that brings us to one last, stupid question. How did beggar-thy-neighbor policies kickstart growth even after world trade had already collapsed? In other words, how did stealing a trade advantage help so much when there wasn't much trade to steal? Well, it's not entirely, or even mostly, about stealing trade. Indeed, as Scott Sumner points out, the U.S. trade balance actually worsened in 1933 after FDR took us off gold, even as the economy quickly reversed its death-spiral and began a virtuous cycle. It's easiest to frame devaluation as grabbing demand from abroad, but it's really about increasing demand at home. Devaluation means printing money, and more money during a liquidity trap means more demand, period. It also allows more stimulus spending than a fixed-exchange rate system (like the gold standard) would. The next time you hear someone lamenting the "destructive devaluations that followed the Great Depression," remember to ask them -- what was so destructive about ending the most destructive depression in modern history?
The only thing we have to fear is fear of currency wars itself. Depressions are the only casualties in these kinds of conflicts.
* There were two exceptions. The gold standard did not constrain looser monetary policy in the U.S. and France in the early years of the depression; both had more than enough gold to back more credit growth, but chose instead to sterilize their gold inflows out of fear of nonexistent inflation in the face of actual deflation. This stockpiling drained everybody else of gold, making it impossible for other countries to stay on the gold standard. Even the U.S. and France eventually had to abandon it to reverse years of deflation.
Thicker ink, fewer smudges, and more strained hands: an Object Lesson
Recently, Bic launched a campaign to “save handwriting.” Named “Fight for Your Write,” it includes a pledge to “encourage the act of handwriting” in the pledge-taker’s home and community, and emphasizes putting more of the company’s ballpoints into classrooms.
As a teacher, I couldn’t help but wonder how anyone could think there’s a shortage. I find ballpoint pens all over the place: on classroom floors, behind desks. Dozens of castaways collect in cups on every teacher’s desk. They’re so ubiquitous that the word “ballpoint” is rarely used; they’re just “pens.” But despite its popularity, the ballpoint pen is relatively new in the history of handwriting, and its influence on popular handwriting is more complicated than the Bic campaign would imply.
Most campaign ads, like most billboards or commercials, are unimaginative and formulaic. Our candidate is great! Their candidate is terrible! Choose us!
With the huge majority of political ads, you would look back on them long after the campaign only for time-warp curio purposes—Look at the clothes they wore in the 80s! Look how corny “I like Ike!” was as a slogan! Look how young [Mitch McConnell / Bill Clinton / Al Gore] once was!—or to find archeological samples of the political mood of a given era.
The few national-campaign ads that are remembered earn their place either because they were so effective in shifting the tone of the campaign, as with George H. W. Bush’s race-baiting “Willie Horton” ad against Michael Dukakis in 1988; or because they so clearly presented the candidate in the desired light, as with Ronald Reagan’s famous “Morning in America” ad in 1984. Perhaps the most effective campaign advertisement ever, especially considering that it was aired only one time, was Lyndon Johnson’s devastating “Daisy Girl” ad, from his campaign against Barry Goldwater in 1964. The power of the Daisy Girl ad was of course its dramatizing the warning that Goldwater might recklessly bring on a nuclear war.
The Texas senator’s about-face risks undermining his political brand and alienating the supporters who hailed his defiant stand in Cleveland.
Ted Cruz set aside his many differences with Donald Trump on Friday to endorse for president a man whom he once called a “serial philanderer,” a “pathological liar,” “utterly amoral,” and a “sniveling coward”; who insulted his wife’s looks; who insinuated Cruz’s father was involved in the assassination of John F. Kennedy; who said he wouldn’t even accept his endorsement; and who for months mocked him mercilessly with a schoolyard taunt, “Lyin’ Ted.”
The Texas senator announced his support for the Republican nominee late Friday afternoon in a Facebook post, writing that the possibility of a Hillary Clinton presidency was “wholly unacceptable” and that he was keeping his year-old commitment to back the party’s choice. Cruz listed six policy-focused reasons why he was backing Trump, beginning with the importance of appointing conservatives to the Supreme Court and citing Trump’s recently expanded list of potential nominees. Other reasons included Obamacare—which Trump has vowed to repeal—immigration, national security, and Trump’s newfound support for Cruz’s push against an Obama administration move to relinquish U.S. oversight of an internet master directory of web addresses.
Who will win the debates? Trump’s approach was an important part of his strength in the primaries. But will it work when he faces Clinton onstage?
The most famous story about modern presidential campaigning now has a quaint old-world tone. It’s about the showdown between Richard Nixon and John F. Kennedy in the first debate of their 1960 campaign, which was also the very first nationally televised general-election debate in the United States.
The story is that Kennedy looked great, which is true, and Nixon looked terrible, which is also true—and that this visual difference had an unexpected electoral effect. As Theodore H. White described it in his hugely influential book The Making of the President 1960, which has set the model for campaign coverage ever since, “sample surveys” after the debate found that people who had only heard Kennedy and Nixon talking, over the radio, thought that the debate had been a tie. But those who saw the two men on television were much more likely to think that Kennedy—handsome, tanned, non-sweaty, poised—had won.
How Washington men working in national security dress—for better or for worse
In 2017, shortly after the next president is inaugurated, thousands of newly appointed federal officials will struggle with the same existential question: What do I wear to my first day of work? I understand their anxiety, having agonized over my own wardrobe during eight years of federal service and pondered the fashion choices of my male colleagues during the interminable meetings that are the hallmark of government work. It’s hard to point to a solid “real world” professional competency that I learned during those years of meetings and memo writing, but one skill I developed is an uncanny ability to tell you where any man in the national security community works based on his apparel. But first, to understand the fashion choices these professionals make, you must understand the culture—and keep in mind that not every employee falls into these stereotyped camps. (I’m also leaving a thorough assessment of female fashion to other, more qualified writers.)
In Greenwich, Darien, and New Canaan, Connecticut, bankers are earning astonishing amounts. Does that have anything to do with the poverty in Bridgeport, just a few exits away?
BRIDGEPORT, Conn.—Few places in the country illustrate the divide between the haves and the have-nots more than the county of Fairfield, Connecticut. Drive around the city of Bridgeport and, amid the tracts of middle-class homes, you’ll see burned-out houses, empty factories, and abandoned buildings lining the main street. Nearby, in the wealthier part of the county, there are towns of mansions with leafy grounds, swimming pools, and big iron gates.
Bridgeport, an old manufacturing town all but abandoned by industry, and Greenwich, a headquarters to hedge funds and billionaires, may be in the same county, and a few exits apart from each other on I-95, but their residents live in different worlds. The average income of the top 1 percent of people in the Bridgeport-Stamford-Norwalk metropolitan area, which consists of all of Fairfield County plus a few towns in neighboring New Haven County, is $6 million—73 times the average of the bottom 99 percent—according to a report released by the Economic Policy Institute (EPI) in June. This makes the area one of the most unequal in the country; nationally, the top 1 percent makes 25 times more than the average of the bottom 99 percent.
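A quick back-of-the-envelope calculation shows what those EPI ratios imply for the bottom 99 percent. This is a sketch using only the figures cited above (the variable names are mine, not EPI's):

```python
# Figures cited from the EPI report on the Bridgeport-Stamford-Norwalk metro area
top_1_avg = 6_000_000   # average income of the top 1 percent, in dollars
ratio_local = 73        # local top-1% average is 73x the bottom-99% average
ratio_national = 25     # national ratio, for comparison

# Implied average income of the bottom 99 percent in the metro area
bottom_99_avg = top_1_avg / ratio_local
print(round(bottom_99_avg))          # roughly $82,192

# How much wider the local gap is than the national one
print(round(ratio_local / ratio_national, 1))  # roughly 2.9x
```

In other words, the bottom 99 percent in the metro area averages a comfortable-sounding income on paper, yet the gap to the top 1 percent is nearly three times wider than it is nationally.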
Even in big cities like Tokyo, small children take the subway and run errands by themselves. The reason has a lot to do with group dynamics.
It’s a common sight on Japanese mass transit: Children troop through train cars, singly or in small groups, looking for seats.
They wear knee socks, polished patent-leather shoes, and plaid jumpers, with wide-brimmed hats fastened under the chin and train passes pinned to their backpacks. The kids are as young as 6 or 7, on their way to and from school, and there is nary a guardian in sight.
A popular television show called Hajimete no Otsukai, or My First Errand, features children as young as two or three being sent out to do a task for their family. As they tentatively make their way to the greengrocer or bakery, their progress is secretly filmed by a camera crew. The show has been running for more than 25 years.
Early photographs of the architecture and culture of Peking in the 1870s
In May of 1870, Thomas Child was hired by the Imperial Maritime Customs Service to be a gas engineer in Peking (Beijing). The 29-year-old Englishman left behind his wife and three children to become one of roughly 100 foreigners living in the late Qing dynasty's capital, taking his camera along with him. Over the course of the next 20 years, he took some 200 photographs, producing the earliest comprehensive catalog of the customs, architecture, and people of China's last dynasty. On Thursday, an exhibition of his images will open at the Sidney Mishkin Gallery in New York, curated by Stacey Lambrow. In addition, descendants of the subjects of one of his most famous images, Bride and Bridegroom (1870s), will be in attendance.
Taran Killam, Jay Pharoah, and Jon Rudnitsky have been fired from the venerable sketch show, which has failed to find its feet with a new cast.
Saturday Night Live has had roughly the same cast for the last three seasons; in the typical arc of the long-running series, this is when the show would be hitting its stride. Over its 41 seasons on the air, SNL has had peaks and valleys, but each new cast usually comes into its own after a couple of years of building experience. This time around, that hasn’t happened. Perhaps as a result, the show announced Tuesday that it wasn’t picking up the contracts of three cast members, a shakeup that suggests things aren’t quite right at 30 Rockefeller Center.
Taran Killam, Jay Pharoah, and Jon Rudnitsky certainly aren’t the first cast members to be summarily let go by SNL’s honcho, Lorne Michaels. For most of its tenure, Saturday Night Live was known for the ruthlessness of its reorganizations. But it has also enjoyed a near-unprecedented run of good fortune over the last decade: The transition from the late-’90s cast led by Will Ferrell, Molly Shannon, and Jimmy Fallon to the mid-aughts ensemble fronted by Kristen Wiig, Andy Samberg, and Bill Hader was fairly seamless. The show’s reputation for its hostile cliques changed considerably during this time, but, as the current firings suggest, the lightened mood only lasts as long as the sketches are funny. If history is any indicator, there’s a chance that SNL’s shakeup may not have been brutal enough.
Police in Charlotte, North Carolina, released body-cam and dashboard footage of the 43-year-old black man’s final moments.
Keith Scott had his hands at his side when a Charlotte, North Carolina, police officer fatally shot him four times, according to footage from a police dashboard camera.
After days of protests in downtown Charlotte over the killing, the Charlotte-Mecklenburg Police Department on Saturday released clips of body-cam and dashboard footage taken during Tuesday’s shooting of Scott.
The two clips offer an incomplete glimpse into the encounter. Footage from the body-cam of one of the officers runs a minute long. Scott himself is shown for only a fraction of a second in it. During the shooting itself, the lens is obscured by the officer’s neck. The audio is also missing from the first 25 seconds, including when the gunshots are fired.