Recoveries have been getting weaker and weaker because that's how the Fed wants them
It's time to talk about everybody's least favorite Davos buzzword -- New Normal.
With GDP unexpectedly contracting 0.1 percent in the fourth quarter of 2012 (though the private sector mostly kept up, despite the obstacles we've thrown in its way), it's enough to make you wonder if this time really is different. In other words, has the economy settled into a, well, new normal of slower growth?
If it has, it's not quite new, at least when it comes to recoveries. As you can see in this Minneapolis Fed chart of job gains following recessions, something changed after 1981. Recoveries went from being V-shaped affairs characterized by rapid bouncebacks in employment to U-shaped ones better described as nasty, brutish, and long.
(Note: I excluded the recovery from the 1980 recession, because the double-dip in 1981 cut it short).
The story of the jobless recovery is a story of what the Fed isn't doing. As Paul Krugman points out, recessions have become post- (or perhaps pre-) modern. Through the 1980s, postwar recessions happened when the Fed decided to raise rates to head off inflation, and recoveries happened when the Fed decided things had calmed down enough to lower rates. But now recessions happen when bubbles burst -- with financial deregulation and the global savings glut making bubbles a recurring feature of our economy -- and the Fed hasn't been able to cut interest rates enough to generate strong post-crash recoveries. Or maybe it hasn't wanted to.
Here's a stupid question. Why have interest rates and inflation mostly been falling for the past 30 years? In other words, if the Fed has been de facto, and later de jure, targeting inflation for most of this period (and it has), why has inflation been on a downtrend (and it has)? As you can see in the chart below, core PCE inflation, which excludes food and energy costs, fell substantially from the Reagan recovery through the bursting of the tech bubble, and has more or less held steady since, though a bit more on the less side recently.
Say hello to "opportunistic disinflation." Okay, let's translate this from Fed-ese. Remember, the Fed is supposed to target 2 percent inflation, meaning it raises rates when prices rise by more than that much and lowers them once the economy's cooled off enough, but it wasn't always so. Back in the mid-1980s, inflation was hovering around 4 percent, a major achievement following the stagflation of the previous decade, but the Fed wanted it to go lower -- here's the crucial bit -- without taking the blame for it. The Volcker Fed had come in for quite a bit of abuse when it whipped inflation at the expense of the severe 1981-82 downturn, and the Fed seems to have learned it was better not to leave its fingerprints on the business cycle.
In other words, let recessions do their dirty work for them.
It's not hard for central bankers to get what they want without doing anything, as long as what they want is less inflation (and that's almost always what central bankers want). They just have to wait for a recession to come along ... and then keep waiting until inflation falls to where they want it. Then, once inflation has fallen enough for their taste, they cut rates (or buy bonds) to stabilize it at this new, lower level. But it's one thing to stabilize inflation at a lower level; it's another to keep it there. The Fed has to raise rates faster than it otherwise would during the subsequent recovery to keep inflation from going back to where it was before the recession. It's what the Fed calls "opportunistic disinflation," and looking at the past few decades of falling inflation, it's hard to believe this wasn't their strategy. Not that we have to guess. Fed president Edward Boehne actually laid out this approach in 1989, and Fed governor Laurence Meyer endorsed the idea of "reducing inflation cycle-to-cycle" in a 1996 speech -- the same year the Wall Street Journal leaked an internal Fed memo outlining the policy.
In short: Recoveries have been jobless, because that's how the Fed likes them.
But it gets worse. Pushing inflation progressively lower means recoveries get progressively weaker, since the Fed has to choke off inflation, and hence the recovery, at lower and lower levels. Now, to be fair, the Fed, and Ben Bernanke in particular, have awoken to the dangers of this approach. The danger, of course, is that the Fed gets into a situation where short-term rates are stuck at zero, but the economy stays stuck in a slump. Sound familiar? Bernanke realized this was a threat in 2002, when the economy was flirting with deflation despite 1.34 percent interest rates, and vowed not to let it happen here. (Remember, "disinflation" means falling inflation, and "deflation" means negative inflation).
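That parenthetical distinction is easy to see in numbers. Here is a toy sketch, using made-up price-index figures (not actual data), of how year-over-year inflation behaves in each case:

```python
def inflation_rates(price_index):
    """Year-over-year percent change in a price index, one value per year."""
    return [round(100 * (b - a) / a, 1)
            for a, b in zip(price_index, price_index[1:])]

# Disinflation: prices still rise each year, but more slowly -- inflation
# falls (here from 4.0 to 2.9 to 1.9 percent) while the price level climbs.
disinflation = inflation_rates([100.0, 104.0, 107.0, 109.0])  # [4.0, 2.9, 1.9]

# Deflation: the price level itself falls, so inflation turns negative.
deflation = inflation_rates([100.0, 101.0, 100.0, 99.0])      # [1.0, -1.0, -1.0]
```

Opportunistic disinflation, in these terms, is letting each recession knock the first series down a notch and then stabilizing there; the Fed's 2002 worry was sliding all the way into the second series.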
The Fed, of course, did let it happen here. But, thanks to its bond-buying and to wages that are sticky downwards, it didn't let prices actually start to fall, which would have made debts more burdensome at the worst possible moment. Bernanke got the Fed to accept that opportunistic disinflation had gone too far with QE1 and QE2, but it's not clear that he's gotten them to give up on the idea altogether. Core inflation has settled in below 2 percent, and the Fed's economic projections don't show it rising above that level anytime soon. That's pushed nominal GDP growth -- the growth of the total size of the economy -- down to 4 percent for each of the past three years, a low level the Fed is apparently comfortable with. Bernanke seems to be trying to shift the consensus towards undoing some of this disinflation -- unlike previous rounds of bond-buying, QE3 was aimed at lowering unemployment, not at stopping falling prices, while the Evans rule explicitly says the Fed will tolerate inflation up to 2.5 percent -- but there's been no shift in the data so far. The Fed needs to realize there is no try when it comes to reflation. It has to promise to do whatever it takes.
The new normal doesn't have to be new or normal if the Fed doesn't want it to be.
Even when a dentist kills an adored lion, and everyone is furious, there’s loftier righteousness to be had.
Now is the point in the story of Cecil the lion—amid non-stop news coverage and passionate social-media advocacy—when people get tired of hearing about Cecil the lion. Even if they hesitate to say it.
But Cecil fatigue is only going to get worse. On Friday morning, Zimbabwe’s environment minister, Oppah Muchinguri, called for the extradition of the man who killed him, the Minnesota dentist Walter Palmer. Muchinguri would like Palmer to be “held accountable for his illegal action”—paying a reported $50,000 to kill Cecil with an arrow after luring him away from protected land. And she’s far from alone in demanding accountability. This week, the Internet has served as a bastion of judgment and vigilante justice—just like usual, except that this was a perfect storm directed at a single person. It might be called an outrage singularity.
Writing used to be a solitary profession. How did it become so interminably social?
Whether we’re behind the podium or awaiting our turn, numbing our bottoms on the chill of metal foldout chairs or trying to work some life into our terror-stricken tongues, we introverts feel the pain of the public performance. This is because there are requirements to being a writer. Other than being a writer, I mean. Firstly, there’s the need to become part of the writing “community,” which compels every writer who craves self-respect and success to attend community events, help to organize them, buzz over them, and—despite blitzed nerves and staggering bowels—present and perform at them. We get through it. We bully ourselves into it. We dose ourselves with beta blockers. We drink. We become our own worst enemies for a night of validation and participation.
Forget credit hours—in a quest to cut costs, universities are simply asking students to prove their mastery of a subject.
MANCHESTER, Mich.—Had Daniella Kippnick followed in the footsteps of the hundreds of millions of students who have earned university degrees in the past millennium, she might be slumping in a lecture hall somewhere while a professor droned. But Kippnick has no course lectures. She has no courses to attend at all. No classroom, no college quad, no grades. Her university has no deadlines or tenure-track professors.
Instead, Kippnick makes her way through different subject matters on the way to a bachelor’s in accounting. When she feels she’s mastered a certain subject, she takes a test at home, where a proctor watches her from afar by monitoring her computer and watching her over a video feed. If she proves she’s competent—by getting the equivalent of a B—she passes and moves on to the next subject.
Most of the big names in futurism are men. What does that mean for the direction we’re all headed?
In the future, everyone’s going to have a robot assistant. That’s the story, at least. And as part of that long-running narrative, Facebook just launched its virtual assistant. It’s called Moneypenny, after the secretary from the James Bond films. Which means the symbol of our march forward, once again, ends up being a nod back. In this case, Moneypenny is a throwback to an age when Bond’s womanizing was a symbol of manliness and many women were, no matter what they wanted to be doing, secretaries.
Why can’t people imagine a future without falling into the sexist past? Why does the road ahead keep leading us back to a place that looks like the Tomorrowland of the 1950s? Well, when it comes to Moneypenny, here’s a relevant datapoint: More than two thirds of Facebook employees are men. That’s a ratio reflected among another key group: futurists.
During the multi-country press tour for Mission Impossible: Rogue Nation, not even Jon Stewart has dared ask Tom Cruise about Scientology.
During the media blitz for Mission Impossible: Rogue Nation over the past two weeks, Tom Cruise has seemingly been everywhere. In London, he participated in a live interview at the British Film Institute with the presenter Alex Zane, the movie’s director, Christopher McQuarrie, and a handful of his fellow cast members. In New York, he faced off with Jimmy Fallon in a lip-sync battle on The Tonight Show and attended the Monday night premiere in Times Square. And, on Tuesday afternoon, the actor recorded an appearance on The Daily Show With Jon Stewart, where he discussed his exercise regimen, the importance of a healthy diet, and how he still has all his own hair at 53.
Stewart, who during his career has won two Peabody Awards for public service and the Orwell Award for “distinguished contribution to honesty and clarity in public language,” represented the most challenging interviewer Cruise has faced on the tour, during a challenging year for the actor. In April, HBO broadcast Alex Gibney’s documentary Going Clear, a film based on the book of the same title by Lawrence Wright exploring the Church of Scientology, of which Cruise is a high-profile member. The movie alleges, among other things, that the actor personally profited from slave labor (church members who were paid 40 cents an hour to outfit the star’s airplane hangar and motorcycle), and that his former girlfriend, the actress Nazanin Boniadi, was punished by the Church by being forced to do menial work after telling a friend about her relationship troubles with Cruise. For Cruise “not to address the allegations of abuse,” Gibney said in January, “seems to me palpably irresponsible.” But in The Daily Show interview, as with all of Cruise’s other appearances, Scientology wasn’t mentioned.
The Wall Street Journal’s eyebrow-raising story of how the presidential candidate and her husband accepted cash from UBS without any regard for the appearance of impropriety that it created.
The Swiss bank UBS is one of the biggest, most powerful financial institutions in the world. As secretary of state, Hillary Clinton intervened to help it out with the IRS. And after that, the Swiss bank paid Bill Clinton $1.5 million for speaking gigs. The Wall Street Journal reported all that and more Thursday in an article that highlights huge conflicts of interest that the Clintons have created in the recent past.
The piece begins by detailing how Clinton helped the global bank.
“A few weeks after Hillary Clinton was sworn in as secretary of state in early 2009, she was summoned to Geneva by her Swiss counterpart to discuss an urgent matter. The Internal Revenue Service was suing UBS AG to get the identities of Americans with secret accounts,” the newspaper reports. “If the case proceeded, Switzerland’s largest bank would face an impossible choice: Violate Swiss secrecy laws by handing over the names, or refuse and face criminal charges in U.S. federal court. Within months, Mrs. Clinton announced a tentative legal settlement—an unusual intervention by the top U.S. diplomat. UBS ultimately turned over information on 4,450 accounts, a fraction of the 52,000 sought by the IRS.”
Some say the so-called sharing economy has gotten away from its central premise—sharing.
This past March, in an up-and-coming neighborhood of Portland, Maine, a group of residents rented a warehouse and opened a tool-lending library. The idea was to give locals access to everyday but expensive garage, kitchen, and landscaping tools—such as chainsaws, lawnmowers, wheelbarrows, a giant cider press, and soap molds—to save unnecessary expense as well as clutter in closets and tool sheds.
The residents had been inspired by similar tool-lending libraries across the country—in Columbus, Ohio; in Seattle, Washington; in Portland, Oregon. The ethos made sense to the Mainers. “We all have day jobs working to make a more sustainable world,” says Hazel Onsrud, one of the Maine Tool Library’s founders, who works in renewable energy. “I do not want to buy all of that stuff.”
The Vermont senator’s revolutionary zeal has met its moment.
There’s no way this man could be president, right? Just look at him: rumpled and scowling, bald pate topped by an entropic nimbus of white hair. Just listen to him: ranting, in his gravelly Brooklyn accent, about socialism. Socialism!
And yet here we are: In the biggest surprise of the race for the Democratic presidential nomination, this thoroughly implausible man, Bernie Sanders, is a sensation.
He is drawing enormous crowds—11,000 in Phoenix, 8,000 in Dallas, 2,500 in Council Bluffs, Iowa—the largest turnout of any candidate from any party in the first-to-vote primary state. He has raised $15 million in mostly small donations, to Hillary Clinton’s $45 million—and unlike her, he did it without holding a single fundraiser. Shocking the political establishment, it is Sanders—not Martin O’Malley, the fresh-faced former two-term governor of Maryland; not Joe Biden, the sitting vice president—to whom discontented Democratic voters looking for an alternative to Clinton have turned.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
An attack on an American-funded military group epitomizes the Obama Administration’s logistical and strategic failures in the war-torn country.
Last week, the U.S. finally received some good news in Syria: After months of prevarication, Turkey announced that the American military could launch airstrikes against Islamic State positions in Syria from its base in Incirlik. The development signaled that Turkey, a regional power, had at last agreed to join the fight against ISIS.
The announcement provided a dose of optimism in a conflict that has, in the last four years, killed over 200,000 people and displaced millions more. Days later, however, the positive momentum screeched to a halt. Earlier this week, fighters from the al-Nusra Front, an Islamist group aligned with al-Qaeda, reportedly captured the commander of Division 30, a Syrian militia that receives U.S. funding and logistical support, in the countryside north of Aleppo. On Friday, the offensive escalated: Al-Nusra fighters attacked Division 30 headquarters, killing five and capturing others. According to Agence France-Presse, the purpose of the attack was to obtain sophisticated weapons provided by the Americans.