Recoveries have been getting weaker and weaker because that's how the Fed wants them
It's time to talk about everybody's least favorite Davos buzzword -- New Normal.
With GDP unexpectedly contracting 0.1 percent in the fourth quarter of 2012 (though the private sector mostly kept up, despite the obstacles we've thrown in its way), it's enough to make you wonder if this time really is different. In other words, has the economy settled into a, well, new normal of slower growth?
If it has, it's not quite new, at least when it comes to recoveries. As you can see in this Minneapolis Fed chart of job gains following recessions, something changed after 1981. Recoveries went from being V-shaped affairs characterized by rapid bouncebacks in employment to U-shaped ones better described as nasty, brutish, and long.
(Note: I excluded the recovery from the 1980 recession, because the double-dip in 1981 cut it short.)
The story of the jobless recovery is one of what the Fed isn't doing. As Paul Krugman points out, recessions have become post-(or perhaps pre-) modern. Through the 1980s, postwar recessions happened when the Fed decided to raise rates to head off inflation, and recoveries happened when the Fed decided things had calmed down enough to lower rates. But now recessions happen when bubbles burst, with financial deregulation and the global savings glut making these more of a recurring feature of our economy, and the Fed hasn't been able to cut interest rates enough to generate strong post-crash recoveries. Or maybe it hasn't wanted to.
Here's a stupid question: why have interest rates and inflation mostly been falling for the past 30 years? In other words, if the Fed has been de facto, and later de jure, targeting inflation for most of this period (and it has), why has inflation been on a downward trend (and it has)? As you can see in the chart below, core PCE inflation, which excludes food and energy costs, fell substantially from the Reagan recovery through the bursting of the tech bubble, and has more or less held steady since, though a bit more on the less side recently.
Say hello to "opportunistic disinflation." Okay, let's translate this from Fed-ese. Remember, the Fed is supposed to target 2 percent inflation, meaning it raises rates when prices rise by more than that much and lowers them once the economy's cooled off enough, but it wasn't always so. Back in the mid-1980s, inflation was hovering around 4 percent, a major achievement following the stagflation of the previous decade, but the Fed wanted it to go lower -- here's the crucial bit -- without taking the blame for it. The Volcker Fed had come in for quite a bit of abuse when it whipped inflation at the expense of the severe 1981-82 downturn, and the Fed seems to have learned it was better not to leave its fingerprints on the business cycle.
In other words: let recessions do their dirty work for them.
It's not hard for central bankers to get what they want without doing anything, as long as what they want is less inflation (and that's almost always what central bankers want). They just have to wait for a recession to come along ... and then keep waiting until inflation falls to where they want it. Then, once inflation has fallen enough for their taste, they cut rates (or buy bonds) to stabilize it at this new, lower level. But it's one thing to stabilize inflation at a lower level; it's another to keep it there. The Fed has to raise rates faster than it otherwise would during the subsequent recovery to keep inflation from going back to where it was before the recession. It's what the Fed calls "opportunistic disinflation," and looking at inflation's steady decline over the past few decades, it's hard to believe this wasn't their strategy. Not that we have to guess. Philadelphia Fed president Edward Boehne actually laid out this approach in 1989, and Fed governor Laurence Meyer endorsed the idea of "reducing inflation cycle-to-cycle" in a 1996 speech -- the same year the Wall Street Journal leaked an internal Fed memo outlining the policy.
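The ratchet described above is simple enough to sketch in a few lines. This is a toy model, not the Fed's actual reaction function, and every number in it is invented for illustration:

```python
# Toy model of "opportunistic disinflation": the Fed never tightens to
# force inflation down; it waits for recessions to do the work, then
# stabilizes inflation at each new, lower level.
# All numbers here are illustrative, not actual Fed targets.

inflation = 4.0          # roughly where inflation sat in the mid-1980s
stabilized_levels = []

for recession in range(3):
    inflation -= 1.0     # a recession pulls inflation down a notch
    # The Fed now cuts rates just enough to hold inflation here, and
    # raises them early in the recovery so it can't climb back up.
    stabilized_levels.append(inflation)

print(stabilized_levels)  # each cycle locks in a lower level: [3.0, 2.0, 1.0]
```

The point of the sketch is the one-way direction of the loop: inflation only ever moves down, because the Fed acts to stabilize, never to reflate.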
In short: Recoveries have been jobless, because that's how the Fed likes them.
But it gets worse. Pushing inflation progressively lower means recoveries get progressively weaker, since the Fed has to choke off inflation, and hence the recovery, at lower and lower levels. Now, to be fair, the Fed, and Ben Bernanke in particular, has awoken to the dangers of this approach. The danger, of course, is that the Fed gets into a situation where short-term rates are stuck at zero, but the economy stays stuck in a slump. Sound familiar? Bernanke recognized this threat in 2002, when the economy was flirting with deflation despite interest rates below 2 percent, and vowed not to let it happen here. (Remember, "disinflation" means falling inflation, and "deflation" means negative inflation.)
The Fed, of course, did let it happen here. But thanks to its bond-buying and to wages that are sticky downwards, it didn't let prices actually start to fall, which would have made debts harder to pay off at the worst possible moment. Bernanke got the Fed to accept that opportunistic disinflation had gone too far with QE1 and QE2, but it's not clear that he's gotten them to give up on the idea altogether. Core inflation has settled in below 2 percent, and the Fed's economic projections don't show it rising above that level anytime soon. That's pushed nominal GDP growth -- the growth of the total size of the economy -- down to 4 percent for each of the past three years, a low level the Fed is apparently comfortable with. Bernanke seems to be trying to shift the consensus towards undoing some of this disinflation -- unlike previous rounds of bond-buying, QE3 was aimed at lowering unemployment, not at stopping falling prices, and the Evans rule explicitly says the Fed will tolerate inflation up to 2.5 percent -- but there's been no shift in the data so far. The Fed needs to realize there is no try when it comes to reflation. It has to promise to do whatever it takes.
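The arithmetic behind that 4 percent figure is worth spelling out: nominal GDP growth is roughly real growth plus inflation, so capping inflation below 2 percent while nominal growth runs at 4 percent leaves little room for catch-up growth. A quick sketch, where the inflation number is my assumption, not a Fed figure:

```python
# Nominal GDP growth is approximately real growth plus inflation,
# so a low nominal-growth ceiling implies slow real growth.
ngdp_growth = 4.0   # percent, per the article's last three years
inflation = 1.7     # assumed core inflation, running below the 2 percent target

real_growth = ngdp_growth - inflation
print(real_growth)  # about 2.3 percent: too slow to close the output gap quickly
```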
The new normal doesn't have to be new or normal if the Fed doesn't want it to be.
Take a walk along West Florissant Avenue, in Ferguson, Missouri. Head south of the burned-out Quik Trip and the famous McDonald's, south of the intersection with Chambers, south almost to the city limit, to the corner of Ferguson Avenue and West Florissant. There, last August, Emerson Electric announced third-quarter sales of $6.3 billion. Just over half a mile to the northeast, four days later, Officer Darren Wilson killed Michael Brown. The 12 shots fired by Officer Wilson were probably audible in the company lunchroom.
Outwardly, at least, the City of Ferguson would appear to occupy an enviable position. It is home to a Fortune 500 firm. It has successfully revitalized a commercial corridor through its downtown. It hosts an office park filled with corporate tenants. Its coffers should be overflowing with tax dollars.
Freddie Gray's death on April 19 leaves many unanswered questions. But it is clear that when Gray was arrested in West Baltimore on the morning of April 12, he was struggling to walk. By the time he arrived at the police station a half hour later, he was unable to breathe or talk, suffering from wounds that would kill him.
Gray died Sunday from spinal injuries. Baltimore authorities say they're investigating how the 25-year-old was hurt—a somewhat perverse notion, given that it was while he was in police custody, and hidden from public view, that he apparently suffered injury. How it happened remains unknown. It's even difficult to understand why officers arrested Gray in the first place. But with protestors taking to the streets of Baltimore since Gray's death on Sunday, the incident falls into a line of highly publicized, fatal encounters between black men and the police. Meanwhile, on Tuesday, a reserve sheriff's deputy in Tulsa, Oklahoma, pleaded not guilty to a second-degree manslaughter charge in the death of a man he shot. The deputy says the shooting happened while he was trying to tase the man. Black men dying at the hands of the police is of course nothing new, but the nation is now paying attention and getting outraged.
After a five-month delay, Loretta Lynch made history last week. On Thursday, the Senate confirmed Lynch as the next U.S. attorney general, the first African American woman ever to hold this Cabinet position. Her long-stalled nomination sometimes seemed in doubt, held hostage to partisan jockeying between Democrats and Republicans. But one political bloc never gave up, relentlessly rallying its support behind Lynch: the black sorority.
During her initial hearing, the seats behind Lynch were filled with more than two dozen of her Delta Sigma Theta Sorority sisters arrayed in crimson-and-cream blazers and blouses, ensuring their visibility on the national stage. These Delta women—U.S. Representatives Marcia Fudge and Joyce Beatty among them—were there to lend moral support and show the committee that they meant business. The Deltas were not alone. The Lynch nomination also drew support from congressional representatives from other black sororities: Alpha Kappa Alpha members Terri Sewell and Sheila Jackson Lee took to the House floor to advocate for a vote while Sigma Gamma Rho members Corinne Brown and Robin Kelly and Zeta Phi Beta member Donna Edwards used social media and press conferences to campaign on Lynch’s behalf.
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
A lot of Internet ink has been spilled over how lazy and entitled Millennials are, but when it comes to paying for a college education, work ethic isn't the limiting factor. The economic cards are stacked such that today’s average college student, without support from financial aid and family resources, would need to complete 48 hours of minimum-wage work a week to pay for his courses—a feat that would require superhuman endurance, or maybe a time machine.
To take a close look at the tuition history of almost any institution of higher education in America is to confront an unfair reality: Each year’s crop of college seniors paid a little bit more than the class that graduated before. The tuition crunch never fails to provide new fodder for ongoing analysis of the myths and realities of The American Dream. Last week, a graduate student named Randy Olson listened to his grandfather extol the virtues of putting oneself through college without family support. But paying for college without family support is a totally different proposition these days, Olson thought. It may have been feasible 30 years ago, or even 15 years ago, but it's much harder now.
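To see where a figure like 48 hours a week comes from, divide annual tuition by the minimum wage times the weeks worked. The tuition and schedule inputs below are illustrative assumptions chosen to reproduce the article's figure, not Olson's actual data:

```python
# Hours of minimum-wage work needed per week to cover a year of tuition.
# Illustrative inputs only (chosen to match the article's 48-hour figure).
annual_tuition = 10440.0   # assumed yearly tuition and fees, in dollars
minimum_wage = 7.25        # federal minimum wage, in dollars per hour
weeks_worked = 30          # assumed weeks worked during the school year

hours_per_week = annual_tuition / (minimum_wage * weeks_worked)
print(round(hours_per_week))  # 48
```

Run the same formula with a 1980s-era tuition bill and the calculation flips: a summer job plus a modest part-time schedule covered it, which is why the grandfather's advice no longer transfers.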
Hours after a major earthquake wreaked havoc across his country, Nepali Information Minister Minendra Rijal appeared at a news conference on Saturday to announce that schools would be closed for the next five days. "We never imagined we'd face such devastation," he said.
But for geologists, Saturday's disaster—which has claimed over 2,400 lives—was sadly predictable.
"Physically and geologically what happened is exactly what we thought would happen," James Jackson, head of the earth-sciences department at the University of Cambridge, told the Associated Press.
Blessed with stunning natural scenery, Nepal is a popular tourist destination that attracts hundreds of thousands of travelers each year. But the source of the country's beauty is what makes it particularly vulnerable to earthquakes. Much of Nepal's population lives in a valley beneath the Himalayas, a mountain range formed by collisions between the Indian and Eurasian tectonic plates. These collisions—which occur when the Indian plate slides underneath its much larger neighbor—are what cause earthquakes. According to The Washington Post, a chunk of the earth measuring 75 by 37 miles shifted 10 feet in 30 seconds on Saturday, destroying much of what lay atop the surface.
I’m not a dog person. I prefer cats. Cats make you work to have a relationship with them, and I like that. But I have adopted several dogs, caving in to pressure from my kids. The first was Teddy, a rottweiler-chow mix whose bushy hair was cut into a lion mane. Kids loved him, and he grew on me, too. Teddy was probably ten years old when we adopted him. Five years later he had multiple organs failing and it was time to put him to sleep.
When I arrived at the vet, he said I could drop him off. I was aghast. No. I needed to stay with Teddy. As the vet prepped the syringe to put him to sleep, I started sobbing. The vet gave me a couple of minutes to collect myself and say goodbye. I held Teddy's paw until he died. Honestly, I didn't think I was that attached.
In her new book No One Understands You and What To Do About It, Heidi Grant Halvorson tells readers a story about her friend, Tim. When Tim started a new job as a manager, one of his top priorities was communicating to his team that he valued each member’s input. So at team meetings, as each member spoke up about whatever project they were working on, Tim made sure he put on his “active-listening face” to signal that he cared about what each person was saying.
But after meeting with him a few times, Tim’s team got a very different message from the one he intended to send. “After a few weeks of meetings,” Halvorson explains, “one team member finally summoned up the courage to ask him the question that had been on everyone’s mind.” That question was: “Tim, are you angry with us right now?” When Tim explained that he wasn’t at all angry—that he was just putting on his “active-listening face”—his colleague gently explained that his active-listening face looked a lot like his angry face.
Soon, thousands of police officers across the country will don body-worn cameras when they go out among the public. Those cameras will generate millions of hours of footage—intimate views of commuters receiving speeding tickets, teens getting arrested for marijuana possession, and assault victims at some of the worst moments of their lives.
As the Washington Post and the Associated Press have reported, lawmakers in at least 15 states have proposed exempting body-cam footage from local open records laws. But the flurry of lawmaking speaks to a larger crisis: Once those millions of hours of footage have been captured, no one is sure what to do with them.
I talked to several representatives from privacy, civil rights, and progressive advocacy groups working on body cameras. Even among these often allied groups, there’s little consensus about the kind of policies that should exist around releasing footage.
In Baltimore, where 25-year-old Freddie Gray died shortly after being taken into police custody, an investigation may uncover homicidal misconduct by law enforcement, as happened in the North Charleston, South Carolina, killing of Walter Scott. Or the facts may confound the darkest suspicions of protestors, as when the Department of Justice released its report on the killing of Michael Brown.
What's crucial to understand, as Baltimore residents take to the streets in long-simmering frustration, is that their general grievances are valid regardless of how this case plays out. For as in Ferguson, where residents suffered through years of misconduct so egregious that most Americans could scarcely conceive of what was going on, the people of Baltimore are policed by an entity that perpetrates stunning abuses. The difference is that this time we needn't wait for a DOJ report to tell us so. Harrowing evidence has been presented. Yet America hasn't looked.