Recoveries have been getting weaker and weaker because that's how the Fed wants them
It's time to talk about everybody's least favorite Davos buzzword -- New Normal.
With GDP unexpectedly contracting 0.1 percent in the fourth quarter of 2012 (though the private sector mostly kept up, despite the obstacles we've thrown in its way), it's enough to make you wonder if this time really is different. In other words, has the economy settled into a, well, new normal of slower growth?
If it has, it's not quite new, at least when it comes to recoveries. As you can see in this Minneapolis Fed chart of job gains following recessions, something changed after 1981. Recoveries went from being V-shaped affairs characterized by rapid bouncebacks in employment to U-shaped ones better described as nasty, brutish, and long.
(Note: I excluded the recovery from the 1980 recession, because the 1981 double-dip cut it short.)
The story of the jobless recovery is one of what the Fed isn't doing. As Paul Krugman points out, recessions have become post- (or perhaps pre-) modern. Through the 1980s, postwar recessions happened when the Fed decided to raise rates to head off inflation, and recoveries happened when the Fed decided things had calmed down enough to lower rates. But now recessions happen when bubbles burst -- with financial deregulation and the global savings glut making these a recurring feature of our economy -- and the Fed hasn't been able to cut interest rates enough to generate strong post-crash recoveries. Or maybe it hasn't wanted to.
Here's a stupid question. Why have interest rates and inflation mostly been falling for the past 30 years? In other words, if the Fed has been de facto, and later de jure, targeting inflation for most of this period (and it has), why has inflation been trending down (and it has)? As you can see in the chart below, core PCE inflation, which excludes food and energy costs, fell substantially from the Reagan recovery through the bursting of the tech bubble, and has more or less held steady since, though a bit more on the less side recently.
Say hello to "opportunistic disinflation." Okay, let's translate this from Fed-ese. Remember, the Fed is supposed to target 2 percent inflation, meaning it raises rates when prices rise by more than that much and lowers them once the economy's cooled off enough, but it wasn't always so. Back in the mid-1980s, inflation was hovering around 4 percent, a major achievement following the stagflation of the previous decade, but the Fed wanted it to go lower -- here's the crucial bit -- without taking the blame for it. The Volcker Fed had come in for quite a bit of abuse when it whipped inflation at the expense of the severe 1981-82 downturn, and the Fed seems to have learned it was better not to leave its fingerprints on the business cycle.
In other words: let recessions do the Fed's dirty work for it.
It's not hard for central bankers to get what they want without doing anything, as long as what they want is less inflation (and that's almost always what central bankers want). They just have to wait for a recession to come along ... and then keep waiting until inflation falls to where they want it. Then, once inflation has fallen enough for their taste, they cut rates (or buy bonds) to stabilize it at this new, lower level. But it's one thing to get inflation down to a lower level; it's another to keep it there. The Fed has to raise rates faster than it otherwise would during the subsequent recovery to keep inflation from going back to where it was before the recession. It's what the Fed calls "opportunistic disinflation," and looking at the falling inflation of the past few decades, it's hard to believe this wasn't the Fed's strategy. Not that we have to guess. Fed president Edward Boehne actually laid out this approach in 1989, and Fed governor Laurence Meyer endorsed the idea of "reducing inflation cycle-to-cycle" in a 1996 speech -- the same year the Wall Street Journal leaked an internal Fed memo outlining the policy.
In short: Recoveries have been jobless because that's how the Fed likes them.
But it gets worse. Pushing inflation progressively lower means recoveries get progressively weaker, since the Fed has to choke off inflation, and hence the recovery, at lower and lower levels. Now, to be fair, the Fed, and Ben Bernanke in particular, has awoken to the dangers of this approach. The danger, of course, is that the Fed gets in a situation where short-term rates are stuck at zero, but the economy stays stuck in a slump. Sound familiar? Bernanke realized this was a threat in 2002, when the economy was flirting with deflation despite 1.75 percent interest rates, and vowed not to let it happen here. (Remember, "disinflation" means falling inflation, and "deflation" means negative inflation.)
The Fed, of course, did let it happen here. But thanks to its bond-buying, and to wages that are sticky downwards, it didn't let prices actually start to fall, which would have made debt and borrowing more expensive at the worst possible moment. Bernanke got the Fed to accept that opportunistic disinflation had gone too far with QE1 and QE2, but it's not clear that he's gotten them to give up on the idea altogether. Core inflation has settled in below 2 percent, and the Fed's economic projections don't show it rising above that level anytime soon. That's pushed nominal GDP growth -- the growth of the total size of the economy -- down to 4 percent for each of the past three years, a low level the Fed is apparently comfortable with. Bernanke seems to be trying to shift the consensus towards undoing some of this disinflation -- unlike previous rounds of bond-buying, QE3 was aimed at lowering unemployment, not at stopping falling prices, and the Evans rule explicitly says the Fed will tolerate inflation up to 2.5 percent -- but there's been no shift in the data so far. The Fed needs to realize there is no try when it comes to reflation. It has to promise to do whatever it takes.
The new normal doesn't have to be new or normal if the Fed doesn't want it to be.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, about one in 9,395 people dies in a crash; that translates to roughly a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. Over a lifetime, about one in 120 Americans dies in a crash.
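As a rough back-of-the-envelope check of that compounding (assuming an average lifespan of about 79 years, a figure the report’s summary here doesn’t specify):

\[
1 - \left(1 - \frac{1}{9{,}395}\right)^{79} \;\approx\; 1 - e^{-79/9{,}395} \;\approx\; 0.0084 \;\approx\; \frac{1}{120},
\]

so a one-in-9,395 annual risk really does work out to roughly a one-in-120 lifetime risk.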
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it wasn’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
DATE: MAY 1, 1994
FROM: DR. HUNTER S. THOMPSON
SUBJECT: THE DEATH OF RICHARD NIXON: NOTES ON THE PASSING
OF AN AMERICAN MONSTER.... HE WAS A LIAR AND A QUITTER,
AND HE SHOULD HAVE BEEN BURIED AT SEA.... BUT HE WAS,
AFTER ALL, THE PRESIDENT.
"And he cried mightily with a strong voice, saying, Babylon the great is fallen, is fallen, and is become the habitation of devils, and the hold of every foul spirit and a cage of every unclean and hateful bird."
Richard Nixon is gone now, and I am poorer for it. He was the real thing -- a political monster straight out of Grendel and a very dangerous enemy. He could shake your hand and stab you in the back at the same time. He lied to his friends and betrayed the trust of his family. Not even Gerald Ford, the unhappy ex-president who pardoned Nixon and kept him out of prison, was immune to the evil fallout. Ford, who believes strongly in Heaven and Hell, has told more than one of his celebrity golf partners that "I know I will go to hell, because I pardoned Richard Nixon."
Garry Marshall's patronizing 'holiday anthology' film boasts a star-studded ensemble, but its characters seem barely human.
It’s hard to know where to begin with Mother’s Day, a misshapen Frankenstein of a movie that feels like it escaped the Hallmark headquarters halfway through its creation and rampaged into theaters, trying to teach audiences how to love. The third in Garry Marshall’s increasingly strange “holiday anthology” series, Mother’s Day isn’t the rom-com hodge-podge that Valentine’s Day was, or the bizarre morass of his follow-up New Year’s Eve. But it does inspire the kind of holy terror that you feel all the way down to your bones, or the revolted tingling that strikes you at a karaoke performance gone tragically wrong.
While it’s aiming for frothiness and fun, Mother’s Day is a patronizing and sickly sweet endeavor that widely misses the mark for its entire 118-minute running time (it feels much longer). The audience gets the sense that there are many Big Truths to be learned: that family harmony is important, that it’s good to accept different lifestyles without judgment, that loss is a natural part of the circle of life. But its overall construction—as a work of cinema—always feels a little off. One character gets a life lesson from a clown at a children’s party, and departs with a hearty “Thanks, clown!” Extras wander in the background and deliver halting bits of expositional dialogue like malfunctioning robots. Half of the lines seem to have been recorded in post-production and are practically shouted from off-screen to patch over a narrative that makes little sense. Mother’s Day is bad in the regular ways (e.g., the acting and writing), but also in that peculiar way where it feels as though the film’s creator has never met actual humans before.
There’s a common perception that women siphon off the wealth of their exes and go on to live in comfort. It’s wrong.
A 38-year-old woman living in Everett, Washington, recently told me that nine years ago, she had a well-paying job, immaculate credit, substantial savings, and a happy marriage. When her first daughter was born, she and her husband decided that she would quit her job in publishing to stay home with the baby. She loved being a mother and homemaker, and when another daughter came, she gave up the idea of going back to work.
Seven years later, her husband told her to leave their house, and filed for a divorce she couldn’t afford. “He said he was tired of my medical issues, and unwilling to work on things,” she said, citing her severe rheumatoid arthritis and OCD, both of which she manages with medication. “He kicked me out of my own house, with no job and no home, and then my only recourse was to lawyer up. I’m paying them on credit.” (Some of the men and women quoted in this article have been kept anonymous because they were discussing sensitive financial matters, some of them involving ongoing legal disputes.)
When schools ask applicants about their criminal histories, a veneer of campus safety may come at the expense of educational opportunity.
The long-running “Ban the Box” campaign is now gaining ground at colleges and universities. The movement aims to protect job, and now student, applicants from being asked about their criminal histories and was recently bolstered by President Obama, who is taking executive action to ban the practice at federal agencies. Campus officials say the background question helps them learn as much as possible about prospective students and allows them to take steps to keep everyone on campus safe. But opponents say the question—which requires prospective students to check a box if they have criminal histories—is an undue barrier that harms certain groups of students.
Some colleges routinely ask an optional criminal-background question; some schools are compelled to ask it by the state in which they’re located; and, whether intentional or not, more than 600 colleges and universities ask simply because they use Common App to streamline the admissions process. This year, 920,000 unique applicants used Common App to submit 4 million applications, or 4.4 applications per student, according to the organization. The criminal-background question that Common App asks is:
The justices signed off Thursday on a new procedural rule for warrants targeting computers.
The U.S. Supreme Court approved a new rule Thursday allowing federal judges to issue warrants that target computers outside their jurisdiction, setting the stage for a major expansion of surveillance and hacking powers by federal law-enforcement agencies.
Chief Justice John Roberts submitted the rule to Congress on behalf of the Court as part of the justices’ annual package of changes to the Federal Rules of Criminal Procedure. The rules form the basis of every federal prosecution in the United States.
Under Rule 41’s current incarnation, federal magistrate judges can typically only authorize searches and seizures within their own jurisdiction. Only in a handful of circumstances can judges approve a warrant that reaches beyond their territory—if, for example, they allow federal agents to use a tracking device that could move through multiple judicial districts.
In Trump’s aftermath, his enemies on the right will have to take stock and propose a meaningful alternative vision for the GOP’s future.
Donald Trump’s big victories in the Mid-Atlantic primaries don’t represent quite the end of the ballgame—but they come damn close.
And now Donald Trump’s many and fierce opponents in the Republican Party and the conservative movement face the hour of decision. Trump looks ever more certain to be the party nominee. Yet perhaps not since George McGovern in 1972 has a presumptive nominee so signally failed to carry the most committed members of his party with him.
So what happens now to those who regard themselves as party thought-leaders? Do they submit? Or do they continue to resist?
Resistance now means something more—and more dangerous—than tapping out #NeverTrump on Twitter. It means working to defeat Trump even knowing that the almost certain beneficiary will be Hillary Clinton.