Recoveries have been getting weaker and weaker because that's how the Fed wants them
It's time to talk about everybody's least favorite Davos buzzword -- New Normal.
With GDP unexpectedly contracting 0.1 percent in the fourth quarter of 2012 (though the private sector mostly kept up, despite the obstacles we've thrown in its way), it's enough to make you wonder if this time really is different. In other words, has the economy settled into a, well, new normal of slower growth?
If it has, it's not quite new, at least when it comes to recoveries. As you can see in this Minneapolis Fed chart of job gains following recessions, something changed after 1981. Recoveries went from being V-shaped affairs characterized by rapid bouncebacks in employment to U-shaped ones better described as nasty, brutish, and long.
(Note: I excluded the recovery from the 1980 recession, because the double-dip recession that began in 1981 cut it short.)
The story of the jobless recovery is a story of what the Fed isn't doing. As Paul Krugman points out, recessions have become post- (or perhaps pre-) modern. Through the 1980s, postwar recessions happened when the Fed decided to raise rates to head off inflation, and recoveries happened when the Fed decided things had calmed down enough to lower them. But now recessions happen when bubbles burst -- financial deregulation and the global savings glut have made such busts a recurring feature of our economy -- and the Fed hasn't been able to cut interest rates enough to generate strong post-crash recoveries. Or maybe it hasn't wanted to.
Here's a stupid question. Why have interest rates and inflation mostly been falling for the past 30 years? In other words, if the Fed has been de facto, and later de jure, targeting inflation for most of this period (and it has), why has inflation been on a downward trend (and it has)? As you can see in the chart below, core PCE inflation, which excludes food and energy costs, fell substantially from the Reagan recovery through the bursting of the tech bubble, and has more or less held steady since, though a bit more on the less side recently.
Say hello to "opportunistic disinflation." Okay, let's translate this from Fed-ese. Remember, the Fed is supposed to target 2 percent inflation, meaning it raises rates when inflation runs above that mark and lowers them once the economy has cooled off enough -- but it wasn't always so. Back in the mid-1980s, inflation was hovering around 4 percent, a major achievement following the stagflation of the previous decade, but the Fed wanted it to go lower -- here's the crucial bit -- without taking the blame for it. The Volcker Fed had come in for quite a bit of abuse when it whipped inflation at the cost of the severe 1981-82 downturn, and the Fed seems to have learned it was better not to leave its fingerprints on the business cycle.
In other words: let recessions do their dirty work for them.
It's not hard for central bankers to get what they want without doing anything, as long as what they want is less inflation (and that's almost always what central bankers want). They just have to wait for a recession to come along ... and then keep waiting until inflation falls to where they want it. Then, once inflation has fallen enough for their taste, they cut rates (or buy bonds) to stabilize it at this new, lower level. But it's one thing to stabilize inflation at a lower level; it's another to keep it there. The Fed has to raise rates faster than it otherwise would during the subsequent recovery to keep inflation from going back to where it was before the recession. It's what the Fed calls "opportunistic disinflation," and looking at inflation's steady decline over the past few decades, it's hard to believe this wasn't the strategy. Not that we have to guess. Philadelphia Fed president Edward Boehne actually laid out this approach in 1989, and Fed governor Laurence Meyer endorsed the idea of "reducing inflation cycle-to-cycle" in a 1996 speech -- the same year the Wall Street Journal leaked an internal Fed memo outlining the policy.
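If it helps to see the ratchet in motion, here's a minimal sketch in Python -- a toy model with made-up numbers of my own, not anything drawn from the Fed's actual reaction function -- in which each recession knocks inflation down a notch and the Fed simply stabilizes at whatever lower level results:

```python
# A toy model of "opportunistic disinflation" as a one-way ratchet.
# Assumptions are mine: each recession pulls inflation down by a fixed
# amount, and the Fed stabilizes at the new, lower level rather than
# easing enough to push inflation back to its pre-recession level.

def opportunistic_disinflation(start_inflation, recessions, drop_per_recession=1.0):
    """Return the inflation level the Fed locks in after each recession."""
    path = [start_inflation]
    for _ in range(recessions):
        # The Fed waits out the recession while inflation falls...
        new_level = max(path[-1] - drop_per_recession, 0.0)
        # ...then cuts rates just enough to hold inflation at the lower level.
        path.append(new_level)
    return path

print(opportunistic_disinflation(start_inflation=4.0, recessions=3))
# [4.0, 3.0, 2.0, 1.0] -- inflation ratchets down cycle by cycle, leaving
# each recovery less room to run before the Fed starts tightening.
```

Run a few cycles and the mid-1980s starting point of 4 percent works its way down toward the sub-2-percent world we're in now.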
In short: Recoveries have been jobless, because that's how the Fed likes them.
But it gets worse. Pushing inflation progressively lower means recoveries get progressively weaker, since the Fed has to choke off inflation, and hence the recovery, at lower and lower levels. Now, to be fair, the Fed, and Ben Bernanke in particular, have awoken to the dangers of this approach. The danger, of course, is that the Fed gets in a situation where short-term rates are stuck at zero, but the economy stays stuck in a slump. Sound familiar? Bernanke realized this was a threat in 2002, when the economy was flirting with deflation despite interest rates below 2 percent, and vowed not to let it happen here. (Remember, "disinflation" means falling inflation, and "deflation" means negative inflation.)
The Fed, of course, did let it happen here. But thanks to its bond-buying and to wages that are sticky downwards, it didn't let prices actually start to fall -- and a good thing, too, since deflation makes existing debts heavier at the worst possible moment. Bernanke got the Fed to accept that opportunistic disinflation had gone too far with QE1 and QE2, but it's not clear that he's gotten his colleagues to give up on the idea altogether. Core inflation has settled in below 2 percent, and the Fed's economic projections don't show it rising above that level anytime soon. That's pushed nominal GDP growth -- the growth of the total size of the economy, unadjusted for inflation -- down to 4 percent for each of the past three years, a low level the Fed is apparently comfortable with. (With inflation near 2 percent, 4 percent nominal growth leaves only about 2 percent real growth.) Bernanke seems to be trying to shift the consensus towards undoing some of this disinflation -- unlike previous rounds of bond-buying, QE3 was aimed at lowering unemployment, not at stopping prices from falling, while the Evans rule explicitly says the Fed will tolerate inflation up to 2.5 percent -- but there's been no shift in the data so far. The Fed needs to realize there is no try when it comes to reflation. It has to promise to do whatever it takes.
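Since the Evans rule is basically a conditional, it can be written down directly. Here's a hedged sketch -- the thresholds come from the Fed's December 2012 statement, but the function itself is my own toy condition, not the FOMC's actual decision process:

```python
# Toy version of the Evans rule's thresholds (December 2012 FOMC statement):
# keep short-term rates near zero at least as long as unemployment stays
# above 6.5 percent and projected inflation stays at or below 2.5 percent.
# Purely illustrative -- not the FOMC's actual decision procedure.

def stay_at_zero_bound(unemployment_rate, projected_inflation):
    """True while the Evans-rule thresholds say to keep rates near zero."""
    return unemployment_rate > 6.5 and projected_inflation <= 2.5

# Roughly the situation when the rule was announced: unemployment near 7.8
# percent, core inflation under 2 percent -- so rates stay pinned at zero.
print(stay_at_zero_bound(unemployment_rate=7.8, projected_inflation=1.7))  # True
```

Written this way, the rule's asymmetry is plain: it names conditions for staying at zero, but says nothing about pushing inflation above 2 percent to make up lost ground -- which is exactly the shift Bernanke hasn't yet managed.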
The new normal doesn't have to be new or normal if the Fed doesn't want it to be.
Dean of Students John Ellison gets an A for initiative, a B-minus for execution, and extra credit for stoking a useful debate.
When I was a heretical student at a Catholic high school deciding where to apply to college, I thrilled at the prospect of an educational institution where free inquiry would reign supreme and forceful debate would never be hemmed in by dogma.
A letter like the one that University of Chicago Dean of Students John Ellison sent last week to incoming first-year students––reminding them of the school’s “commitment to freedom of inquiry and expression,” and affirming that those admitted to it “are encouraged to speak, write, listen, challenge, and learn, without fear of censorship”––would have struck me as a glorious affirmation: that robust intellectual communities truly did exist; that I would finally be free to follow my brain; that college would be a crucible that tested the strength of all my beliefs.
The San Francisco quarterback has been attacked for refusing to stand for “The Star-Spangled Banner”—and for daring to criticize the system in which he thrived.
It was in early childhood that W.E.B. Du Bois––scholar, activist, and black radical––first noticed The Veil that separated him from his white classmates in the mostly white town of Great Barrington, Massachusetts. He and his classmates were exchanging “visiting cards,” invitations to visit one another’s homes, when a white girl refused his.
“Then it dawned upon me with a certain suddenness that I was different from the others; or like, mayhap, in heart and life and longing, but shut out from their world by a vast veil. I had thereafter no desire to tear down that veil, to creep through; I held all beyond it in common contempt, and lived above it in a region of blue sky and great wandering shadows,” Du Bois wrote in his acclaimed essay collection, The Souls of Black Folk. “That sky was bluest when I could beat my mates at examination-time, or beat them at a foot-race, or even beat their stringy heads.”
A Hillary Clinton presidential victory promises to usher in a new age of public misogyny.
Get ready for the era of The Bitch.
If Hillary Clinton wins the White House in November, it will be a historic moment, the smashing of the preeminent glass ceiling in American public life. A mere 240 years after this nation’s founding, a woman will occupy its top office. America’s daughters will at last have living, breathing, pantsuit-wearing proof that they too can grow up to be president.
A Clinton victory also promises to usher in four-to-eight years of the kind of down-and-dirty public misogyny you might expect from a stag party at Roger Ailes’s house.
You know it’s coming. As hyperpartisanship, grievance politics, and garden-variety rage shift from America’s first black commander-in-chief onto its first female one, so too will the focus of political bigotry. Some of it will be driven by genuine gender grievance or discomfort among some at being led by a woman. But in plenty of other cases, slamming Hillary as a bitch, a c**t (Thanks, Scott Baio!), or a menopausal nut-job (an enduringly popular theme on Twitter) will simply be an easy-peasy shortcut for dismissing her and delegitimizing her presidency.
The talk-radio host claims that he never took Donald Trump seriously on immigration. He neglected to tell his immigration-obsessed listeners.
For almost a decade, I’ve been angrily documenting the way that many right-wing talk-radio hosts betray the rank-and-file conservatives who trust them for information. My late grandmother was one of those people. She deserved better than she got. With huge platforms and massive audiences, successful hosts ought to take more care than the average person to be truthful and avoid misinforming listeners. Yet they are egregiously careless on some days and willfully misleading on others.
And that matters, as we’ll come to see.
Rush Limbaugh is easily the most consequential of these hosts. He has an audience of millions. And over the years, parts of the conservative movement that ought to know better, like the Claremont Institute, have treated him like an honorable conservative intellectual rather than an intellectually dishonest entertainer. The full cost of doing so became evident this year, when a faction of populists shaped by years of talk radio, Fox News, and Breitbart.com picked Donald Trump to lead the Republican Party, a choice that makes a Hillary Clinton victory likely and is a catastrophe for movement conservatism regardless of who wins.
In its early days, the first permanent English settlement in America had lots of men, tobacco, and land. All it needed was women.
“First comes love, then comes marriage,” the old nursery rhyme goes, but historically, first came money. Marriage was above all an economic transaction, and in no place was this more apparent than in the early 1600s in the Jamestown colony, where a severe gender imbalance threatened the fledgling colony’s future.
The men of Jamestown desperately wanted wives, but women were refusing to immigrate. They had heard disturbing reports of dissension, famine, and disease, and had decided it simply wasn’t worth it. Consequently, barely a decade after its founding in 1607, Jamestown was almost entirely male, and because these men were unable to find wives, they were deserting the colony in droves.
An immediate influx of women was needed to save the floundering colony; its leaders suggested putting out an advertisement targeting wives. The women who responded to this marital request and agreed to marry unknown men in an unfamiliar land were in a sense America’s first mail-order brides.
Practices meant to protect marginalized communities can also ostracize those who disagree with them.
Last week, the University of Chicago’s dean of students sent a welcome letter to freshmen decrying trigger warnings and safe spaces—ways for students to be warned about and opt out of exposure to potentially challenging material. While some supported the school’s actions, arguing that these practices threaten free speech and the purpose of higher education, the note also led to widespread outrage, and understandably so. Considered in isolation, trigger warnings may seem straightforwardly good. Basic human decency means professors like me should be aware of students’ traumatic experiences, and give them a heads-up about course content—photographs of dead bodies, extended accounts of abuse, disordered eating, self-harm—that might trigger an anxiety attack and foreclose intellectual engagement. Similarly, it may seem silly to object to the creation of safe spaces on campus, where members of marginalized groups can count on meeting supportive conversation partners who empathize with their life experiences, and where they feel free to be themselves without the threat of judgment or censure.
Which is a different way of asking: Can a bot commit libel?
Facebook set a new land-speed record for situational irony this week, as it fired the people who kept up its “Trending Topics” feature and replaced them with an algorithm on Friday, only to find the algorithm promoting completely fake news on Sunday.
Rarely in recent tech history has a downsizing decision come back to bite the company so publicly and so quickly.
Education experts offer their thoughts on how—if at all—schools should assign, grade, and use take-home assignments.
This is the third installment in our series about school in a perfect world. Read previous entries here and here.
We asked prominent voices in education—from policy makers and teachers to activists and parents—to look beyond laws, politics, and funding and imagine a utopian system of learning. They went back to the drawing board—and the chalkboard—to build an educational Garden of Eden. We’re publishing their answers to one question each day this week. Responses have been lightly edited for clarity and length.
Today’s assignment: homework. Will students have homework?
Rita Pin Ahrens, the director of education policy for the Southeast Asia Resource Action Center
Homework is absolutely necessary for students to demonstrate that they are able to independently process and apply their learning. But who says homework has to be the same as it has been? Homework might include pre-reading in preparation for what will be covered in class that day, independent research on a student-chosen topic that complements the class curriculum, experiential learning through a volunteer activity or field trip, or visiting a website and accomplishing a task on it. The structure will be left to the teachers to determine, as best fits the learning objective, and should be graded—whether by the teacher or student. Students will be held accountable for their homework and understand that it is an integral part of the learning process.
We asked education experts how much time they think kids should spend in class. Here's what they had to say.
Nothing is perfect, but what if it could be?
Back-to-school season is in full swing, and despite the crispness of new notebook paper and the allure of Friday night lights, it’s hard to ignore the serious inequities, debates, and issues currently hampering America’s education system. Students will walk down hallways they haven’t seen since June with questions of segregation raging around them. Teachers will greet their pupils as public-school systems around the country are flailing. And administrators will continue on as innovative ideas about how best to reach learners emerge. And so, it’s no surprise that many are entering the school year with both aspiration and trepidation.
With that in mind, we asked a variety of prominent voices in education—from policy makers and teachers to activists and parents—what their vision of a perfect school system would be. We asked them to look beyond laws, politics, and funding to imagine a utopian system of learning. We wanted to know how these men and women would critically examine the most macro and micro aspects of school and reform these elements in a perfect world. They went back to the drawing board—and the chalkboard—to build their educational Garden of Eden. We’ll be publishing their answers to one question every day this week. The responses have been lightly edited for clarity and length.
What looks at first glance like an opening up of possibilities is actually an attack on the human imagination.
You might not like what I’m about to say about the multiverse. But don’t worry; you’ve already had your revenge. If there are an infinite number of parallel universes, there will be any number of terrible dictatorships, places where life has become very difficult for people who like to string words together. Somewhere out there, there’s a society in which every desperate little essay like this one comes with a tiny, unremarkable button: push it, and the author will be immediately electrocuted to death.
Maybe your hate is more visceral—you already know I’ll die some day, but you want to see it happen; you need to see me groveling. You can if you want. Fly upwards from the plane of our solar system, keep on going, through the endless huddles of galaxies, never forgetting your purpose, until space and time run out altogether. Eventually you’ll find yourself in another universe, on a damp patch of grass and broken concrete, unwatched by whatever local gang or galactic empire rules the city rising in foggy shapes beyond the marshes. There, you’ll see a creature strangely similar to yourself, beating me to death with whatever bits of scrap are lying around.