Recoveries have been getting weaker and weaker because that's how the Fed wants them
It's time to talk about everybody's least favorite Davos buzzword -- New Normal.
With GDP unexpectedly contracting 0.1 percent in the fourth quarter of 2012 (though the private sector mostly kept up, despite the obstacles we've thrown in its way), it's enough to make you wonder if this time really is different. In other words, has the economy settled into a, well, new normal of slower growth?
If it has, it's not quite new, at least when it comes to recoveries. As you can see in this Minneapolis Fed chart of job gains following recessions, something changed after 1981. Recoveries went from being V-shaped affairs characterized by rapid bouncebacks in employment to U-shaped ones better described as nasty, brutish, and long.
(Note: I excluded the recovery from the 1980 recession, because the double-dip in 1981 cut it short).
The story of the jobless recovery is one of what the Fed isn't doing. As Paul Krugman points out, recessions have become post-(or perhaps pre-) modern. Through the 1980s, postwar recessions happened when the Fed decided to raise rates to head off inflation, and recoveries happened when the Fed decided things had calmed down enough to lower rates. But now recessions happen when bubbles burst, with financial deregulation and the global savings glut making these more of a recurring feature of our economy, and the Fed hasn't been able to cut interest rates enough to generate strong post-crash recoveries. Or maybe it hasn't wanted to.
Here's a stupid question. Why have interest rates and inflation mostly been falling for the past 30 years? In other words, if the Fed has been de facto, and later de jure, targeting inflation for most of this period (and it has), why has inflation been on a downtrend (and it has)? As you can see in the chart below, core PCE inflation, which excludes food and energy costs, fell substantially from the Reagan recovery through the bursting of the tech bubble, and has more or less held steady since, though a bit more on the less side recently.
Say hello to "opportunistic disinflation." Okay, let's translate this from Fed-ese. Remember, the Fed is supposed to target 2 percent inflation, meaning it raises rates when prices rise by more than that much and lowers them once the economy's cooled off enough, but it wasn't always so. Back in the mid-1980s, inflation was hovering around 4 percent, a major achievement following the stagflation of the previous decade, but the Fed wanted it to go lower -- here's the crucial bit -- without taking the blame for it. The Volcker Fed had come in for quite a bit of abuse when it whipped inflation at the expense of the severe 1981-82 downturn, and the Fed seems to have learned it was better not to leave its fingerprints on the business cycle.
In other words: let recessions do their dirty work for them.
It's not hard for central bankers to get what they want without doing anything, as long as what they want is less inflation (and that's almost always what central bankers want). They just have to wait for a recession to come along ... and then keep waiting until inflation falls to where they want it. Then, once prices have declined enough for their taste, they cut rates (or buy bonds) to stabilize inflation at this new, lower level. But it's one thing to stabilize inflation at a lower level; it's another to keep it there. The Fed has to raise rates faster than it otherwise would during the subsequent recovery to keep inflation from going back to where it was before the recession. It's what the Fed calls "opportunistic disinflation," and looking at inflation's steady decline over the past few decades, it's hard to believe this wasn't their strategy. Not that we have to guess. Fed president Edward Boehne actually laid out this approach in 1989, and Fed governor Laurence Meyer endorsed the idea of "reducing inflation cycle-to-cycle" in a 1996 speech -- the same year the Wall Street Journal leaked an internal Fed memo outlining the policy.
In short: Recoveries have been jobless, because that's how the Fed likes them.
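The cycle-to-cycle ratchet described above is simple enough to sketch in a few lines of code. This is a toy illustration, not anything the Fed actually computes: each recession knocks inflation down a notch (the size of the drop and the floor are made-up parameters), and the central bank merely stabilizes inflation at each new, lower level rather than pushing it back up.

```python
# Toy sketch of "opportunistic disinflation" -- an illustration of the
# cycle-to-cycle ratchet, not an actual Fed model. Parameters here
# (one point of disinflation per recession, a 2 percent floor) are
# assumptions for the sake of the example.

def simulate(start_inflation, recessions, drop_per_recession=1.0, floor=2.0):
    """Return the inflation rate after each cycle, ratcheting down by
    drop_per_recession per recession until it reaches the floor."""
    path = [start_inflation]
    inflation = start_inflation
    for _ in range(recessions):
        # The recession pulls inflation down; the central bank waits...
        inflation = max(floor, inflation - drop_per_recession)
        # ...then cuts rates only to *stabilize* at the lower level.
        path.append(inflation)
    return path

# Starting from the mid-1980s' roughly 4 percent inflation, with three
# recessions since (1990-91, 2001, 2007-09):
print(simulate(4.0, 3))  # [4.0, 3.0, 2.0, 2.0]
```

Each cycle, the central bank pockets the disinflation the recession hands it, which is exactly why inflation can trend down for decades without the Fed ever visibly tightening to cause it.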
But it gets worse. Pushing inflation progressively lower means recoveries get progressively weaker, since the Fed has to choke off inflation, and hence the recovery, at lower and lower levels. Now, to be fair, the Fed, and Ben Bernanke in particular, has awoken to the dangers of this approach. The danger, of course, is that the Fed gets stuck in a situation where short-term rates are at zero, but the economy stays stuck in a slump. Sound familiar? Bernanke realized this was a threat in 2002, when the economy was flirting with deflation despite 1.25 percent interest rates, and vowed not to let it happen here. (Remember, "disinflation" means falling inflation, and "deflation" means negative inflation.)
The Fed, of course, did let it happen here. But thanks to its bond-buying, and to wages that are sticky downwards, it didn't let prices actually start to fall, which would have made debts harder to pay back at the worst possible moment. Bernanke got the Fed to accept that opportunistic disinflation had gone too far with QE1 and QE2, but it's not clear that he's gotten them to give up on the idea altogether. Core inflation has settled in below 2 percent, and the Fed's economic projections don't show it rising above that level anytime soon. That's pushed nominal GDP growth -- the growth of the total size of the economy -- down to 4 percent for each of the past three years, a low level the Fed is apparently comfortable with. Bernanke seems to be trying to shift the consensus towards undoing some of this disinflation -- unlike previous rounds of bond-buying, QE3 was aimed at lowering unemployment, not at stopping prices from falling, and the Evans rule explicitly says the Fed will tolerate inflation up to 2.5 percent -- but there's been no shift in the data so far. The Fed needs to realize there is no try when it comes to reflation. It has to promise to do whatever it takes.
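To see why 4 percent nominal growth is so unimpressive, remember the rough identity that nominal GDP growth is approximately real growth plus inflation. Plugging in the numbers above (taking core inflation as roughly 1.9 percent, an illustrative figure for "settled in below 2 percent"):

```python
# Back-of-the-envelope check: nominal GDP growth is approximately
# real GDP growth plus inflation. The figures are the rough numbers
# from the text; 1.9 percent is an assumed stand-in for core
# inflation "settled in below 2 percent."

nominal_gdp_growth = 4.0   # percent, each of the past three years
core_inflation = 1.9       # percent, assumed

implied_real_growth = nominal_gdp_growth - core_inflation
print(round(implied_real_growth, 1))  # about 2.1 percent
```

Roughly 2 percent real growth is nowhere near fast enough to quickly close the jobs gap left by a deep recession, which is the whole point: a central bank content with 4 percent nominal growth is content with a slow recovery.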
The new normal doesn't have to be new or normal if the Fed doesn't want it to be.
Delegates in Cleveland answer a nightmare question: Would they take four more years of Barack Obama over a Hillary Clinton presidency?
CLEVELAND—It was a question no Republican here wanted to contemplate.
The query alone elicited winces, scoffs, and more than a couple threats of suicide. “I would choose to shoot myself,” one delegate from Texas replied. “You want cancer or a heart attack?” cracked another from North Carolina.
Hillary Clinton and Barack Obama have each been objects of near histrionic derision from Republicans for years (decades in Clinton’s case), but never more so than during the four days of the GOP’s national convention. Republicans onstage at Quicken Loans Arena and in the dozens of accompanying events have accused President Obama of literally destroying the country in his eight years in the White House. Speakers and delegates subjected Clinton to even harsher rhetoric, charging her with complicity in death and mayhem and then repeatedly chanting, “Lock her up!” from the convention floor.
Biology textbooks tell us that lichens are alliances between two organisms—a fungus and an alga. They are wrong.
In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.
At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.
It’s known as a modern-day hub of progressivism, but its past is one of exclusion.
PORTLAND, Ore.— Victor Pierce has worked on the assembly line of a Daimler Trucks North America plant here since 1994. But he says that in recent years he’s experienced things that seem straight out of another time. White co-workers have challenged him to fights, mounted “hangman’s nooses” around the factory, referred to him as “boy” on a daily basis, sabotaged his work station by hiding his tools, carved swastikas in the bathroom, and written the word “nigger” on walls in the factory, according to allegations filed in a complaint to the Multnomah County Circuit Court in February of 2015.
Pierce is one of six African Americans working in the Portland plant whom the lawyer Mark Morrell is representing in a series of lawsuits against Daimler Trucks North America. The cases have been combined and a trial is scheduled for January of 2017.
There’s a special “debut” category for vice-presidential selections who very suddenly find themselves in the world’s media glare.
VP picks who had mounted serious runs for president don’t quite fit this category. They already knew what it was like to handle big audiences and the press. For example: the elder George Bush became Ronald Reagan’s VP candidate in 1980, but only after running against Reagan in the primary campaign. The same was true of Joe Biden, who had run against Barack Obama (and Hillary Clinton) for the nomination in 2008 before becoming Obama’s running mate, and had run 20 years earlier too. In electoral politics, Dick Cheney had gotten only as far as Wyoming’s seat in Congress when George W. Bush picked him in 2000. But Cheney was already internationally known as Gerald Ford’s White House chief of staff and George H.W. Bush’s Secretary of Defense during the Gulf War.
Why Millennials aren’t buying cars or houses, and what that means for the economy
In 2009, Ford brought its new supermini, the Fiesta, over from Europe in a brave attempt to attract the attention of young Americans. It passed out 100 of the cars to influential bloggers for a free six-month test-drive, with just one condition: document your experience online, whether you love the Fiesta or hate it.
Young bloggers loved the car. Young drivers? Not so much. After a brief burst of excitement, in which Ford sold more than 90,000 units over 18 months, Fiesta sales plummeted. As of April 2012, they were down 30 percent from 2011.
Don’t blame Ford. The company is trying to solve a puzzle that’s bewildering every automaker in America: How do you sell cars to Millennials (a.k.a. Generation Y)? The fact is, today’s young people simply don’t drive like their predecessors did. In 2010, adults between the ages of 21 and 34 bought just 27 percent of all new vehicles sold in America, down from the peak of 38 percent in 1985. Miles driven are down, too. Even the proportion of teenagers with a license fell, by 28 percent, between 1998 and 2008.
A crop of books by disillusioned physicians reveals a corrosive doctor-patient relationship at the heart of our health-care crisis.
For someone in her 30s, I’ve spent a lot of time in doctors’ offices and hospitals, shivering on exam tables in my open-to-the-front gown, recording my medical history on multiple forms, having enough blood drawn in little glass tubes to satisfy a thirsty vampire. In my early 20s, I contracted a disease that doctors were unable to identify for years—in fact, for about a decade they thought nothing was wrong with me—but that nonetheless led to multiple complications, requiring a succession of surgeries, emergency-room visits, and ultimately (when tests finally showed something was wrong) trips to specialists for MRIs and lots more testing. During the time I was ill and undiagnosed, I was also in and out of the hospital with my mother, who was being treated for metastatic cancer and was admitted twice in her final weeks.
Narcissism, disagreeableness, grandiosity—a psychologist investigates how Trump’s extraordinary personality might shape his possible presidency.
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage.
“It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it.
Fulfilling what might be the Russian autocrat’s dearest wish, Trump has openly questioned whether the U.S. should keep its commitments to NATO.
The Republican nominee for president, Donald J. Trump, has chosen this week to unmask himself as a de facto agent of Russian President Vladimir Putin, a KGB-trained dictator who seeks to rebuild the Soviet empire by undermining the free nations of Europe, marginalizing NATO, and ending America’s reign as the world’s sole superpower.
I am not suggesting that Donald Trump is employed by Putin—though his campaign manager, Paul Manafort, was for many years on the payroll of the Putin-backed former president of Ukraine, Viktor Yanukovych. I am arguing that Trump’s understanding of America’s role in the world aligns with Russia’s geostrategic interests; that his critique of American democracy is in accord with the Kremlin’s critique of American democracy; and that he shares numerous ideological and dispositional proclivities with Putin—for one thing, an obsession with the sort of “strength” often associated with dictators. Trump is making it clear that, as president, he would allow Russia to advance its hegemonic interests across Europe and the Middle East. His election would immediately trigger a wave of global instability—much worse than anything we are seeing today—because America’s allies understand that Trump would likely dismantle the post-World War II U.S.-created international order. Many of these countries, feeling abandoned, would likely pursue nuclear weapons programs on their own, leading to a nightmare of proliferation.
Don Johnson won nearly $6 million playing blackjack in one night, single-handedly decimating the monthly revenue of Atlantic City’s Tropicana casino. Not long before that, he’d taken the Borgata for $5 million and Caesars for $4 million. Here’s how he did it.
Don Johnson finds it hard to remember the exact cards. Who could? At the height of his 12-hour blitz of the Tropicana casino in Atlantic City, New Jersey, last April, he was playing a hand of blackjack nearly every minute.
Dozens of spectators pressed against the glass of the high-roller pit. Inside, playing at a green-felt table opposite a black-vested dealer, a burly middle-aged man in a red cap and black Oregon State hoodie was wagering $100,000 a hand. Word spreads when the betting is that big. Johnson was on an amazing streak. The towers of chips stacked in front of him formed a colorful miniature skyline. His winning run had been picked up by the casino’s watchful overhead cameras and drawn the close scrutiny of the pit bosses. In just one hand, he remembers, he won $800,000. In a three-hand sequence, he took $1.2 million.
Certain gut bacteria have evolved in parallel with apes, so that their family tree perfectly mirrors our own.
Around 10 million years ago, a population of African apes diverged down two paths. One lineage gave rise to gorillas. The other eventually split again, producing one branch that led to humans and another that forked into chimpanzees and bonobos. This is the story of our recent evolutionary past. It’s also the story of some of the microbes in our guts.
We have tens of trillions of bacteria and other microbes in our guts—at least one for each of our own human cells. Some species within this microbiome are passers-by, which we pick up from our food and our environments. But others are much older companions.
Andrew Moeller from the University of California, Berkeley, has found that there are a few groups of human gut bacteria whose history pre-dates humanity. Their ancestors lived in the guts of ancestral apes, and as those ancient animals diverged into modern species, the microbes did, too. In technical terms, they co-speciated. In simpler ones, if you drew out their family tree, you’d get ours for free; you could reconstruct the evolution of apes simply by comparing the right bacteria in their bowels.