Rubio's call for a single mandate for the Federal Reserve is a dangerous, and potentially disastrous, idea. Unless, that is, that single mandate targets nominal GDP instead of inflation.
Marco Rubio wants to be president, and unfortunately for him that means he's supposed to have an opinion about everything. I say unfortunately because Rubio has had a hard enough time figuring out the age of the earth, let alone one of the great mysteries like what the Fed should be doing now. The latter came up during Rubio's acceptance speech at the Jack Kemp Foundation, and, as Dave Weigel of Slate reports, it did not go well. Hey, he's not a central banker, man.
A long time ago in an administration far, far away, the Republicans were the party of Milton Friedman. It was 2004. As Paul Krugman points out, then-chairman of the Council of Economic Advisers Greg Mankiw advocated aggressive monetary policy as a way to mitigate recessions. This was economic boilerplate, but it was only boilerplate because of Friedman. After the Great Depression, economists didn't think central banks could do much to revive the economy if interest rates fell to zero -- the so-called liquidity trap -- and monetary policy consequently took a back seat to fiscal policy when it came to demand management. Friedman reversed this. He and Anna Schwartz argued the Great Depression was only so great because the Fed's inaction made it so. In other words, central banks were only powerless if they thought they were. They could do plenty, even in a liquidity trap, if they just printed money and promised to keep printing money -- what we rather prosaically call "quantitative easing" nowadays. It was a message conservatives could, and did, love. The government didn't need to spend more to stabilize the economy during a downturn as long as the Fed did its job.
And then the Great Recession happened.
With interest rates stuck at zero and the economy stuck in a growth slump, we're very much back in Friedman's world. But now conservatives aren't so sure about that "aggressive monetary policy" thing anymore. Zero interest rates just seem wrong, and quantitative easing must be a big government bailout on the road to Zimbabwe -- at least that's what they've told themselves, despite stubbornly low inflation. Of course, some conservatives claim inflation is "really" much higher than the government says, but, as Ramesh Ponnuru of National Review points out, this conspiracy theory doesn't withstand much more than two seconds of scrutiny.
This paranoid style in monetary policy has inspired a rather odd political crusade -- the crusade against the Fed's dual mandate. Most central banks are only tasked with worrying about inflation, but the Fed is tasked with worrying about inflation and unemployment. (Or, in Fed-speak, fostering the maximum level of employment consistent with price stability). This has become a bête noire for conservatives, because they think that were it not for the Fed caring about unemployment -- the horror! -- then it wouldn't have expanded its balance sheet so much, and that this expanded balance sheet will inevitably mean higher inflation down the road. Apparently Marco Rubio is one of these conservatives who sees the stagflationary 1970s around every corner. Here's what he had to say about the Fed.
Sound monetary policy would also encourage middle class job creation. The arbitrary way in which interest rates and our currency are treated is yet another cause of unpredictability injected into our economy. The Federal Reserve Board should publish and follow a clear monetary rule -- to provide greater stability about prices and what the value of a dollar will be over time.
Translation: Repeal the dual mandate and replace it with a single mandate for inflation only. This is all kinds of uninformed. As we have pointed out before, inflation has been lower, with less than a quarter of the variance, since Congress gave the Fed its dual mandate in 1978. And with inflation mostly undershooting its 2 percent target since Lehman failed, it's not as if the Fed even needed the dual mandate to justify easing -- a sole inflation mandate would have been enough.
But Rubio is right that the Fed needs a better, clearer monetary rule nowadays. That's not to say that Fed policy has been arbitrary, but just that its rule needs some modernizing. For most of the so-called Great Moderation, the Fed followed something close to a Taylor rule, setting policy based on inflation and unemployment, and it served the Fed well. Greg Mankiw has his own simple version of a Taylor rule, which Paul Krugman tweaked slightly, that gives us a good idea of how the Fed thought then, as you can see below.
You can see why the Great Moderation gave way to the Great Recession. Our Taylor rule says the Fed should have made interest rates negative in late 2008, but the Fed can't make interest rates negative. Well, at least not nominal rates. The Fed can increase inflation, which reduces real rates, to get borrowing costs to where they "should" be -- which is what Ben Bernanke has done, in fits and starts, the past four years. You can see all these fits and starts in the chart below that compares our same Taylor rule to Fed policy since 2006. It's not easy to get real rates down to -7 percent.
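To see how such a rule works mechanically, here's a minimal sketch in Python, assuming Mankiw's published coefficients of 8.5 and 1.4 (Krugman's tweak re-estimates these slightly). The inflation and unemployment inputs are made up for illustration, not actual data.

```python
# A minimal sketch of the Mankiw-style Taylor rule discussed above, using
# Mankiw's published fit (Krugman's tweak re-estimates the coefficients,
# so treat the exact numbers as an assumption). Sample inputs are
# illustrative, not actual data.

def mankiw_rule(core_inflation: float, unemployment: float) -> float:
    """Prescribed federal funds rate, in percent: 8.5 + 1.4 * (inflation - unemployment)."""
    return 8.5 + 1.4 * (core_inflation - unemployment)

for label, inflation, unemployment in [
    ("boom year (illustrative)", 2.2, 4.6),
    ("slump year (illustrative)", 1.0, 10.0),
]:
    prescribed = mankiw_rule(inflation, unemployment)
    feasible = max(prescribed, 0.0)  # nominal rates can't fall below zero
    print(f"{label}: rule prescribes {prescribed:+.1f}%, feasible nominal rate {feasible:.1f}%")
```

The max() floor in the last step is the zero lower bound in miniature: once the prescribed rate goes negative, cutting nominal rates is off the table, and the only remaining lever is raising expected inflation to push real rates down.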
There have been far too many fits and not nearly enough starts since 2008. Yes, the Fed tried unconventional easing in late 2008, early 2009, late 2010, late 2011, and late 2012, but it should have been easing continuously. Our Taylor rule has been negative that entire time, which means the Fed should have been cutting interest rates, and cutting them a lot. Instead, we got zero rates. Because inflation hasn't been that far off target, Bernanke has had a hard time convincing the rest of the FOMC to go along with quantitative easing -- so easing has been far less quantitative than the situation calls for. In other words, policy hasn't been arbitrary as much as ad hoc, with the unhappy result being an era of tight money.
Imagine the Fed had a single mandate, but not for inflation. Imagine instead the Fed had a single mandate for the total size of the economy, which goes by the unwieldy name of nominal GDP (NGDP). During the Great Moderation, NGDP grew about 5 percent a year, but it has only grown about 2.85 percent a year since 2008. If the Fed had an NGDP target of 5 percent a year, and was supposed to make up for any over- or undershooting, it would have been easing aggressively the entire time since 2008. It's effectively a dual mandate in a single number -- one that doesn't get confused when both inflation and growth are low.
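To put rough numbers on that make-up requirement, here's a toy calculation using the growth rates quoted above; the index level of 100 is arbitrary, and the figures are illustrative rather than actual NGDP data.

```python
# A toy illustration of NGDP level targeting, using the growth rates quoted
# above: a 5 percent trend versus roughly 2.85 percent actual growth since
# 2008. The starting index of 100 is arbitrary.

years = 4  # roughly 2008 through 2012
target_level = 100 * 1.05 ** years      # where NGDP "should" be on the old trend
actual_level = 100 * 1.0285 ** years    # where NGDP actually is
shortfall = (target_level / actual_level - 1) * 100

print(f"target level:   {target_level:.1f}")
print(f"actual level:   {actual_level:.1f}")
print(f"gap to make up: {shortfall:.1f}% of extra nominal growth")
```

Because the target is a level path rather than a growth rate, the roughly 9 percent shortfall the sketch computes doesn't get forgiven: the Fed would be obligated to keep easing until the gap closed, which is exactly what would have forced aggressive action throughout the post-2008 period.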
For centuries, philosophers and theologians have almost unanimously held that civilization as we know it depends on a widespread belief in free will—and that losing this belief could be calamitous. Our codes of ethics, for example, assume that we can freely choose between right and wrong. In the Christian tradition, this is known as “moral liberty”—the capacity to discern and pursue the good, instead of merely being compelled by appetites and desires. The great Enlightenment philosopher Immanuel Kant reaffirmed this link between freedom and goodness. If we are not free to choose, he argued, then it would make no sense to say we ought to choose the path of righteousness.
Today, the assumption of free will runs through every aspect of American politics, from welfare provision to criminal law. It permeates the popular culture and underpins the American dream—the belief that anyone can make something of themselves no matter what their start in life. As Barack Obama wrote in The Audacity of Hope, American “values are rooted in a basic optimism about life and a faith in free will.”
Hillary Clinton wrote something for The Toast today. Are you sobbing yet?
Either you’ll immediately get why this is crazy, or you won’t: Hillary Clinton wrote a thing for The Toast today.
Are you weeping? Did your heart skip a beat? Maybe your reaction was, “What. Whaaaat. WHAT,” or “Aaaaaaahhhhhhh!!!” or “OH MY GOD,” or simply “this is too much goodbye I'm dead now.”
Perhaps your feelings can only be captured in GIF form, as was the case for someone commenting on Clinton’s post under the name Old_Girl.
Reader comments like the ones above are arguably the best part of Clinton’s post, because they highlight just how meaningful hearing directly from Clinton is to The Toast’s community of readers. The Toast is a small but beloved feminist website known for its quirky literary humor. It announced last month it couldn’t afford to continue operating. Friday is its last day of publication.
George Will is denouncing a GOP that has been ailing for years, but quitting won’t help—an American political party can only be reformed from within.
This past weekend, George Will revealed that he had formally disaffiliated himself from the Republican Party, switching his Maryland voter registration to independent. On Fox News Sunday, the conservative pundit explained his decision: "After Trump went after the 'Mexican' judge from northern Indiana then [House Speaker] Paul Ryan endorsed him, I decided that in fact this was not my party anymore.” For 40 years, George Will defined and personified what it meant to be a thoughtful conservative. His intellect and authority inspired a generation of readers and viewers, myself very much among them.
His departure represents a powerful image of divorce between intellectual conservatism and the new Trump-led GOP. Above all, it raises a haunting question for the many other Republicans and conservatives repelled by the looming nomination of Donald Trump as the Republican candidate for president of the United States: What will you do?
What percentage graduated from high school and enrolled within a year at a four-year institution where they live on campus?
Who are today’s college students?
The answer surprises most people who attended four-year universities, according to Jamie Merisotis, president and CEO of Lumina Foundation. Addressing audiences like the one he spoke to Friday at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, he frequently poses this question: “What percentage of students in American higher education today graduated from high school and enrolled in college within a year to attend a four-year institution and live on campus?”
Most people guess “between forty and sixty percent,” he said, whereas “the correct answer is five percent.” There is, he argued, “a real disconnect in our understanding of who today’s students are. The influencers––the policy makers, the business leaders, the media––have a very skewed view of who today’s students are.”
“This western-front business couldn’t be done again.”
On this first day of July, exactly 100 years ago, the peoples of the British Empire suffered the greatest military disaster in their history. A century later, “the Somme” remains the most harrowing place-name in the annals not only of Great Britain, but of the many former dependencies that shed their blood on that scenic river. The single regiment contributed to the First World War by the island of Newfoundland, not yet joined to Canada, suffered nearly 100 percent casualties that day: Of 801 engaged, only 68 came out alive and unwounded. Altogether, the British forces suffered more than 19,000 killed and more than 38,000 wounded: almost as many casualties in one day as Britain suffered in the entire disastrous battle for France in May and June 1940, including prisoners. The French army on the British right flank absorbed some 1,600 more casualties.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
There needs to be more nuanced language to describe the expanding demographic of unmarried Americans.
In 1957, a team of psychology professors at the University of Michigan released the results of a survey they had conducted—an attempt to reflect Americans’ attitudes about unmarried people. When it came to the group of adults who remained single by choice, 80 percent of the survey’s respondents—reflecting the language used by the survey’s authors—said they believed that the singletons remained so because they must be “immoral,” “sick,” or “neurotic.”
It’s amazing, and reassuring, how much has changed in such a relatively narrow slice of time. Today, certainly, marriage remains a default economic and social arrangement, particularly after having been won as a right for same-sex couples; today, certainly, those who do not marry still face some latent social stigmas (or, at the very least, requests to explain themselves). But the regressive language of failed morality and psychological pathology when it comes to singledom? That has, fortunately, been replaced by more permissive attitudes.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The trend helps explain Trump and Brexit. What’s next?
On Wednesday, Facebook made an announcement that you’d think would only matter to Facebook users and publishers: It will modify its News Feed algorithm to favor content posted by a user’s friends and family over content posted by media outlets. The company said the move was not about privileging certain sources over others, but about better “connecting people and ideas.”
But Richard Edelman, the head of the communications marketing firm Edelman, sees something more significant in the change: proof of a new “world of self-reference” that, once you notice it, helps explain everything from Donald Trump’s appeal to Britain’s vote to exit the European Union. Elites used to have a monopoly on both authority and influence, Edelman notes, but now their monopoly extends only to authority. Influence largely rests with the broader population. People trust their peers much more than they trust their political leaders or news organizations.