This is a story of pride, prescience, and mild panic among the economy's keepers on the eve of this generation's worst recession.
It was the end of the world as we knew it, and the Fed was feeling fine.
Okay, that's not really fair. The transcripts of the Federal Reserve's 2007 meetings, held months before the economy entered its worst recession since the Great Depression, reveal an institution far from oblivious, with a few notable exceptions. Its officials just didn't quite understand the labyrinthine web of financial interconnections until it was too late.
Back in 2007, the credit crunch that became the Great Recession started when financial institutions realized it might not have been a good idea to lend money to people who couldn't pay it back. But with the economy roaring to new heights, the Fed wasn't in crisis mode -- yet. Panic in the financial markets certainly wasn't good news, but the Fed had managed to make it through similar panics in 1987, when the stock market fell almost a quarter in one day, and in 1998, when the hedge fund Long-Term Capital Management nearly brought down the financial system, without the real economy suffering any harm. This time didn't need to be different. And, to be fair, the Fed was well aware of the risks piling up in the financial system as the clock ticked down to Lehman. It didn't even really make any big mistakes in 2007; those came later. So it's easy to mock the Fed for saying Bear Stearns and Countrywide didn't have too much trouble getting liquidity in August 2007 ... but it was true at the time! They only ran into problems, the kind that drove them into bankruptcy and/or mergers, later.
Below are the six most revealing passages from the Fed's pre-crisis meetings. Beyond the inflation hawks who managed to see price increases under every rock, the Fed's officials were mostly right in their analyses. They just weren't right enough. Or quickly enough.
Ben Bernanke, August 10, 2007:
Our goal is to provide liquidity not to support asset prices per se in any way. My understanding of the market's problem is that price discovery has been inhibited by the illiquidity of the subprime-related assets that are not trading, and nobody knows what they're worth, and so there's a general freeze-up. The market is not operating in a normal way. The idea of providing liquidity is essentially to give the market some ability to do the appropriate repricing it needs to do and to begin to operate more normally. So it's a question of market functioning, not a question of bailing anybody out.
This is what a central banker says when things start to hit the fan. The day prior, French bank BNP Paribas had sent the financial world into a frenzy when it announced it wouldn't let investors cash out of two of its subprime funds, because the bank had no idea what they were worth. Nobody would buy, and when that happens, the "price" is pretty much zero. But as Bob Peston of the BBC pointed out at the time, the scariest bit was that BNP Paribas itself didn't want to buy these bonds on the cheap. The bank wasn't sure they weren't totally worthless, too. And if banks (and shadow banks like hedge funds or structured investment vehicles) were sitting on top of piles of genuinely worthless bonds, who would want to lend to them? Answer: nobody, at least not without top-notch collateral. Hello, credit crunch.
Ben Bernanke, August 16, 2007:
So I wouldn't say that a rate cut is completely off the table, but my own feeling is that we should try to resist a rate cut until it is really very clear from economic data and other information that it is needed. I'd really prefer to avoid giving any impression of a bailout or a put, if we can. Therefore, what I'm going to suggest today is to offer a statement updating our views of the economy that will give some signal about where we think things are going but to stop short today of changing rates.
A week later, things weren't any better. Financial institutions still didn't want to lend to each other except against the best collateral, and markets still didn't exist for subprime securities. Bernanke's dilemma was whether to 1) just expand emergency lending to the banks, or 2) cut interest rates too. But with the real economy humming despite the financial turmoil, Bernanke worried the latter would look too much like a bailout (or a "put" option) for Wall Street.
Bill Dudley, September 18, 2007:
At the same time, this balance sheet pressure and worries about counterparty risk have led to a significant rise in term borrowing rates. Banks that are sellers of funds have shifted to the overnight market to preserve their liquidity, and this shift has starved the term market of funds, pushing those rates higher .... Moreover, the increased reliance by banks on overnight funding increases rollover risk and may limit the willingness of banks to expand their balance sheets to accommodate the deleveraging of the nonbank financial sector.
This is one of the driest descriptions of financial Armageddon you'll ever read. Let's translate it into English. Banks knew they were all sitting on top of toxic waste, but nobody knew who was sitting on the most of it -- so they wouldn't lend to each other, except at punitively high rates, for anything longer than a day. But relying on such overnight funding made the banks vulnerable to de facto bank runs, and that vulnerability made them less likely to keep lending just as the shadow banks were cutting back. In other words, a credit crunch. And less credit just when borrowers most needed it meant more people would eventually go bust ... hurting mortgage bonds even more, and making banks pull back further. Loops don't get much more vicious.
Janet Yellen, December 11, 2007:
The possibilities of a credit crunch developing and of the economy slipping into a recession seem all too real .... I am particularly concerned that we may now be seeing the first signs of spillovers from the housing and financial sectors to the broader economy .... Although I don't foresee conditions in the banking sector getting as bleak as during the credit crunch of the early 1990s, the parallels to those events are striking. Back then, we saw a large number of bank failures in the contraction of the savings and loan sector. In the current situation, most banks are still in pretty good shape. Instead, it is the shadow banking sector-- that is, the set of markets in which a variety of securitized assets are financed by the issuance of commercial paper--that is where the failures have occurred. This sector is all but shut for new business. But bank capital is also an issue. Until the securitization of nonconforming mortgage lending reemerges, financing will depend on the willingness and ability of banks, thrifts, and the GSEs to step in to fill the breach.
The Great Recession was just about to officially begin (although NBER wouldn't announce that until much later), and more members of the Fed were contemplating the Rube Goldberg machine of doom that subprime had set in motion. As Yellen pointed out here, the shadow banking system was already in hibernation at this point, although it wasn't clear whether regular banks would be able to step into the breach and keep things moving. (Spoiler alert: They couldn't.)
Frederic Mishkin, December 11, 2007:
In particular, there are two scenarios that they go into separately--the housing correction scenario and the credit crunch scenario. I think that there's a very strong possibility those would come together because, if housing prices go down more, that creates a much more serious problem in terms of valuation risk, and a serious problem in valuation risk will mean a further credit market disruption, which then can lead to more macroeconomic risk because it leads to this downward spiral. The real economy gets worse.
These are about the three most prescient sentences you'll find in the Fed transcripts. Mishkin was concerned that subprime wasn't, as Bernanke had previously put it, "contained," and that a further fall in housing would mean further damage to bank and shadow bank balance sheets, which would make them even less likely to lend. The ultimate danger, as Mishkin pointed out, was that this credit crunch would migrate from the financial to the real economy; that not just banks, but households too, wouldn't be able to borrow. The pyramid of debt that existed in 2007 was like a shark -- it had to keep moving to live. If households spent less because they couldn't borrow more, the economy would slow down, and more people would default on their debts. In other words, exactly what did happen would happen. Of course, it still wasn't clear how precarious the financial sector was beyond the shadow banks. Again, from Mishkin:
You don't like to use the R word, but the probability of recession is, I think, nearing 50 percent, and that really worries me very much. I also think that there's even a possibility that a recession could be reasonably severe, though not a disaster. Luckily all of this has happened with an economy that was pretty strong and with banks having good balance sheets; otherwise it could really be a potential disaster.
Richard Fisher, December 11, 2007:
I'd like to address the inflation situation more thoroughly, Mr. Chairman. The CEO of Wal-Mart USA said that, for the first time in his career at that firm, they have approved a plan in which purchase costs will increase 3 percent in '08. He hadn't seen that before in his experience and said, "I'm totally used to deflation. Deflation is finished." In terms of the suppliers to Wal-Mart, this was verified. I think on food prices we have to be extremely careful. Frito-Lay is seeking a 5½ percent price increase for next year. Wal-Mart has acquiesced.
No, I didn't make this one up. And yes, just as the biggest deflationary spiral in 80 years was about to hit the economy, Fisher was worried about inflation. And he was worried about inflation, because ... Frito-Lay was thinking about increasing prices 5.5 percent the following year. This is not a joke. Well, it is a joke, but, again, not one that I made up.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn’t point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and other minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
University leaders and observers discuss the intersection of student protests, free speech, and academic freedom.
In a Thursday debate titled “Academic Freedom, Safe Spaces, Dissent, and Dignity,” faculty or administrators from Yale, Wesleyan, Mizzou, and the University of Chicago discussed last semester’s student protests and their intersection with free speech. They shared the stage at the Aspen Ideas Festival, co-hosted by the Aspen Institute and The Atlantic, with Jonathan Greenblatt of the Anti-Defamation League; Kirsten Powers, author of The Silencing: How the Left Is Killing Free Speech; and Greg Lukianoff, who leads the Foundation for Individual Rights in Education.
My colleague Jeffrey Goldberg was the moderator.
The most interesting exchange involved Stephen Carter, a law professor at Yale, and Michael S. Roth, the president of Wesleyan University.
People in Great Britain felt their leaders weren’t treating them fairly. Politicians in the U.S. should take note.
Britain’s Brexit vote has shocked the political elites of both the U.S. and Europe. The vote wasn’t just about the EU; in fact, polls before the referendum consistently showed that Europe wasn’t at the top of voters’ lists of concerns. But on both sides of the Atlantic Ocean, large numbers of people feel that the fundamental contracts of capitalism and democracy have been broken. In a capitalist economy, citizens tolerate rich people if they share in the wealth, and in a democracy, they give their consent to be governed if those governing do so in their interest. The Brexit vote was an opportunity for people to tell elites that both promises have been broken. The most effective line of the Leave campaign was “take back control.” It is also Donald Trump’s line.
In an era fixated on science, technology, and data, the humanities are in decline. They’re more vital than ever.
Earlier this month, the Washington Post journalist Jeff Guo wrote a detailed account of how he’d managed to maximize the efficiency of his cultural consumption. “I have a habit that horrifies most people,” he wrote. “I watch television and films in fast forward … the time savings are enormous. Four episodes of Unbreakable Kimmy Schmidt fit into an hour. An entire season of Game of Thrones goes down on the bus ride from D.C. to New York.”
Guo’s method, which he admits has ruined his ability to watch TV and movies in real time, encapsulates how technology has allowed many people to accelerate the pace of their daily routines. But is faster always better when it comes to art? In a conversation at the Aspen Ideas Festival, co-sponsored by the Aspen Institute and The Atlantic, Drew Gilpin Faust, the president of Harvard University, and the cultural critic Leon Wieseltier agreed that true study and appreciation of the humanities are rooted in slowness—in the kind of deliberate education that can be accrued over a lifetime. This can seem almost antithetical to the pace of modern life, and subjects like art, philosophy, and literature face steep declines in enrollment at academic institutions in the U.S. But both argued that studying the humanities is vital for the ways in which it teaches us how to be human.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once the showrunners David Benioff and D.B. Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But it was this season in which that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
American-Indian cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.