Government borrowing doesn't always crowd out private borrowing
Four years after the end of the presidency that must not be named, Republicans are no closer to figuring out what went wrong or what comes next.
Sure, the GOP has decided Bush spent too much, but there's been little other reckoning (outside of wonks like David Frum, Reihan Salam, and Ross Douthat). After all, it's hard to see what fiscal profligacy had to do with stagnant median incomes, rising healthcare and college costs, and a fragile financial system -- and harder still to see what remedies the Republicans have to offer. When it comes to policy, the GOP is stuck in 1980: lower taxes, less regulation, and more drilling for oil are always and everywhere the answer, no matter the question. (No, really).
Even Obama's reelection hasn't been enough to wake the Republicans from their Reagan von Winkle slumber. The GOP has chosen re-branding over rethinking. In other words, they think they have a messenger problem, not a message problem -- and that's where Marco Rubio comes in. As Jonathan Chait of New York explains, Rubio offers the party an appealing young salesman for its same old policies, immigration aside. It was no accident his response to the State of the Union was so devoid of anything resembling new thinking. That was the point. Indeed, Rubio just rounded up the usual talking points, saying, among other things, that the government was a major cause of the housing bubble (it wasn't), and that Washington needs a balanced budget amendment (it very much does not). These are certainly cringe-worthy mistakes, but Rubio's biggest one is even more fundamental. He doesn't think the government can create jobs, except when it does.
Here's how Rubio put it in his response to the State of the Union: "Every dollar our government borrows is money that isn't being invested to create jobs. And the uncertainty created by the debt is one reason why many businesses aren't hiring."
Rubio has fallen victim to one of the classic economic blunders. It's called Say's Law, and it's not, in fact, a law. It's more like a guideline. The idea is that supply creates its own demand, which is true enough during booms, but not during busts. The underlying logic -- producing goods gives you the income to buy other goods -- makes sense, but only as long as you leave money out of the picture. Add money, and everything falls apart. We'll return to why money is the root of all depressions in a second, but first, let's think about what it would mean if Say's Law were true. It would mean a world where demand can never lag supply; where unemployment is either voluntary or transient (when people switch jobs); and where government spending can never help the economy. After all, public borrowing has to come from somewhere, and a dollar the government borrows is a dollar the private sector doesn't. In other words, government borrowing "crowds out" private borrowing, pushing up interest rates as it competes for funds.
But this is terribly wrong. In the real world, people are out of work because they can't find work, not because they don't want it; the Great Recession has not been a Great Vacation. Supply doesn't always create its own demand, because demand for money might increase. In other words, people might hoard money. Now, "hoard" probably brings to mind people frantically stuffing money into mattresses, but it's a bit different from that today. It means households don't want to spend, businesses don't want to invest, and banks don't want to lend. There's an excess of desired savings over desired investment -- or, as it's more commonly called, a recession. The Fed can make hoarding less appealing by cutting interest rates to inject money into the economy, but it can't do so now, at least not easily. Interest rates are already at zero, and unconventional money-printing hasn't been quite as effective. In short, the Fed hasn't been able to get us to stop hoarding.
That leaves two options: depression or deficits. In other words, either nobody borrows the unborrowed money, or the government does. If nobody does, the economy will contract by as much as isn't borrowed; if the government does, the economy will (at least) stabilize. As Matthew Yglesias of Slate points out, it's easy enough to tell the government is borrowing money that otherwise wouldn't be borrowed, since interest rates have fallen despite big deficits. There has been no crowding out.
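To see the arithmetic behind "depression or deficits," here's a minimal textbook sketch using standard national-income accounting (the notation is mine, not Rubio's or Yglesias's). In a closed economy, output is spent on consumption, investment, and government purchases, and private saving is whatever income isn't consumed or taxed away:

$$Y = C + I + G, \qquad S \equiv Y - T - C \quad\Longrightarrow\quad S - I = G - T.$$

If people want to save more than businesses want to invest at the current level of income, and the government refuses to run a deficit (so $G - T = 0$), the only way the identity can hold is for income $Y$ to fall until desired saving shrinks back down to desired investment -- that's the depression branch. If the government borrows the gap instead, income doesn't have to fall -- that's the deficits branch.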
But it turns out we are actually all Keynesians now, even Marco Rubio. At least when it comes to military spending. (Though he's hardly alone in this cognitive dissonance.) Here's what he told HispanicBusiness.com last September about the upcoming sequester cuts set to hit the Pentagon:
"Thousands of jobs in defense-related enterprises have been lost already, with many more projected to go if the sequester crisis is not averted. These defense cuts hurt innovation, medical research and thousands of small businesses who subcontract for defense-related work."
Rubio is actually a pretty ambitious Keynesian! Not only does he think the government can create jobs, but he also thinks those jobs create other jobs -- that is, there's a multiplier on government spending.
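For the record, here's the bare-bones version of that multiplier arithmetic -- a standard textbook sketch with purely illustrative numbers, not anything Rubio has endorsed. If people spend a fraction $c$ of every extra dollar of income, then a dollar of government spending becomes someone's income, of which $c$ gets re-spent, then $c^2$, and so on:

$$1 + c + c^2 + c^3 + \cdots = \frac{1}{1-c}.$$

With, say, $c = 0.5$, each dollar of Pentagon spending would ultimately support two dollars of total spending -- which is exactly the logic Rubio is invoking when he warns that defense cuts will cost jobs at subcontractors too.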
On both sides of the Atlantic—in the United Kingdom and the United States—political parties are realigning and voters’ allegiances are shifting.
When United Kingdom voters last week narrowly approved a referendum to leave the European Union, they underscored again how an era of unrelenting economic and demographic change is shifting the axis of politics across much of the industrialized world from class to culture.
Contrary to much initial speculation, the victory for the U.K. leave campaign didn't point toward victory in the U.S. presidential election for Donald Trump, who is voicing very similar arguments against globalization and immigration. The British results, in fact, underscored the obstacles facing his agenda of defensive nationalism in the vastly more diverse U.S. electorate.
But the Brexit referendum did crystallize deepening cultural fault lines in U.K. politics that are also likely to shape the contest between Trump and Hillary Clinton. In that way, the results prefigure both a continuing long-term realignment in the electoral base of each American party—and a possible near-term reshuffle of the tipping-point states in presidential politics.
They say religious discrimination against Christians is as big a problem as discrimination against other groups.
Many, many Christians believe they are subject to religious discrimination in the United States. A new report from the Public Religion Research Institute and Brookings offers evidence: Almost half of Americans say discrimination against Christians is as big a problem as discrimination against other groups, including blacks and minorities. Three-quarters of Republicans and Trump supporters said this, and so did nearly eight out of 10 white evangelical Protestants. Of the latter group, six in 10 believe that although America once was a Christian nation, it is no longer—a huge jump from 2012.
Polling data can be split up in a million different ways. It’s possible to sort by ethnicity, age, political party, and more. The benefit of sorting by religion, though, is that it highlights people’s beliefs: the way their ideological and spiritual convictions shape their self-understanding. This survey suggests that race is not enough to explain the sense of loss some white Americans seem to feel about their country, although it’s part of the story; the same is true of age, education level, and political affiliation. People’s beliefs seem to have a distinctive bearing on how they view changes in American culture, politics, and law—and whether they feel threatened. No group is more likely to express this fear than conservative Christians.
Rabia Chaudry, a friend of Syed’s family who spearheaded the campaign to get him a new trial, tweeted: “I am shaking with joy, shaking! Thank you Judge Welch. Thank you.” Judge Martin Welch of the Circuit Court for Baltimore City signed Thursday’s order.
Syed was convicted in 2000 of strangling Hae Min Lee, 18, and burying her body in Baltimore’s Leakin Park. The two had dated when they attended the city’s Woodlawn High School. Syed was sentenced to life plus 30 years in prison for the killing.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
Native American cooking has all the makings of a culinary trend, but it’s been limited by many diners’ unfamiliarity with its dishes and its loaded history.
DENVER—In 2010, the restaurateur Matt Chandra told The Atlantic that the Native American restaurant he and business partner Ben Jacobs had just opened would have 13 locations “in the near future.” But six years later, just one other outpost of their fast-casual restaurant, Tocabe, is up and running.
In the last decade, at least a handful of articles predicted that Native American food would soon see wider reach and recognition. “From the acclaimed Kai restaurant in Phoenix to Fernando and Marlene Divina's James Beard Award-winning cookbook, Foods of the Americas, to the White Earth Land Recovery Project, which sells traditional foods like wild rice and hominy, this long-overlooked cuisine is slowly gaining traction in the broader culinary landscape,” wrote Katie Robbins in her Atlantic piece. “[T]he indigenous food movement is rapidly gaining momentum in the restaurant world,” proclaimed Mic in the fall of 2014. This optimism sounds reasonable enough: The shift in the restaurant world toward more locally sourced ingredients and foraging dovetails nicely with the hallmarks of Native cuisine, which is often focused on using local crops or herds. Yet while there are a few Native American restaurants in the U.S. (there’s no exact count), the predicted rise hasn’t really happened, at least not to the point where most Americans are familiar with Native American foods or restaurants.
As it’s moved beyond the George R.R. Martin novels, the series has evolved both for better and for worse.
Well, that was more like it. Sunday night’s Game of Thrones finale, “The Winds of Winter,” was the best episode of the season—the best, perhaps, in a few seasons. It was packed full of major developments—bye, bye, Baelor; hello, Dany’s fleet—but still found the time for some quieter moments, such as Tyrion’s touching acceptance of the role of Hand of the Queen. I was out of town last week and thus unable to take my usual seat at our Game of Thrones roundtable. But I did have some closing thoughts about what the episode—and season six in general—told us about how the show has evolved.
Last season, viewers got a limited taste—principally in the storylines in the North—of how the show would be different once showrunners David Benioff and D.B. Weiss ran out of material from George R.R. Martin’s novels and had to set out on their own. But this season, that exception truly became the norm. Though Martin long ago supplied Benioff and Weiss with a general narrative blueprint of the major arcs of the story, they can no longer rely on the books scene by scene. Game of Thrones is truly their show now. And thanks to changes in pacing, character development, and plot streamlining, it’s also a markedly different show from the one we watched in seasons one through four—for the worse and, to some degree, for the better.
The sordid story of the Trump Institute is a sequel to the damaging tale of Trump University.
A real-estate mogul wouldn’t own just one building. So why would an education mogul own just one questionably ethical institution?
That’s one way to explain why Donald Trump wasn’t content with launching only Trump University, the now-shuttered real-estate education organization that’s the subject of extensive litigation, and extensive allegations of fraud. Around the same time, he also launched the Trump Institute. The two organizations had a lot in common. Both offered matriculants the chance to get rich quick on real estate, using tricks from Trump. In both cases, Trump falsely claimed to have handpicked the “faculty” himself. And in both cases, former students alleged that they’d been ripped off with useless materials and worthless lessons.
As incomes fall across the nation, even better-off areas like Sheboygan County, Wisconsin, are faltering.
SHEBOYGAN, Wisc.—There is still a sizable middle class in this county of 115,000 on the shores of Lake Michigan, a pleasant hour’s drive from Milwaukee. You can see it in the cars that pour in and out of the parking lots of local factories, in the restaurants packed with older couples on weeknights, and in the bars that seem to be on every single corner. You can see it in the local parks, including one called Field of Dreams, where kids play soccer and baseball and their parents sit and watch.
About 63 percent of adults in Sheboygan make between $41,641 and $124,924, meaning the area has one of the highest shares of middle-class households in the country, according to a report from the Pew Research Center. Nationally, only 51 percent of adults are middle-class.