The welfare state is dead. Long live the welfare state!
It's getting hard to keep track of which countries aren't Greece anymore.
First, Ireland wasn't Greece. Then it kind of was. Then it was Portugal's turn to not be Greece. Then it was Portugal's turn to be Greece. Next, Spain wasn't Greece. But now it might be. At the very least it's Ireland. Although Uganda looks like it's in the clear. It's not Spain, which could be Greece. That's better than Cyprus can say. They're pretty much Greece. And, of course, Greece is almost certainly Greece. That goes without saying.
But there's one country that definitely isn't Greece. That's the United States.
Let's step back. What makes a country "Greece"? It's become shorthand for wild government overspending -- especially on entitlements. Paul Ryan says we don't have long to avoid the same fate. Neither does the terrifyingly successful investor Michael Burry. They think that absent drastic reform -- read: cuts -- to the social safety net, we'll end up in penury like the Greeks.
It's a scary story. But it's just a scare story. Yes, we have a long-term healthcare spending problem. But that doesn't make us Greece. Heck, Greece isn't even Greece. At least not the "Greece" that's become such a political football. The evidence -- or lack thereof -- is in the chart below. It compares each country's average social spending since 1999, via the OECD, against its current borrowing costs. See the pattern?
There is none. Europe's biggest social spenders don't have any problems. And Europe's biggest problem countries don't spend that much on social programs. The death knell of the welfare state this is not.
Here's the dirty little secret of the euro debt crisis. There is no euro debt crisis. There is a euro crisis. The debt is a symptom of the crisis of the common currency.* Europe's bailed out countries all saw piles of capital pour in during the boom, only to pour out during the bust. They were left with inflated, uncompetitive wages -- and that's sent them into deep slumps. That's been despite lower social spending than their northern euro neighbors. Germany, Austria, Finland, the Netherlands, Belgium and -- at least for now -- France have all been able to sustain more generous safety nets thanks to the magic of competitive wages.
It's the same story for Europe's non-euro nations. Sweden, Denmark, Norway, Switzerland and the Czech Republic are all lucky enough not to be passengers on the Titanic -- that is, members of the common currency. (Denmark has pegged its krone to the euro, but it still has its own central bank.) Most of them spend more on social programs than the so-called PIIGS, but all of them can borrow for almost nothing. Investors are actually paying the Swiss and Danish governments for the privilege of lending to them short-term. Think about that. What's going on? Well, if things ever get rough, they can just print money or devalue their currencies. In other words, they can never run out of money.
But Greece can. Being in the euro means never being able to print your own money. And that turns each euro country into a bank. Imagine a bank run. Fear becomes self-fulfilling. Depositors try to pull their money out before everyone else because they're worried the bank will collapse -- which, of course, causes the bank's collapse. Very Oedipal -- minus the parent love. It's the same with Greece. Investors worry that Greece will run out of euros. That's a very rational fear right now. So they try to sell off their bonds, which pushes up Greece's borrowing costs -- and makes it more likely that Greece will run out of euros. This kind of panic is why Italy -- which has a primary surplus! -- is flirting with trouble too. Only the ECB can stop this.
Notice that I didn't talk about debt at all in the previous paragraph. The PIIGS have too-high wages, too little growth, and face crippling crises of confidence. Austerity won't cure any of that. It'll make things worse. It has. It kneecaps growth. And investors are more worried about growth right now than they are about deficits.
Also notice that none of this applies to the United States. We never have to worry about self-fulfilling prophecies of bankruptcy because we can never run out of dollars. As the Boomers retire, we'll spend more on entitlements. That's not the end of the world. Unless you think Sweden is the end of the world. Yes, we need to rein in healthcare inflation, and, yes, we need to raise some more revenue. The former might already be happening. The latter is a political choice. Neither makes us Greece.
So don't believe the rumors of the welfare state's death. They're greatly exaggerated.
* Caveat: Greece is sui generis. They really did just spend too much money. They're not pictured here, because their 10-year bond yield is -- wait for it -- off the chart. Fitting their 27 percent borrowing costs onto this graph makes it too hard to see anything else. But Greece's average social spending is only 21.4 percent of GDP.
A new report details a black market in nuclear materials.
On Wednesday, the Associated Press published a horrifying report about criminal networks in the former Soviet Union trying to sell “radioactive material to Middle Eastern extremists.” At the center of these cases (the AP learned of four in the past five years) was a “thriving black market in nuclear materials” in a “tiny and impoverished Eastern European country”: Moldova.
It’s a new iteration of an old problem with a familiar geography. The breakup of the Soviet Union left a superpower’s worth of nuclear weapons scattered across several countries without a superpower’s capacity to keep track of them. When Harvard’s Graham Allison flagged this problem in 1996, he wrote that the collapse of Russia’s “command-and-control society” left nothing secure. To wit:
The leaderless GOP begins its search for a speaker anew, starting with a campaign to draft Paul Ryan.
First Eric Cantor. Then John Boehner. Now Kevin McCarthy.
Conservatives in and out of Congress have, within a span of 15 months, tossed aside three of the four men most instrumental in the 2010 victory that gave Republicans their majority in the House. When the leaderless and divided party gathers on Friday to begin anew its search for a speaker, the biggest question will be whether that fourth man, Paul Ryan, will take a job that, for the moment, only he can win.
Ryan, the 2012 vice presidential nominee and chairman of the powerful Ways and Means Committee, has for years resisted entreaties to run for speaker, citing the demands of the job on his young family and his desire to run the tax-writing panel, which he has called his “dream job.” And he did so again on Thursday, within minutes of McCarthy’s abrupt decision to abandon a race he had been favored to win. “I will not be a candidate for speaker,” Ryan tweeted. Yet the pressure kept coming. Lawmakers brought up his name throughout the day, and there were reports that Boehner himself had personally implored him to change his mind.
In a new book, the former Middle East peace negotiator Dennis Ross explores just how close Israel came to attacking Iran, and why Susan Rice accused Benjamin Netanyahu of throwing “everything but the n-word” at Barack Obama.
When Israeli Prime Minister Benjamin Netanyahu arrives in Washington early next month for a meeting with President Obama, he should at least know that he is more popular in the White House than Vladimir Putin. But not by much.
This meeting will not reset the relationship between the two men in any significant way, and not only because Netanyahu has decided to troll Obama by accepting the Irving Kristol Award from the American Enterprise Institute on this same short trip. The meeting between the two leaders will most likely be businesslike and correct, but the gap between the two is essentially unbridgeable. From Netanyahu’s perspective, the hopelessly naive Obama broke a solemn promise to never allow Iran to cross the nuclear threshold. From Obama’s perspective, Netanyahu violated crucial norms of U.S.-Israel relations by publicly and bitterly criticizing an Iran deal that—from Obama’s perspective—protects Israel, and then by taking the nearly unprecedented step of organizing a partisan (and, by the way, losing and self-destructive) lobbying campaign against the deal on Capitol Hill.
Some of Charles Schulz’s fans blame the cartoon dog for ruining Peanuts. Here’s why they’re wrong.
It really was a dark and stormy night. On February 12, 2000, Charles Schulz—who had single-handedly drawn some 18,000 Peanuts comic strips, who refused to use assistants to ink or letter his comics, who vowed that after he quit, no new Peanuts strips would be made—died, taking to the grave, it seemed, any further adventures of the gang.
Hours later, his last Sunday strip came out with a farewell: “Charlie Brown, Snoopy, Linus, Lucy … How can I ever forget them.” By then, Peanuts was carried by more than 2,600 newspapers in 75 countries and read by some 300 million people. It had been going for five decades. Robert Thompson, a scholar of popular culture, called it “arguably the longest story told by a single artist in human history.”
American politicians are now eager to disown a failed criminal-justice system that’s left the U.S. with the largest incarcerated population in the world. But they've failed to reckon with history. Fifty years after Daniel Patrick Moynihan’s report “The Negro Family” tragically helped create this system, it's time to reclaim his original intent.
By his own lights, Daniel Patrick Moynihan, ambassador, senator, sociologist, and itinerant American intellectual, was the product of a broken home and a pathological family. He was born in 1927 in Tulsa, Oklahoma, but raised mostly in New York City. When Moynihan was 10 years old, his father, John, left the family, plunging it into poverty. Moynihan’s mother, Margaret, remarried, had another child, divorced, moved to Indiana to stay with relatives, then returned to New York, where she worked as a nurse. Moynihan’s childhood—a tangle of poverty, remarriage, relocation, and single motherhood—contrasted starkly with the idyllic American family life he would later extol.
Forget the Common Core: Finland’s youngsters are in charge of determining what happens in the classroom.
“The changes to kindergarten make me sick,” a veteran teacher in Arkansas recently admitted to me. “Think about what you did in first grade—that’s what my 5-year-old babies are expected to do.”
The difference between first grade and kindergarten may not seem like much, but what I remember about my first-grade experience in the mid-90s doesn’t match the kindergarten she described in her email: three and a half hours of daily literacy instruction, an hour and a half of daily math instruction, 20 minutes of daily “physical activity time” (officially banned from being called “recess”) and two 56-question standardized tests in literacy and math—on the fourth week of school.
That Arkansas teacher—who teaches 20 students without an aide—has fought to integrate 30 minutes of “station time” into the literacy block, which includes “blocks, science, magnetic letters, play dough with letter stamps to practice words, books, and storytelling.” But the most controversial area of her classroom isn’t the blocks or the stamps: Rather, it’s the “house station with dolls and toy food”—items her district tried to remove last year. The implication was clear: There’s no time for play in kindergarten anymore.
The United States, which accepts more refugees per year than any other country, has all but closed its door to the millions of Syrians who are part of the world’s largest refugee crisis since World War II. A recent decision to admit more Syrian refugees this year opened that door a crack, but the Obama administration insists that national security concerns constrain it from going further. Yet officials at more than a dozen agencies could not point to any specific or credible case, data, or intelligence assessment indicating that Syrian refugees pose a threat.
The officials generally funneled questions to the Department of Homeland Security.
“Certain groups have openly stated they will attempt to exploit the current situation with respect to large numbers of migrants seeking asylum in Europe and refugee resettlement,” said a DHS official, who spoke on condition of anonymity because department leaders would not authorize anyone to speak on the record about the threat assessment of Syrian refugees. “We must balance a very real threat with the potential propaganda value here.”
Somewhere in Europe, a man who goes by the name “Mikro” spends his days and nights targeting Islamic State supporters on Twitter.
In August 2014, a Twitter account affiliated with Anonymous, the hacker-crusader collective, declared “full-scale cyber war” against ISIS: “Welcome to Operation Ice #ISIS, where #Anonymous will do it’s [sic] part in combating #ISIS’s influence in social media and shut them down.”
In July, I traveled to a gloomy European capital city to meet one of the “cyber warriors” behind this operation. Online, he goes by the pseudonym Mikro. He is vigilant, bordering on paranoid, about hiding his actual identity, on account of all the death threats he has received. But a few months after I initiated a relationship with him on Twitter, Mikro allowed me to visit him in the apartment he shares with his girlfriend and two Rottweilers. He works alone from his chaotic living room, using an old, battered computer—not the state-of-the-art setup I had envisaged. On an average day, he told me, he spends up to 16 hours fixed to his sofa. He starts around noon, just after he wakes up, and works late into the night and early morning.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Why Americans tend more and more to want inexperienced presidential candidates
The presidency, it’s often said, is a job for which everyone arrives unprepared. But just how unprepared is unprepared enough?
Political handicappers weigh presidential candidates’ partisanship, ideology, money, endorsements, consultants, and, of course, experience. Yet they too rarely consider an element of growing importance to voters: freshness. Increasingly, American voters view being qualified for the presidency as a disqualification.
In 2003, I announced in National Journal the 14-Year Rule. The rule was actually discovered by a presidential speechwriter named John McConnell, but because his job required him to keep his name out of print, I graciously stepped up to take credit. It is well known that to be elected president, you pretty much have to have been a governor or a U.S. senator. What McConnell had figured out was this: No one gets elected president who needs longer than 14 years to get from his or her first gubernatorial or Senate victory to either the presidency or the vice presidency.* Surprised, I scoured the history books and found that the rule works astonishingly well going back to the early 20th century, when the modern era of presidential electioneering began.