Don't look at these long-term and youth unemployment numbers if you like good news
Will we, at last, have recovery in our time? That's the question in Europe, where the once omnipresent threat of euro implosion has given way to a sense that things are finally getting better. This era of good feelings even has a portmanteau: crexit. Yes, crexit. As in, "crisis exit". It's true enough, but not nearly enough. In other words, the euro crisis is over, but the economic crisis remains.
This emerging Euro-triumphalism is mostly a story about European Central Bank (ECB) chief Mario Draghi and the Baltics. Draghi single-handedly ended the panic in sovereign debt markets when he promised to do "whatever it takes" to save the common currency, while the Baltics have shown there can be growth after austerity. But there's a "but". Draghi hasn't been able to get the ECB to do anything as the euro zone, including Germany, has fallen back into recession, and the Baltics, despite their recent growth spurts, are still far below their pre-crisis peaks due to the depths of their tight-money- and tight-budget-induced slumps. Europe's real economy is still, mostly, in really bad shape -- as you can see from these terrifying numbers that Jonathan Portes highlights from the latest European Commission report. These are the new scariest charts in Europe. At least for now.
Europe's definition of "long-term unemployment" is twice as depressing as our own. In the U.S., you have to be out of work and looking for a job for six months to count as long-term unemployed. In Europe, it's 12 months. But it's not just how they define long-term unemployment that's depressing -- it's their levels of it, too. As you can see in the chart below that compares long-term unemployment rates across Europe in 2007 and 2011 (the latest year for which we have figures), it's really a tale of two continents. The PIGS (Portugal, Ireland, Greece, and Spain) and Baltics are getting crucified on a cross of euros, euro-pegged currencies, and austerity. Everybody else is doing fine to meh.
Think about it this way. Roughly 1 out of every 11 people in the workforce has been unemployed for a year or more in the worst-hit countries. That's even worse than the overall U.S. unemployment rate. Big economies like France and Italy are trending in the wrong direction, with growth reversing.
That brings us to our second scariest chart. The young have taken a big part, though certainly not all, of the jobless hit -- even in the continent's better-performing economies. The reality isn't quite as bad as the stories you may have heard about half of all Greek youths being out of work, since those numbers don't account for kids in school or training programs, but it's still bad enough. As you can see in the chart below, the percentage of youths (defined as aged 15 to 24) who are neither working nor in school nor receiving some kind of training is still high enough to cause serious worry. Outside of Germany, it's edged up everywhere, if not outright spiked. It's not a good time to be young in Latvia. Or Ireland. Or Greece. Or Spain. Or Italy.
The toxic combination of careers deferred and careers cut short for long periods can create what economists call "hysteresis" -- permanent damage to the economy. There's a stigma to being out of work for too long, or starting a career too late, that is difficult to overcome, short of an economic boom. Patting yourself on the back when so much remains to be done defines success down so far that failure becomes impossible -- and so, in the future, does genuine success.
Europe's policymakers need some Rooseveltian, if not Churchillian, resolve in the face of mass unemployment. In other words, aggressive ECB bond-buying and fiscal expansion in the countries that can afford fiscal expansion (which will spillover into the countries that cannot). Anything less is just appeasement of inflation hawks and deficit scolds intent on winning a phony war against phantom opponents.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
The results of the referendum are, in theory, not legally binding.
Lest we think the Euroskepticism displayed this week by British voters is new, let me present a scene from the BBC’s Yes, Minister, a comedy about the U.K. civil service’s relationship with a minister. The series ran from 1980 to ’84 (and, yes, it was funny), at a time when the European Union was a mere glint in its founders’ eyes.
The Europe being referred to in the scene is the European Economic Community (EEC), a bloc established in the mid-1950s to bring about greater economic integration among its members, and one that eventually grew to 12 countries.
In many ways, the seeds of the U.K.’s Thursday referendum on its membership in the European Union were sown soon after the country joined the now-defunct EEC in 1973. Then, as now, the ruling Conservative Party and opposition Labour, along with the rest of the country, were deeply divided over the issue. In the run-up to the general election the following year, Labour promised in its manifesto to put the U.K.’s EEC membership to a public referendum. Labour eventually came to power and Parliament passed the Referendum Act in 1975, fulfilling that campaign promise. The vote was held on June 5, 1975, and the result was what the political establishment had hoped for: an overwhelming 67 percent of voters supported the country’s EEC membership.
Unexpected discoveries in the quest to cure an extraordinary skeletal condition show how medically relevant rare diseases can be.
When Jeannie Peeper was born in 1958, there was only one thing amiss: her big toes were short and crooked. Doctors fitted her with toe braces and sent her home. Two months later, a bulbous swelling appeared on the back of Peeper’s head. Her parents didn’t know why: she hadn’t hit her head on the side of her crib; she didn’t have an infected scratch. After a few days, the swelling vanished as quickly as it had arrived.
When Peeper’s mother noticed that the baby couldn’t open her mouth as wide as her sisters and brothers, she took her to the first of various doctors, seeking an explanation for her seemingly random assortment of symptoms. Peeper was 4 when the Mayo Clinic confirmed a diagnosis: she had a disorder known as fibrodysplasia ossificans progressiva (FOP).
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The Republican candidate is deeply unpopular, and his Democratic rival is promoting her own version of American nationalism.
American commentators have spent the weekend pondering the similarities between Britain’s vote to leave the European Union and America’s impending vote on whether to take leave of its senses by electing Donald Trump. The similarities have been well-rehearsed: The supporters of Brexit—like the supporters of Trump—are older, non-college educated, non-urban, distrustful of elites, xenophobic, and nostalgic. Moreover, many British commentators discounted polls showing that Brexit might win just as many American commentators, myself very much included, discounted polls showing that Trump might win the Republican nomination. Brexit may even result in the installation this fall of a new British prime minister, Boris Johnson, who is entertaining, self-promoting, vaguely racist, doughy, and orange. It’s all too familiar.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening on the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
The city is riding high after the NBA final. But with the GOP convention looming, residents are bracing for disappointment.
Cleveland’s in a weird mood.
My son and I attended the Indians game on Father’s Day, the afternoon before game seven of the NBA Finals—which, in retrospect, seems like it should be billed simply as The Afternoon Before—when the Cavaliers would take on the Golden State Warriors and bring the city its first major-league sports championship in 52 years.
I am 52 years old. I’ve lived in Northeast Ohio all my life. I know what Cleveland feels like. And it’s not this.
In the ballpark that day, 25,269 of us sat watching a pitcher’s duel, and the place was palpably subdued. The announcer and digitized big-screen signage made no acknowledgement of the city’s excitement over the Cavaliers. There were no chants of “Let’s Go Cavs,” no special seventh-inning-stretch cheer for the Indians’ basketball brothers, who play next door in the Quicken Loans Arena, which in a few weeks will host the Republican National Convention.
There's a reason why the first thing we often ask someone when we meet them, right after we learn their name, is "where's home for you?"
My house is a shrine to my homes. There's a triptych of sunsets next to my bedroom door, dusk forever falling over the small Michigan town where I grew up, the beach next to my college dorm and Place de la Concorde in Paris, where I spent a cliché but nonetheless happy semester. And that's only the beginning. Typographic posters of Michigan and Chicago hang above my bed, a photo of taxis zooming around Manhattan sits atop my dresser and a postcard of my hometown's famous water tower is taped to my door. My roommate and I have an entire wall in our kitchen plastered with maps of places we've been, and twin Ferris wheels, one at Navy Pier, one at Place de la Concorde, are stacked on top of one another in my living room.
A hotly contested, supposedly ancient manuscript suggests Christ was married. But believing its origin story—a real-life Da Vinci Code, involving a Harvard professor, a onetime Florida pornographer, and an escape from East Germany—requires a big leap of faith.
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.
Why professors, librarians, and politicians are shunning liberal arts in the name of STEM
I have been going to academic conferences since I was about 12 years old. Not that I am any sort of prodigy—both of my parents are, or were at one point, academics, so I was casually brought along for the ride. I spent the bulk of my time at these conferences in hotel lobbies, transfixed by my Game Boy, waiting for my mother to be done and for it to be dinnertime. As with many things that I was made to do as a child, however, I eventually came to see academic conferences as an integral part of my adult life.
So it was that, last year, I found myself hanging out at the hotel bar at the annual conference of the Modern Language Association, despite the fact that I am not directly involved with academia in any meaningful way. As I sipped my old fashioned, I listened to a conversation between several aging literature professors about the “digital humanities,” which, as far as I could tell, was a needlessly jargonized term for computers in libraries and writing on the Internet. The digital humanities were very “in” at MLA that year. They had the potential, said a white-haired man in a tweed jacket, to modernize and reinvigorate humanistic scholarship, something that all involved seemed to agree was necessary. The bespectacled scholars nodded their heads with solemn understanding, speaking in hushed tones about how they wouldn’t be making any new tenure-track hires that year.