As the unemployment rate improves faster than job creation alone would suggest, there's been much consternation about the quality of the job-market improvement. Yes, the unemployment rate has fallen to 7.8%, but how do we square that with the labor force participation rate, which has fallen from 65.8% to 63.6% since the end of 2008?
Aggregates can be misleading. For instance, the surge in the participation rate from the 1960s to the 1980s was the result of women joining the workforce. The male rate, on the other hand, has been declining since the 1950s.
Male participation has fallen under President Obama. It fell under President George W. Bush. And under President Clinton. It's fallen in every presidential administration going back at least to Eisenhower's, with the exception of Carter's, during which it was flat.
Why are fewer men choosing to work? For that, we turn to the Census Bureau's 2012 Statistical Abstract.
The participation rate is lower for single men than for married men, and marriage rates in the US have been falling for decades, so we'd expect a modest decline from that alone. Broken down by age, though, participation has been fairly steady since the start of the Great Recession for both single and married men over the age of 25.
The recent decline has been concentrated among young, single men. For single men aged 16-19, participation fell by almost 9 points from 2006 to 2010. For single men aged 20-24, it fell by almost 5 points. This could reflect a variety of factors, from men deciding it's not worth bothering to apply for a job at the local grocery store, to men focusing more on their education as unskilled work becomes harder to find, to those living at home who decide there's no need for spending money when so much entertainment is free online.
Additionally, the acceleration in the labor-force decline began when the oldest baby boomers started turning 60. Yes, because of deflated housing prices and retirement accounts, boomers will work longer than they had planned. But 60-year-olds still work less than 30-year-olds, and that demographic shift is being reflected in the data.
What's more, this decline in the workforce is part of a century-long trend toward working less in the United States. Child labor laws passed during the Great Depression restricted the employment of minors. During the Truman administration, the US government instituted the 40-hour work week for federal employees. The passage of Social Security and Medicare reduced incentives for seniors to work as well.
This is a good thing. Among his many writings, John Maynard Keynes predicted an eventual 15-hour work week that would satisfy the material needs of citizens. We're progressing more slowly than he thought, but we're getting there.
But can fewer working young adults possibly be a good thing? It's intuitive that fewer workers means less work and a smaller, weaker economy. But since the decline is mostly among very young men (and, to a lesser extent, young women), we need to understand why they're dropping out. Student loan debt outstanding has grown from $360 billion to $900 billion over the past seven years. The size of this debt is daunting, but it suggests that some of the labor force decline is due to young people investing more in their education, an eventual long-term positive.
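To put that growth in perspective, here's a quick back-of-the-envelope calculation (my own arithmetic, assuming only the two figures cited above):

```python
# Implied compound annual growth of outstanding student loan debt,
# assuming the $360B -> $900B change over seven years cited above.
start, end, years = 360e9, 900e9, 7

annual_growth = (end / start) ** (1 / years) - 1
print(f"{annual_growth:.1%} per year")  # ~14.0% compound annual growth
```

Debt compounding at roughly 14 percent a year is consistent with a large cohort of young people staying in, or returning to, school rather than working.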
And those not dropping out for education-related reasons? If it's just a bunch of 17-year-olds who are content spending their time on Facebook instead of earning a few bucks bagging groceries, that's one thing. But if it's people who feel shut out of the workforce, that's something policymakers should address.
These are issues we're going to have to grapple with, because with robotic labor on the horizon, our desire and ability to compete with emerging-market and silicon-based labor, especially among less-educated Americans, are likely to continue to fall.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
The tension between religious liberty and same-sex marriage may eventually come to a head in the courts, but probably not through the Kentucky clerk’s case.
As Rowan County clerk Kim Davis crawls further and further out on a limb, Supreme Court experts agree that she has little chance of prevailing. On August 12, District Judge David Bunning ordered Davis, in her capacity as county clerk, to issue marriage licenses to all couples who meet the statutory criteria for marriage in Kentucky—a definition that, since the Court’s landmark decision in Obergefell v. Hodges, includes same-sex couples.
Davis has refused, citing “the authority of God.” The U.S. Supreme Court, without comment, denied her emergency request for a stay. This throws the case back to the Sixth Circuit, which will hear the appeal of Judge Bunning’s order. Assuming she loses in the Sixth Circuit—a fairly good assumption—she would then have the option of petitioning the Supreme Court to hear her religious-freedom claim. The Court will eventually hear a case about religious freedom and same-sex marriage, but I don’t think it will be this one.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
The past is beautiful until you’re reminded it’s ugly.
Taylor Swift’s music video for “Wildest Dreams” isn’t about the world as it exists; it’s about the world as seen through the filter of nostalgia and the magic of entertainment. In the song, Swift sings that she wants to live on in an ex’s memory as an idealized image of glamour—“standing in a nice dress, staring at the sunset.” In the video, her character, an actress, falls in love with her already-coupled costar, for whom she’ll live on as an idealized image of glamour—standing in a nice dress, staring at a giant fan that’s making the fabric swirl in the wind.
The setting for the most part is Africa, but, again, the video isn’t about Africa as it exists; it’s about Africa as seen through the filter of nostalgia and the magic of entertainment—a very particular nostalgia and kind of entertainment. Though set in 1950, the video is in the literary and cinematic tradition of white savannah romances, the most important recent incarnation of which might be the 1985 Meryl Streep film Out of Africa, whose story begins in 1913. Its familiarity is part of its appeal, and also part of why it’s now drawing flak for being insensitive. As James Kassaga Arinaitwe and Viviane Rutabingwa write at NPR:
But letting customers buy their own would force cable companies to improve their equipment.
One of the least glamorous realities of the American cable industry is a relic invented in 1948: the cable box. The box has become a fixture in the American household, not least because it is surprisingly profitable. Earlier this year, a U.S. Senate study found that American households pay $231 a year, on average, to rent their cable boxes. The report also estimated that 99 percent of cable customers rent their equipment, which across the country adds up to a $19.5 billion-a-year rental business.
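A quick sanity check on how those figures fit together (my own back-of-the-envelope arithmetic, not a number from the report):

```python
# Back-of-the-envelope check using the Senate study's figures quoted above:
# if the average renting household pays $231 a year and the rental market
# totals $19.5 billion a year, the implied number of renting households is:
avg_annual_rent = 231    # dollars per household per year
market_size = 19.5e9     # dollars per year, nationwide

households = market_size / avg_annual_rent
print(f"~{households / 1e6:.0f} million renting households")  # ~84 million
```

Given the report’s estimate that 99 percent of customers rent, that implies a total cable customer base of roughly 85 million households.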
The senators who commissioned the study, Ed Markey of Massachusetts and Richard Blumenthal of Connecticut, noted that this dependable rental revenue gave the industry little incentive to innovate and build better cable boxes. Which raises a really good question: Why aren’t more people purchasing their cable boxes?
Massive hurricanes striking Miami or Houston. Earthquakes leveling Los Angeles or Seattle. Deadly epidemics. Meet the “maximums of maximums” that keep emergency planners up at night.
For years before Hurricane Katrina, storm experts warned that a big hurricane would inundate the Big Easy. Reporters noted that the levees were unstable and could fail. Yet hardly anyone paid attention to these Cassandras until after the levees had broken, the Gulf Coast had been blown to pieces, and New Orleans sat beneath feet of water.
The wall-to-wall coverage afforded to the anniversary of Hurricane Katrina reveals the sway that a deadly act of God or man can hold over people, even 10 years later. But it also raises uncomfortable questions about how well prepared the nation is for the next catastrophe, whether that be a hurricane or something else. There are plenty of people warning about the dangers that lie ahead, but that doesn’t mean the average citizen or most levels of government are anywhere near ready for them.
Though it wasn’t pretty, Minaj was really teaching a lesson in civility.
Nicki Minaj didn’t, in the end, say much to Miley Cyrus at all. If you only read the comments that lit up the Internet at last night’s MTV Video Music Awards, you might think she was kidding, or got cut off, when she “called out” the former Disney star who was hosting: “And now, back to this bitch that had a lot to say about me the other day in the press. Miley, what’s good?”
To summarize: When Minaj’s “Anaconda” won the award for Best Hip-Hop Video, she took to the stage in a slow shuffle, shook her booty with presenter Rebel Wilson, and then gave an acceptance speech in which she switched vocal personas as amusingly as she does in her best raps—street-preacher-like when telling women “don’t you be out here depending on these little snotty-nosed boys”; sweetness and light when thanking her fans and pastor. Then a wave of nausea seemed to come over her, and she turned her gaze toward Cyrus. To me, the look on her face, not the words that she said, was the news of the night:
A Brooklyn-based group is arguing that the displacement of longtime residents meets a definition conceived by the United Nations in the aftermath of World War II.
No one will be surprised to learn that the campaign to build a national movement against gentrification is being waged out of an office in Brooklyn, New York.
For years, the borough’s name has been virtually synonymous with gentrification, and on no street in Brooklyn are its effects more evident than on Atlantic Avenue, where, earlier this summer, a local bodega, protesting its impending departure in the face of a rent hike, put up sarcastic window signs advertising “Bushwick baked vegan cat food” and “artisanal roach bombs.”
Just down the block from that bodega are the headquarters of Right to the City, a national alliance of community-based organizations that since 2007 has made it its mission to fight “gentrification and the displacement of low-income people of color.” For too long, organizers with the alliance say, people who otherwise profess concern for the poor have tended to view gentrification as a mere annoyance, as though its harmful effects extended no further than the hassles of putting up with pretentious baristas and overpriced lattes. Changing this perception is the first order of business for Right to the City: Gentrification, as these organizers see it, is a human-rights violation.
Climate change means the end of our world, but the beginning of another—one with a new set of species and ecosystems.
A few years ago in a lab in Panama, Klaus Winter tried to conjure the future. A plant physiologist at the Smithsonian Tropical Research Institute, he planted seedlings of 10 tropical tree species in small, geodesic greenhouses. Some he allowed to grow in the kind of environment they were used to out in the forest, around 79 degrees Fahrenheit. Others, he subjected to uncomfortably high temperatures. Still others, unbearably high temperatures—up to a daily average temperature of 95 degrees and a peak of 102 degrees. That’s about as hot as Earth has ever been.
It’s also the kind of environment tropical trees have a good chance of living in by the end of this century, thanks to climate change. Winter wanted to see how they would do.
Every time you shrug, you don’t need to Google, then copy, then paste.
All hail ¯\_(ツ)_/¯.
In its 11 strokes, the symbol encapsulates what it’s like to be an individual on the Internet. With raised arms and a half-turned smile, it exudes the melancholia, the malaise, the acceptance, and (finally) the embrace of knowing that something’s wrong on the Internet and you can’t do anything about it.
As Kyle Chayka writes in a new history of the symbol at The Awl, the meaning of “the shruggie” is always twofold, if not three- or fourfold. ¯\_(ツ)_/¯ represents nihilism, “bemused resignation,” and “a Zen-like tool to accept the chaos of universe.” It is Sisyphus in Unicode. I use it at least 10 times a day.
For a long time, however, I used it with some difficulty. Unlike better-known emoticons like :) or ;), ¯\_(ツ)_/¯ borrows characters from the Japanese syllabary called katakana. That makes it a kaomoji, a Japanese emoticon; it also makes it, on Western alphabetical keyboards at least, very hard to type. But then I found a solution, and it saves me from having to Google “smiley sideways shrug” every time I want to quickly rail at the world’s inherent lack of meaning.
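If you’d rather script your way out of the same predicament, here’s one approach (a minimal sketch of my own, not necessarily the solution described above): assemble the symbol from its Unicode code points, then bind the resulting string to whatever text-expansion or snippet tool you already use.

```python
# A minimal sketch (my own illustration, not the author's actual fix):
# build the shruggie from Unicode escapes so it can be printed, copied,
# or wired into a text-expansion shortcut -- no googling required.
SHRUGGIE = "\u00af\\_(\u30c4)_/\u00af"  # U+30C4 is the katakana character ツ

print(SHRUGGIE)  # -> ¯\_(ツ)_/¯
```

Any snippet manager can then map a short trigger (say, “;shrug”) to that string.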