As the unemployment rate recovers faster than job
creation does, there's been much consternation about the quality of the
job market improvement. Yes, the unemployment rate has fallen to 7.8%,
but the labor force participation rate tells a different story: since the end of 2008, it has fallen from 65.8% to 63.6%.
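To see how both numbers can move in these directions at once, here is a minimal sketch with made-up round figures (not BLS data): the unemployment rate counts only people in the labor force, so when discouraged workers stop looking, measured unemployment can fall even though no jobs are created.

```python
# Stylized illustration with hypothetical numbers: the unemployment rate
# can fall with zero job creation if people simply leave the labor force.

def unemployment_rate(employed, unemployed):
    """Unemployed share of the labor force (employed + unemployed)."""
    return unemployed / (employed + unemployed)

def participation_rate(employed, unemployed, population):
    """Labor force share of the working-age population."""
    return (employed + unemployed) / population

population = 1000   # working-age population, held fixed
employed = 850      # no jobs are created or lost in this example
unemployed = 100    # 950 people in the labor force, 50 out of it

print(f"before: unemployment {unemployment_rate(employed, unemployed):.1%}, "
      f"participation {participation_rate(employed, unemployed, population):.1%}")
# before: unemployment 10.5%, participation 95.0%

# 20 unemployed people give up the job search and leave the labor force.
unemployed -= 20

print(f"after:  unemployment {unemployment_rate(employed, unemployed):.1%}, "
      f"participation {participation_rate(employed, unemployed, population):.1%}")
# after:  unemployment 8.6%, participation 93.0%
```

In this toy example the unemployment rate improves by almost two points while participation slips, which is exactly the pattern worth scrutinizing in the real data.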
Aggregates can be misleading. For instance, the surge in the participation rate from the 1960s to the 1980s was a result of women joining the workforce. The male rate, on the other hand, has been declining since the 1950s.
Male participation has fallen under President
Obama. It fell under President George W. Bush. And President Clinton.
It's fallen in every presidential administration going back to at least
Eisenhower's, with the exception of Carter's, during which it was flat.
Why are fewer men choosing to work? For that, we turn to the Census Bureau's 2012 Statistical Abstract.
The participation rate is lower for single men than for married men, and marriage
rates in the US have been falling for decades, so we'd expect a modest
decline from that. Broken down by age, however, participation has been fairly steady since the start of the Great Recession for both single and married men over the age of 25.
The recent decline we've seen has been primarily
among young, single men. For single men ages 16-19, participation fell by almost 9 points from 2006 to 2010; for single men ages 20-24, it fell by almost 5 points. This could reflect a variety of factors: men deciding it's not worth bothering to apply for a job at the local grocery store, men focusing more on their education as unskilled work grows harder to find, or men living at home who decide there's no need for spending money when so much entertainment is free online.
Additionally, the acceleration in the labor force
decline began when the oldest baby boomers began turning 60. Yes,
because of deflated housing prices and retirement accounts, boomers will
work longer than they expected to. But 60-year-olds still work less than 30-year-olds, and that demographic shift is showing up in the data.
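To make this composition effect concrete, here is a small sketch with hypothetical participation rates and population weights (not Census figures): even if every age group keeps working at exactly the same rate, the aggregate rate drops as the population shifts toward older groups.

```python
# Hypothetical participation rates within each age group (assumed, not Census data).
rates = {"25-54": 0.82, "55+": 0.40}

def aggregate_rate(rates, weights):
    """Population-weighted average of the group participation rates."""
    return sum(rates[group] * weights[group] for group in rates)

# Illustrative population shares before and after the boomers age into the 55+ bucket.
weights_before = {"25-54": 0.70, "55+": 0.30}
weights_after  = {"25-54": 0.60, "55+": 0.40}

print(f"younger mix: {aggregate_rate(rates, weights_before):.1%}")  # 69.4%
print(f"older mix:   {aggregate_rate(rates, weights_after):.1%}")   # 65.2%
# The aggregate rate drops about 4 points even though no individual
# group changed its behavior; only the age mix of the population shifted.
```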
What's more, this decline in the workforce is part of a century-long trend toward working less in the United States. Laws restricting child labor were passed during the Great Depression. During the Truman administration, the US government instituted the 40-hour work week for federal employees. The passage of Social Security and Medicare reduced incentives for seniors to work as well.
This is a good thing. In his writings, John Maynard Keynes envisioned an eventual 15-hour work week that would satisfy the material needs of citizens. We're progressing more slowly than he predicted, but we're getting there.
But can fewer working young adults possibly be a good thing? It's intuitive that fewer workers means less work and a smaller and weaker economy. But since the decline is mostly among very young men (and, to a lesser extent, young women), we need to understand why they're dropping out. Student loan debt outstanding has grown from $360 billion to $900 billion over the past seven years. The size of this debt is daunting, but it shows that some of the labor force decline is due to young people investing more in their education, an eventual long-term positive.
And those not dropping out for education-related reasons? If it's just a bunch of 17-year-olds who are content spending their time on Facebook instead of earning a few bucks bagging groceries, that's one thing. But if it's people who feel shut out of the workforce, that's something policymakers should address.
These are issues we're going to have to grapple with, because with robotic labor on the horizon, our desire and ability to compete with emerging-market and silicon-based labor, especially among less-educated Americans, are likely to continue to fall.