With the release of this month's unemployment report, we now have a chance to take full stock of what happened to the U.S. job market in 2011. In this politically tumultuous year, employment crawled upwards. Slowly.
Overall, total non-farm employment inched higher by roughly 1.6 million jobs, or about 1.3%. The private sector grew modestly. The public sector shrank, also modestly. The United States economy is still about 6 million jobs short of where it was before the beginning of the Great Recession. And while the unemployment rate is down to 8.5% from 9.4%, it's partly because so many workers have given up on job hunting.
That's the CliffsNotes version. Beneath the headline figures, America's employment picture is vastly more complicated. If you were white, college educated, or in the oil business, odds are you had a fabulous year. For African Americans, high school dropouts, teachers, and 19-year-olds looking for work, the numbers told a very different story.
A Great Year for Oil Workers, a Terrible Year for Teachers
In 2011, the fastest-growing industry sector by employment was mining. By a long shot. Jobs in logging and mining as a combined sector increased by 12.4%, but virtually all of that growth was due to mining -- coal, oil, and gas extraction, as well as the support activities around them. Thank the oil boom in North Dakota and the hunt for natural gas in Appalachia's shale deposits. As you can see in the graph below, no other major industry saw even close to that rate of growth.
But while mining's growth was dramatic, it only contributed a small piece to 2011's overall employment bump -- about 91,000 new hires. The largest boost came from business services, a hodge-podge category encompassing a wide variety of white collar employees. Its growth was powered by increased demand for highly educated workers such as engineers and architects, computer systems designers, and accountants. Administrative support positions, including roughly 90,000 new workers in temp agencies, also made up much of the growth. Other important pieces of the job growth puzzle included health care and social assistance, which added 350,000 workers, and the hospitality businesses, which added 230,000 workers in food services alone.
It's part of an evolving split in the American workforce: On the one hand, we're growing high-skilled jobs in offices and hospitals. On the other, we're producing low-wage service jobs. There's not a ton being created in the middle. Even this year's manufacturing growth only reclaimed a small portion of the millions of factory jobs lost to the economic downturn.
The gloomiest portion of this chart, however, is reserved for government hiring. In a year without the cushion of stimulus spending, local, state, and -- yes -- federal government employment rolls all shrank, shedding a total of 280,000 workers. Public schools alone let go 113,000 workers. To put that in perspective, the loss of government jobs eclipsed the entire growth of manufacturing and construction combined.
A Bad Time to Be Young, or Without a College Degree
More than industry, however, the most important factor affecting workers' ability to get hired in 2011 was education. At Slate, Matt Yglesias posted a chart showing that more than half of the jobs added went to Americans with a college education. High school graduates, meanwhile, lost half a million jobs.
Beyond education, the next great divide in 2011 remained age. For women and men over the age of 20, the unemployment rate was about 8%. For those aged 16 to 19, the unemployment rate was 23.1%, down from 25.2% a year ago. For black youth, the unemployment rate was a staggering 44%, up from 42% a year before.
Overall African American unemployment refused to budge during the year, staying at exactly 15.8%. The slimming of government payrolls may be the major culprit since, as the New York Times has reported, one in five black workers is a public sector employee. Whites and Hispanics, meanwhile, saw unemployment drop from 8.5% to 7.5% and from 12.9% to 11.0%, respectively.
No matter where you fit into the employment picture, 2011's numbers weren't spectacular for your group. But your age, education, and industry made a huge difference.
The president touched off a brief firestorm with the unfounded charge, but real answers about why four service members were killed in Niger remain elusive.
On October 4, four American Special Forces soldiers were killed during an operation in Niger. Since then, the White House has been notably tight-lipped about the incident. During a press conference Monday afternoon, 12 days after the deaths, President Trump finally made his first public comments, but the remarks—in which he admitted he had not yet spoken with the families and briefly attacked Barack Obama—did little to clarify what happened or why the soldiers were in Niger.
Trump spoke at the White House after a meeting with Senate Majority Leader Mitch McConnell, and was asked why he hadn’t spoken about the deaths of Sergeant La David Johnson and Staff Sergeants Bryan Black, Dustin Wright, and Jeremiah Johnson.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
About 10 years ago, after I’d graduated college but when I was still waitressing full-time, I attended an empowerment seminar. It was the kind of nebulous weekend-long event sold as helping people discover their dreams and unburden themselves from past trauma through honesty exercises and the encouragement to “be present.” But there was one moment I’ve never forgotten. The group leader, a man in his 40s, asked anyone in the room of 200 or so people who’d been sexually or physically abused to raise their hands. Six or seven hands tentatively went up. The leader instructed us to close our eyes, and asked the question again. Then he told us to open our eyes. Almost every hand in the room was raised.
And there could be far-reaching consequences for the national economy too.
Four floors above a dull cinder-block lobby in a nondescript building at the Ohio State University, the doors of a slow-moving elevator open on an unexpectedly futuristic 10,000-square-foot laboratory bristling with technology. It’s a reveal reminiscent of a James Bond movie. In fact, the researchers who run this year-old, $750,000 lab at OSU’s Spine Research Institute often resort to Hollywood comparisons.
Thin beams of blue light shoot from 36 of the same kind of infrared motion cameras used to create lifelike characters for films like Avatar. In this case, the researchers are studying the movements of a volunteer fitted with sensors that track his skeleton and muscles as he bends and lifts. Among other things, they say, their work could lead to the kind of robotic exoskeletons imagined in the movie Aliens.
The two big headlines, pulling the plug on subsidies in Obamacare insurance markets and tossing the Iran nuclear deal to Congress, are both highly fraught. Yet with these two decisions, President Trump has brought himself closer to following through on major campaign promises than nearly anything else he has done as president.
There are two notable things about the moves. First, they are both incomplete. President Trump has neither repealed and replaced Obamacare nor shredded the Iran deal. Second, they have real potential downsides. Ending the Obamacare subsidies could result in millions of people losing their health insurance, a disaster both moral and, potentially, political. And decertifying the Iran deal could free Iran to build nuclear weapons, and undermine American credibility in the Middle East and beyond for decades to come. Taken together, though, they show how Trump’s accomplishments at this stage in his presidency are almost entirely destructive, rather than constructive. Trump made his reputation as a builder, but he’s made demolition his mode in the White House.
In the media world, as in so many other realms, there is a sharp discontinuity in the timeline: before the 2016 election, and after.
Things we thought we understood—narratives, data, software, news events—have had to be reinterpreted in light of Donald Trump’s surprising win as well as the continuing questions about the role that misinformation and disinformation played in his election.
Tech journalists covering Facebook had a duty to cover what was happening before, during, and after the election. Reporters tried to see past their often liberal political orientations and the unprecedented actions of Donald Trump to understand how 2016 was playing out on the internet. Every component of the chaotic digital campaign has been reported on, here at The Atlantic and elsewhere: Facebook’s enormous distribution power for political information, rapacious partisanship reinforced by distinct media information spheres, the increasing scourge of “viral” hoaxes and other kinds of misinformation that could propagate through those networks, and the Russian information ops agency.
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
Four decades ago Jimmy Carter was sworn in as the 39th president of the United States, the original Star Wars movie was released in theaters, the Trans-Alaska pipeline pumped its first barrels of oil, New York City suffered a massive blackout, Radio Shack introduced its new TRS-80 Micro Computer, Grace Jones was a disco queen, the Brazilian soccer star Pele played his “sayonara” game in Japan, and much more. Take a step into a visual time capsule now, for a brief look at the year 1977.
How a seemingly innocuous phrase became a metonym for the skewed sexual politics of show business
The chorus of condemnation against Harvey Weinstein, as dozens of women have come forward to accuse the producer of serial sexual assault and harassment, has often turned on a quaint-sounding show-business cliché: the “casting couch.” Glenn Close, for instance, expressed her anger that “the ‘casting couch’ phenomenon, so to speak, is still a reality in our business and in the world.”
The casting couch—where, as the story goes, aspiring actresses had to trade sexual favors in order to win roles—has been a familiar image in Hollywood since the advent of the studio system in the 1920s and ’30s. Over time, the phrase has become emblematic of the way that sexual aggression has been normalized in an industry dominated by powerful men.