With the release of this month's unemployment report, we now have a chance to take full stock of what happened to the U.S. job market in 2011. In this politically tumultuous year, employment crawled upwards. Slowly.
Overall, total non-farm employment inched higher by roughly 1.6 million jobs, or about 1.3%. The private sector grew modestly. The public sector shrank, also modestly. The United States economy is still about 6 million jobs short of where it was before the beginning of the Great Recession. And while the unemployment rate is down to 8.5% from 9.4%, it's partly because so many workers have given up on job hunting.
That's the Cliff's Notes version. Beneath the headline figures, America's employment picture is vastly more complicated. If you were white, or college educated, or in the oil business, odds are you had a fabulous year. For African Americans, high school dropouts, teachers, and 19-year-olds looking for work, the numbers told a very different story.
A Great Year for Oil Workers, a Terrible Year for Teachers
In 2011, the fastest growing industry sector by employment was mining. By a longshot. Jobs in logging and mining as a combined sector increased by 12.4%, but virtually all of that growth was due to mining -- coal, oil, and gas extraction, as well as the support activities around them. Thank the oil boom in North Dakota and the hunt for natural gas in Appalachia's shale deposits. As you can see in the graph below, no other major industry saw even close to that rate of growth.
But while mining's growth was dramatic, it only contributed a small piece to 2011's overall employment bump -- about 91,000 new hires. The largest boost came from business services, a hodge-podge category encompassing a wide variety of white collar employees. Its growth was powered by increased demand for highly educated workers such as engineers and architects, computer systems designers, and accountants. Administrative support positions, including roughly 90,000 new workers in temp agencies, also made up much of the growth. Other important pieces of the job growth puzzle included health care and social assistance, which added 350,000 workers, and the hospitality businesses, which added 230,000 workers in food services alone.
It's part of an evolving split in the American workforce: On the one hand, we're growing high-skilled jobs in offices and hospitals. On the other, we're producing low-wage service jobs. There's not a ton being created in the middle. Even this year's manufacturing growth only reclaimed a small portion of the millions of factory jobs lost to the economic downturn.
The gloomiest portion of this chart, however, is reserved for government hiring. In a year without the cushion of stimulus spending, local, state, and -- yes -- federal government employment rolls all shrank, shedding a total of 280,000 workers. Public schools alone let go 113,000 workers. To put that in perspective, the loss of government jobs eclipsed the entire growth of manufacturing and construction combined.
A Bad Time to Be Young, or Without a College Degree
More than industry, however, the most important factor affecting workers' ability to get hired in 2011 was education. At Slate, Matt Yglesias posted this chart showing that more than half of the jobs added went to Americans with a college education. High school graduates, meanwhile, lost half a million jobs.
Beyond education, the next great divide in 2011 remained age. For women and men over the age of 20, the unemployment rate was about 8%. For those aged 16 to 19, the unemployment rate was 23.1%, down from 25.2% a year earlier. For black youth, the unemployment rate was a staggering 44%, up from 42% a year before.
Overall African American unemployment refused to budge during the year, staying at exactly 15.8%. The slimming of government payrolls may be the major culprit since, as the New York Times has reported, one in five black workers is a public sector employee. Whites and Hispanics, meanwhile, saw unemployment drop from 8.5% to 7.5% and from 12.9% to 11.0%, respectively.
No matter where you fit into the picture, the 2011 jobs numbers weren't spectacular for your group. But your age, education, and industry made a huge difference.
Dean of Students John Ellison gets an A for initiative, a B-minus for execution, and extra credit for stoking a useful debate.
When I was a heretical student at a Catholic high school deciding where to apply to college, I thrilled at the prospect of an educational institution where free inquiry would reign supreme and forceful debate would never be hemmed in by dogma.
A letter like the one that University of Chicago Dean of Students John Ellison sent last week to incoming first-year students––reminding them of the school’s “commitment to freedom of inquiry and expression," and affirming that those admitted to it “are encouraged to speak, write, listen, challenge, and learn, without fear of censorship”––would have struck me as a glorious affirmation: that robust intellectual communities truly did exist; that I would finally be free to follow my brain; that college would be a crucible that tested the strength of all my beliefs.
The San Francisco quarterback has been attacked for refusing to stand for the Star-Spangled Banner—and for daring to criticize the system in which he thrived.
It was in early childhood that W.E.B. Du Bois––scholar, activist, and black radical––first noticed The Veil that separated him from his white classmates in the mostly white town of Great Barrington, Massachusetts. He and his classmates were exchanging “visiting cards,” invitations to visit one another’s homes, when a white girl refused his.
“Then it dawned upon me with a certain suddenness that I was different from the others; or like, mayhap, in heart and life and longing, but shut out from their world by a vast veil. I had thereafter no desire to tear down that veil, to creep through; I held all beyond it in common contempt, and lived above it in a region of blue sky and great wandering shadows,” Du Bois wrote in his acclaimed essay collection, The Souls of Black Folk. “That sky was bluest when I could beat my mates at examination-time, or beat them at a foot-race, or even beat their stringy heads.”
In its early days, the first English settlement in America had lots of men, tobacco, and land. All it needed was women.
“First comes love, then comes marriage,” the old nursery rhyme goes, but historically, first came money. Marriage was above all an economic transaction, and in no place was this more apparent than in the early 1600s in the Jamestown colony, where a severe gender imbalance threatened the fledgling colony’s future.
The men of Jamestown desperately wanted wives, but women were refusing to immigrate. They had heard disturbing reports of dissension, famine, and disease, and had decided it simply wasn’t worth it. Consequently, barely a decade after its founding in 1607, Jamestown was almost entirely male, and because these men were unable to find wives, they were deserting the colony in droves.
An immediate influx of women was needed to save the floundering colony; its leaders suggested putting out an advertisement targeting wives. The women who responded to this marital request and agreed to marry unknown men in an unfamiliar land were in a sense America’s first mail-order brides.
A Hillary Clinton presidential victory promises to usher in a new age of public misogyny.
Get ready for the era of The Bitch.
If Hillary Clinton wins the White House in November, it will be a historic moment, the smashing of the preeminent glass ceiling in American public life. A mere 240 years after this nation’s founding, a woman will occupy its top office. America’s daughters will at last have living, breathing, pantsuit-wearing proof that they too can grow up to be president.
A Clinton victory also promises to usher in four to eight years of the kind of down-and-dirty public misogyny you might expect from a stag party at Roger Ailes’s house.
You know it’s coming. As hyperpartisanship, grievance politics, and garden-variety rage shift from America’s first black commander-in-chief onto its first female one, so too will the focus of political bigotry. Some of it will be driven by genuine gender grievance or discomfort among some at being led by a woman. But in plenty of other cases, slamming Hillary as a bitch, a c**t (Thanks, Scott Baio!), or a menopausal nut-job (an enduringly popular theme on Twitter) will simply be an easy-peasy shortcut for dismissing her and delegitimizing her presidency.
The talk-radio host claims that he never took Donald Trump seriously on immigration. He neglected to tell his immigration-obsessed listeners.
For almost a decade, I’ve been angrily documenting the way that many right-wing talk-radio hosts betray the rank-and-file conservatives who trust them for information. My late grandmother was one of those people. She deserved better than she got. With huge platforms and massive audiences, successful hosts ought to take more care than the average person to be truthful and avoid misinforming listeners. Yet they are egregiously careless on some days and willfully misleading on others.
And that matters, as we’ll come to see.
Rush Limbaugh is easily the most consequential of these hosts. He has an audience of millions. And over the years, parts of the conservative movement that ought to know better, like the Claremont Institute, have treated him like an honorable conservative intellectual rather than an intellectually dishonest entertainer. The full cost of doing so became evident this year, when a faction of populists shaped by years of talk radio, Fox News, and Breitbart.com picked Donald Trump to lead the Republican Party, a choice that makes a Hillary Clinton victory likely and is a catastrophe for movement conservatism regardless of who wins.
Practices meant to protect marginalized communities can also ostracize those who disagree with them.
Last week, the University of Chicago’s dean of students sent a welcome letter to freshmen decrying trigger warnings and safe spaces—ways for students to be warned about and opt out of exposure to potentially challenging material. While some supported the school’s actions, arguing that these practices threaten free speech and the purpose of higher education, the note also led to widespread outrage, and understandably so. Considered in isolation, trigger warnings may seem straightforwardly good. Basic human decency means professors like myself should be aware of students’ traumatic experiences, and give them a heads up about course content—photographs of dead bodies, extended accounts of abuse, disordered eating, self-harm—that might trigger an anxiety attack and foreclose intellectual engagement. Similarly, it may seem silly to object to the creation of safe spaces on campus, where members of marginalized groups can count on meeting supportive conversation partners who empathize with their life experiences, and where they feel free to be themselves without the threat of judgment or censure.
How will the show maintain its charm while unraveling its mysteries?
Stranger Things will return in 2017 for a second season with nine episodes by original writers/directors Matt and Ross Duffer, Netflix announced today. The news is about as unsurprising as, say, the idea that four Dungeons and Dragons-playing nerds in 1983 would be bullied at school. But it’s also an intriguing development—not unlike the revelation of an alternate dimension that resembles our own but has unfriendly plant-headed monsters roaming about.
The first eight episodes of the nostalgia-soaked sci-fi saga became the unpredicted breakout pop-culture conversation piece of summer 2016, spawning memes online and faux funerals in real life. Netflix doesn’t reveal viewership numbers, but this week the independent data-measurement company Symphony Advanced Media estimated that the series drew an average of 14.07 million adults ages 18 to 49 in its first 35 days of streaming. That would make it the third most-watched Netflix original of 2016, just behind Fuller House and the latest Orange Is the New Black season, both of which (unlike Stranger Things) arrived with established fan bases. Netflix’s business model relies on shows doing exactly what Stranger Things has done: draw buzz to lure subscribers.
Which is a different way of asking: Can a bot commit libel?
Facebook set a new land-speed record for situational irony this week, as it fired the people who kept up its “Trending Topics” feature and replaced them with an algorithm on Friday, only to find the algorithm promoting completely fake news on Sunday.
Rarely in recent tech history has a downsizing decision come back to bite the company so publicly and so quickly.
Education experts offer their thoughts on how—if at all—schools should assign, grade, and use take-home assignments.
This is the third installment in our series about school in a perfect world. Read previous entries here and here.
We asked prominent voices in education—from policy makers and teachers to activists and parents—to look beyond laws, politics, and funding and imagine a utopian system of learning. They went back to the drawing board—and the chalkboard—to build an educational Garden of Eden. We’re publishing their answers to one question each day this week. Responses have been lightly edited for clarity and length.
Today’s assignment: The Homework. Will students have homework?
Rita Pin Ahrens, the director of education policy for the Southeast Asia Resource Action Center
Homework is absolutely necessary for students to demonstrate that they are able to independently process and apply their learning. But who says homework has to be the same as it has been? Homework might include pre-reading in preparation for what will be covered in class that day, independent research on a student-chosen topic that complements the class curriculum, experiential learning through a volunteer activity or field trip, or visiting a website and accomplishing a task on it. The structure should be left to teachers to determine, as best fits the learning objective, and the homework should be graded—whether by the teacher or the student. Students will be held accountable for their homework and understand that it is an integral part of the learning process.
What looks at first glance like an opening up of possibilities is actually an attack on the human imagination.
You might not like what I’m about to say about the multiverse. But don’t worry; you’ve already had your revenge. If there are an infinite number of parallel universes, there will be any number of terrible dictatorships, places where life has become very difficult for people who like to string words together. Somewhere out there, there’s a society in which every desperate little essay like this one comes with a tiny, unremarkable button: push it, and the author will be immediately electrocuted to death.
Maybe your hate is more visceral—you already know I’ll die some day, but you want to see it happen; you need to see me groveling. You can if you want. Fly upwards from the plane of our solar system, keep on going, through the endless huddles of galaxies, never forgetting your purpose, until space and time run out altogether. Eventually you’ll find yourself in another universe, on a damp patch of grass and broken concrete, unwatched by whatever local gang or galactic empire rules the city rising in foggy shapes beyond the marshes. There, you’ll see a creature strangely similar to yourself, beating me to death with whatever bits of scrap are lying around.