With the release of this month's unemployment report, we now have a chance to take full stock of what happened to the U.S. job market in 2011. In this politically tumultuous year, employment crawled upwards. Slowly.
Overall, total non-farm employment inched higher by roughly 1.6 million jobs, or about 1.3%. The private sector grew modestly. The public sector shrank, also modestly. The United States economy is still about 6 million jobs short of where it was before the beginning of the Great Recession. And while the unemployment rate is down to 8.5% from 9.4%, it's partly because so many workers have given up on job hunting.
That's the CliffsNotes version. Beneath the headline figures, America's employment picture is vastly more complicated. If you were white, college educated, or in the oil business, odds are you had a fabulous year. For African Americans, high school dropouts, teachers, and 19-year-olds looking for work, the numbers told a very different story.
A Great Year For Oil Workers, A Terrible Year for Teachers
In 2011, the fastest growing industry sector by employment was mining. By a long shot. Jobs in logging and mining as a combined sector increased by 12.4%, but virtually all of that growth was due to mining -- coal, oil, and gas extraction, as well as the support activities around them. Thank the oil boom in North Dakota and the hunt for natural gas in Appalachia's shale deposits. As you can see in the graph below, no other major industry saw even close to that rate of growth.
But while mining's growth was dramatic, it only contributed a small piece to 2011's overall employment bump -- about 91,000 new hires. The largest boost came from business services, a hodge-podge category encompassing a wide variety of white collar employees. Its growth was powered by increased demand for highly educated workers such as engineers and architects, computer systems designers, and accountants. Administrative support positions, including roughly 90,000 new workers in temp agencies, also made up much of the growth. Other important pieces of the job growth puzzle included health care and social assistance, which added 350,000 workers, and the hospitality businesses, which added 230,000 workers in food services alone.
It's part of an evolving split in the American workforce: On the one hand, we're growing high-skilled jobs in offices and hospitals. On the other, we're producing low-wage service jobs. There's not a ton being created in the middle. Even this year's manufacturing growth only reclaimed a small portion of the millions of factory jobs lost to the economic downturn.
The gloomiest portion of this chart, however, is reserved for government hiring. In a year without the cushion of stimulus spending, local, state, and -- yes -- federal government employment rolls all shrank, shedding a total of 280,000 workers. Public schools alone let go of 113,000 workers. To put that in perspective, the loss of government jobs eclipsed the entire growth of manufacturing and construction combined.
A Bad Time to Be Young, or Without A College Degree
The most important factor affecting workers' ability to get hired in 2011, however, was not their industry but their education. At Slate, Matt Yglesias posted this chart showing that more than half of the jobs added went to Americans with a college education. High school graduates, meanwhile, lost half a million jobs.
Beyond education, the next great divide in 2011 remained age. For women and men over the age of 20, the unemployment rate was about 8%. For those aged 16 to 19, the unemployment rate was 23.1%, down from 25.2% a year ago. For black youth, the unemployment rate was a staggering 42%, down from 44% a year before.
Overall African American unemployment refused to budge during the year, staying at exactly 15.8%. The slimming of government payrolls may be the major culprit since, as the New York Times has reported, one in five black workers is a public sector employee. Whites and Hispanics, meanwhile, saw unemployment drop from 8.5% to 7.5% and from 12.9% to 11.0%, respectively.
No matter where you fit into the jobs picture, 2011's numbers weren't spectacular. But your age, education, and industry made a huge difference.
The First Lady took to the stage at the Democratic National Convention, and united a divided hall.
Most convention speeches are forgotten almost before they’re finished. But tonight in Philadelphia, Michelle Obama delivered a speech that will be replayed, quoted, and anthologized for years. It was as pure a piece of political oratory as this campaign has offered, and instantly entered the pantheon of great convention speeches.
Obama stepped out onto a stage in front of a divided party, including delegates who had booed almost every mention of the presumptive nominee. And she delivered a speech that united the hall, bringing it to its feet.
She did it, moreover, her own way—forming a striking contrast with the night’s other speakers. She did it without shouting at the crowd. Without overtly slamming Republicans. Without turning explicitly negative. Her speech was laden with sharp barbs, but she delivered them calmly, sometimes wryly, biting her lower lip, hitting her cadence. It was a masterful performance.
When something goes wrong, I start with blunder, confusion, and miscalculation as the likely explanations. Planned-out wrongdoing is harder to pull off, more likely to backfire, and thus less probable.
But it is getting more difficult to dismiss the apparent Russian role in the DNC hack as blunder and confusion rather than plan.
“Real-world” authorities, from the former U.S. Ambassador to Russia to FBI sources to international security experts, say that the forensic evidence indicates the Russians. No independent authority strongly suggests otherwise. (Update: the veteran reporters Shane Harris and Nancy Youssef cite evidence that the original hacker was “an agent of the Russian government.”)
The timing and precision of the leaks, on the day before the Democratic convention and on a topic intended to maximize divisions at that convention, is unlikely to be pure coincidence. If it were coincidence, why exactly now, with evidence drawn from hacks over previous months? Why mail only from the DNC, among all the organizations that have doubtless been hacked?
The foreign country most enthusiastic about Trump’s rise appears to be Russia, which would also be the foreign country most benefited by his policy changes, from his sowing doubts about NATO and the EU to his weakening of the RNC platform language about Ukraine.
For the party elders, day one of the convention was about scolding the left back together.
Against a restive backdrop, the party’s top lieutenants were forced into the role of prime time peacemakers, tasked with encouraging Democratic unity in a party that has only lately acquiesced to tenuous detente. They did so through a combination of alarmist truth telling—born of the reality of a Trump-Clinton matchup that has lately gotten tighter—and cold-water scolding about party division—driven equally by frustration and exhaustion.
The pressures of national academic standards have pushed character education out of the classroom.
A few months ago, I presented the following scenario to my junior English students: Your boyfriend or girlfriend has committed a felony, during which other people were badly harmed. Should you or should you not turn him or her in to the police?
The class immediately erupted with commentary. It was obvious, they said, that loyalty was paramount—not a single student said they’d “snitch.” They were unequivocally unconcerned about who was harmed in this hypothetical scenario. This troubled me.
This discussion was part of an introduction to an essay assignment about whether Americans should pay more for ethically produced food. We continued discussing other dilemmas, and the kids were more engaged than they’d been in weeks, grappling with big questions about values, character, and right versus wrong as I attempted to expand their thinking about who and what is affected by their caloric choices—and why it matters.
The Democratic chairwoman had few supporters—but clung to her post for years, abetted by the indifference of the White House.
PHILADELPHIA—As Debbie Wasserman Schultz made her unceremonious exit as chairwoman of the Democratic National Committee, what was most remarkable was what you didn’t hear: practically anybody coming to her defense.
The Florida congresswoman did not go quietly. She reportedly resisted stepping down, and blamed subordinates for the content of the leaked emails that were released Friday, which clearly showed the committee’s posture of neutrality in the Democratic primary to have been a hollow pretense, just as Bernie Sanders and his supporters long contended. She finally relinquished the convention gavel only after receiving three days of strong-arming, a ceremonial position in the Clinton campaign, and a raucous round of boos at a convention breakfast.
Psychologists have long debated how flexible someone’s “true” self is.
Almost everyone has something they want to change about their personality. In 2014, a study that traced people’s goals for personality change found that the vast majority of its subjects wanted to be more extraverted, agreeable, emotionally stable, and open to new experiences. A whopping 97 percent said they wished they were more conscientious.
These desires appeared to be rooted in dissatisfaction. People wanted to become more extraverted if they weren’t happy with their sex lives, hobbies, or friendships. They wanted to become more conscientious if they were displeased with their finances or schoolwork. The findings reflect the social psychologist Roy Baumeister’s notion of “crystallization of discontent”: Once people begin to recognize larger patterns of shortcomings in their lives, he contends, they may reshuffle their core values and priorities to justify improving things.
Two new novels ponder the still-urgent question of what could have compelled young women to do such terrible things.
The most fascinating part of the Manson story has always been the girls.
Not the man who cobbled together bits of hippie philosophy, Scientology and How to Win Friends and Influence People to gather followers who’d do his bidding and help make him a star (and when that didn’t work out, kill people to try to start a race war). The ones willing and vulnerable enough to be gathered. Who wanted a community to belong to.
Even now, no one knows whether Charles Manson believed his own insane manifesto, or was just using it as a tool to get what he wanted. But the girls believed. Patricia Krenwinkel, Leslie Van Houten, Susan Atkins—they believed. They belonged. And then, on two infamous evenings in 1969, they helped kill seven people.
Stock-market crashes, terrorist attacks, and the dark side of “newsworthy” stories
Man bites dog. It is one of the oldest clichés in journalism, an acknowledgement of the idea that ordinary events are not newsworthy, whereas oddities, like a puppy-nibbling adult, deserve disproportionate coverage.
The rule is straightforward, but its implications are subtle. If journalists are encouraged to report extreme events, they guide both elite and public attitudes, leading many people, including experts, to feel like extreme events are more common than they actually are. By reporting on only the radically novel, the press can feed a popular illusion that the world is more terrible than it actually is.
Take finance, for example. Professional investors are fretting about the possibility of a massive stock-market crash, on par with 1987’s Black Monday. The statistical odds that such an event will occur within the next six months are about 1-in-60, according to historical data from 1929 to 1988. But when surveys between 1989 and 2015 asked investors to estimate the odds of such a crash in the coming months, the typical response was 1-in-10.
Physicists can’t agree on whether the flow of time from past to future is real or a mental construct.
Einstein once described his friend Michele Besso as “the best sounding board in Europe” for scientific ideas. They attended university together in Zurich; later they were colleagues at the patent office in Bern. When Besso died in the spring of 1955, Einstein—knowing that his own time was also running out—wrote a now-famous letter to Besso’s family. “Now he has departed this strange world a little ahead of me,” Einstein wrote of his friend’s passing. “That signifies nothing. For us believing physicists, the distinction between past, present, and future is only a stubbornly persistent illusion.”
Einstein’s statement was not merely an attempt at consolation. Many physicists argue that Einstein’s position is implied by the two pillars of modern physics: Einstein’s masterpiece, the general theory of relativity, and the Standard Model of particle physics. The laws that underlie these theories are time-symmetric—that is, the physics they describe is the same, regardless of whether the variable called “time” increases or decreases. Moreover, they say nothing at all about the point we call “now”—a special moment (or so it appears) for us, but seemingly undefined when we talk about the universe at large. The resulting timeless cosmos is sometimes called a “block universe”—a static block of space-time in which any flow of time, or passage through it, must presumably be a mental construct or other illusion.