With the release of this month's unemployment report, we now have a chance to take full stock of what happened to the U.S. job market in 2011. In this politically tumultuous year, employment crawled upwards. Slowly.
Overall, total non-farm employment inched higher by roughly 1.6 million jobs, or about 1.3%. The private sector grew modestly. The public sector shrank, also modestly. The United States economy is still about 6 million jobs short of where it was before the beginning of the Great Recession. And while the unemployment rate is down to 8.5% from 9.4%, it's partly because so many workers have given up on job hunting.
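To see how the rate can fall even without robust hiring, here is a minimal arithmetic sketch. The labor-force numbers below are purely illustrative, chosen only to land near the headline rates, and are not actual BLS figures: because the unemployment rate counts only people actively looking for work, discouraged job-seekers who stop searching drop out of both the unemployed count and the labor force, pulling the rate down even if nobody new gets hired.

```python
# Illustrative sketch (hypothetical numbers, not actual BLS data):
# how the unemployment rate can fall when job-seekers stop looking.

def unemployment_rate(unemployed, employed):
    """Unemployment rate = unemployed / labor force, where labor force = employed + unemployed."""
    return unemployed / (employed + unemployed)

employed = 140.0    # millions, hypothetical
unemployed = 14.5   # millions, hypothetical
print(f"Before: {unemployment_rate(unemployed, employed):.1%}")  # ~9.4%

# Suppose 1.5 million job-seekers give up and leave the labor force entirely.
unemployed -= 1.5
print(f"After dropouts, with no new jobs: {unemployment_rate(unemployed, employed):.1%}")  # ~8.5%
```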
That's the Cliff's Notes version. Beneath the headline figures, America's employment picture is vastly more complicated. If you were white, or college educated, or in the oil business, odds are you had a fabulous year. For African Americans, high school drop-outs, teachers, and 19-year-olds looking for work, the numbers told a very different story.
A Great Year for Oil Workers, a Terrible Year for Teachers
In 2011, the fastest growing industry sector by employment was mining. By a long shot. Jobs in logging and mining as a combined sector increased by 12.4%, but virtually all of that growth was due to mining -- coal, oil, and gas extraction, as well as the support activities around them. Thank the oil boom in North Dakota and the hunt for natural gas in Appalachia's shale deposits. As you can see in the graph below, no other major industry saw even close to that rate of growth.
But while mining's growth was dramatic, it only contributed a small piece to 2011's overall employment bump -- about 91,000 new hires. The largest boost came from business services, a hodge-podge category encompassing a wide variety of white collar employees. Its growth was powered by increased demand for highly educated workers such as engineers and architects, computer systems designers, and accountants. Administrative support positions, including roughly 90,000 new workers in temp agencies, also made up much of the growth. Other important pieces of the job growth puzzle included health care and social assistance, which added 350,000 workers, and the hospitality businesses, which added 230,000 workers in food services alone.
It's part of an evolving split in the American workforce: On the one hand, we're growing high-skilled jobs in offices and hospitals. On the other, we're producing low-wage service jobs. There's not a ton being created in the middle. Even this year's manufacturing growth only reclaimed a small portion of the millions of factory jobs lost to the economic downturn.
The gloomiest portion of this chart, however, is reserved for government hiring. In a year without the cushion of stimulus spending, local, state, and -- yes -- federal government employment rolls all shrank, shedding a total of 280,000 workers. Public schools alone let go 113,000 workers. To put that in perspective, the loss of government jobs eclipsed the entire growth of manufacturing and construction combined.
A Bad Time to Be Young, or Without a College Degree
More than their industry, however, the most important factor affecting workers' ability to get hired in 2011 was their education. At Slate, Matt Yglesias posted this chart showing that more than half of the jobs added went to Americans with a college education. High school graduates, meanwhile, lost half a million jobs.
Beyond education, the next great divide in 2011 remained age. For women and men over the age of 20, the unemployment rate was about 8%. For those aged 16 to 19, the unemployment rate was 23.1%, down from 25.2% a year ago. For black youth, the unemployment rate was a staggering 42%, down from 44% a year before.
Overall African American unemployment refused to budge during the year, staying at exactly 15.8%. The slimming of government payrolls may be the major culprit since, as the New York Times has reported, one in five black workers is a public sector employee. Whites and Hispanics, meanwhile, saw unemployment drop from 8.5% to 7.5% and from 12.9% to 11.0%, respectively.
No matter where you fit into the jobs picture, the 2011 numbers probably weren't spectacular for your group. But your age, education, and industry made a huge difference.
Most of management theory is inane, writes our correspondent, the founder of a consulting firm. If you want to succeed in business, don’t get an M.B.A. Study philosophy instead
During the seven years that I worked as a management consultant, I spent a lot of time trying to look older than I was. I became pretty good at furrowing my brow and putting on somber expressions. Those who saw through my disguise assumed I made up for my youth with a fabulous education in management. They were wrong about that. I don’t have an M.B.A. I have a doctoral degree in philosophy—nineteenth-century German philosophy, to be precise. Before I took a job telling managers of large corporations things that they arguably should have known already, my work experience was limited to part-time gigs tutoring surly undergraduates in the ways of Hegel and Nietzsche and to a handful of summer jobs, mostly in the less appetizing ends of the fast-food industry.
How “engagement” made the web a less engaging place
Here’s a little parable. A friend of mine was so enamored of Google Reader that he built a clone when it died. It was just like the original, except that you could add pictures to your posts, and you could Like comments. The original Reader was dominated by conversation, much of it thoughtful and earnest. The clone was dominated by GIFs and people trying to be funny.
I actually built my own Google Reader clone. (That’s part of the reason this friend and I became friends—we both loved Reader that much.) But my version was more conservative: I never added any Like buttons, and I made it difficult to add pictures to comments. In fact, it’s so hard that I don’t think there has ever been a GIF on the site.
Donald Trump flaunted his elastic conception of truth in an interview with Time—but he may yet learn that facts are stubborn things.
How can anyone convince the most powerful man in the world of something he does not wish to believe?
It’s not an idle question. In a remarkable interview with Time’s Michael Scherer, President Trump flaunted his elastic relationship with truth. Instead of weighing evidence, he explained, he prefers to trust his gut. “I’m a very instinctual person,” he said, “but my instinct turns out to be right.”
Trump unrepentantly rehearsed his litany of false or unsubstantiated claims with Scherer. Was Ted Cruz's father linked to Lee Harvey Oswald? "Why do you say that I have to apologize? I'm just quoting the newspaper." (The newspaper in question is the National Enquirer.) Had President Obama tapped his phones? "A lot of information has just been learned, and a lot of information may be learned over the next coming period of time. We will see what happens." Were there 3 million fraudulent votes cast in 2016? "Well I think I will be proved right about that too."
At the president’s behest, House Republicans will render what might be a final verdict on the Affordable Care Act in a high-stakes vote on Friday.
On Thursday, the Affordable Care Act celebrated its seventh birthday. On Friday, it just might celebrate a most unlikely reprieve.
In a take-it-or-leave-it message delivered by his senior advisers to Capitol Hill, President Trump late Thursday told bickering House Republicans they had one final opportunity to repeal and replace the health-care law they have decried since its enactment. At the president’s behest, Speaker Paul Ryan on Friday will call a vote on the American Health Care Act and dare recalcitrant conservatives to defeat it. If the bill fails, Trump plans to keep Obamacare in place and move on with other parts of his agenda—a move that would enrage conservative activists while conceding an enormous defeat for the new administration.
Party leaders postponed a House vote Thursday after President Trump and Speaker Paul Ryan failed to win enough support.
Lacking the majority needed to pass their bill to replace the Affordable Care Act, House Republican leaders have postponed a planned Thursday vote, imperiling President Trump’s first major legislative priority.
The move was an indication that a series of meetings Trump and Speaker Paul Ryan had with reluctant members in the party’s conservative and centrist wings had failed to achieve a consensus. Members of the House Freedom Caucus left a meeting with the president early in the afternoon saying there was “no deal” as they pushed Ryan to move the bill further to the right. And for Trump and Ryan, the delay dashed their hope of voting to dismantle the law on the seventh anniversary of its signing by former President Barack Obama.
Two Princeton economists elaborate on their work exploring rising mortality rates among certain demographics.
Two years ago, the Princeton economists Anne Case and Angus Deaton published an alarming revelation: Middle-aged white Americans without a college degree were dying in greater numbers, even as people in other developed countries were living longer. The husband-and-wife team argued, in a study in the Proceedings of the National Academy of Sciences, that these white Americans are facing “deaths of despair”—suicide, drug and alcohol overdoses, and alcohol-related liver disease.
The paper caused a stir in academic circles and in the media, and has remained in the public discourse following Donald Trump’s win partly on the strength of his support from these same middle-aged white Americans (the alive ones, to be clear). The paper, however, couldn’t answer the question everyone had: Why was this demographic in particular struggling? It couldn’t be purely the economic pain they faced in the wake of globalization; after all, European countries are also affected by globalization, and their residents are getting healthier and living longer. And non-whites in the U.S. are also living longer than they used to, even though they are subject to the same economic forces as middle-aged whites and are struggling, at least in economic terms, even more.
The commander in chief embraces a peculiar worldview in which bogus claims are retroactively justified and evidence simply conjured into existence.
President Trump remains peculiarly fixated on the cover of Time magazine. He has claimed in the past that he holds the record for most covers, but in an interview with Michael Scherer for this week’s magazine, the president asked if he was the all-time leader. Scherer had to break the bad news to him: Richard M. Nixon still held the lead—though he added, “He was in office for longer, so give yourself time.” “Ok, good. I’m sure I’ll win,” Trump replied.
The exchange is full of intrigue. Neither man noted that though Nixon was elected to two terms, his presidency was foreshortened by paranoia and lawbreaking. Nor did they note the increasingly frequent comparisons between Nixon’s terminal scandal and Trump’s own difficulties. But in the course of an interview about Trump’s extremely distant relationship with the truth—from obvious lies to head-scratching speculation—the president offered a Nixonian maxim of his own.
The philosophers he influenced set the stage for the technological revolution that remade our world.
The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.
Mathematical logic was initially considered a hopelessly abstract subject with no conceivable applications. As one computer scientist commented: “If, in 1901, a talented and sympathetic outsider had been called upon to survey the sciences and name the branch which would be least fruitful in [the] century ahead, his choice might well have settled upon mathematical logic.” And yet, it would provide the foundation for a field that would have more impact on the modern world than any other.
New research on the creatures’ family tree could “shake dinosaur paleontology to its core.”
When I first read Matthew Baron’s new dinosaur study, I actually gasped.
For most of my life, I’ve believed that the dinosaurs fell into two major groups: the lizard-hipped saurischians, which included the meat-eating theropods like Tyrannosaurus and long-necked sauropodomorphs like Brontosaurus (yes, Brontosaurus; it’s a thing again); and the bird-hipped ornithischians, which included horned species like Triceratops and armored ones like Stegosaurus. That’s how dinosaurs have been divided since 1887. It’s what I learned as a kid. It’s what all the textbooks and museums have always said. And according to Baron, a Ph.D. student at the University of Cambridge, it’s wrong.
By thoroughly comparing 74 early dinosaurs and their relatives, Baron has radically redrawn the two major branches of the dinosaur family tree. Defying 130 years of accepted dogma, he splits the saurischians apart, leaving the sauropods in one branch, and placing the theropods with the ornithischians on the other. Put it this way: This is like someone telling you that neither cats nor dogs are what you thought they were, and some of the animals you call “cats” are actually dogs.
Trump promised to revitalize the blighted heartland. His policies will punish it.
President Donald Trump might be consumed by half-truths and conspiracy theories, but during the campaign he brought attention to a very real phenomenon: regional inequality. He promised not only a proper swamp-draining in Washington, D.C., but also a renaissance for the Rust Belt, Appalachia, and America’s blighted heartland.
Even when his prescriptions were fantasies—neither trade wars nor border walls will ever bring back 1950s-level manufacturing employment—the underlying diagnosis was pretty much right. For much of the 20th century, productivity in America’s poorest regions actually grew faster than in rich metros. But that convergence came to a screeching halt in the 2000s. Rich coastal cities have left the rest of the country behind. In 1980, the typical New York City worker earned 80 percent more than the national average. By 2013, he earned 172 percent more.