Yes, this is a striking stat. But it doesn't tell us that college is losing its value. It tells us that more people are going to college -- and not enough are finishing.
Everybody is looking for the next big "bubble." Maybe it's bonds. Or tech stocks. Or ... college? With tuition soaring and job prospects not, a growing chorus thinks higher education might just be too big not to fail. The calculus is simple. If college costs keep rising, but job prospects don't improve, eventually higher education won't be worth it. Pop goes the campus bubble -- or so the story goes.
That brings us to one of the more inauspicious recent headlines. For the first time ever, the majority of the unemployed have attended some college. Does this mark some kind of inflection point? Is it time to ditch the classroom for the office? Not exactly.
First, the gory details. The chart below from Business Insider shows the twenty-year educational trend among the jobless. (Remember: This shows what percentage of the jobless have ever set foot on a college campus. It doesn't show what percentage of high school grads or college enrollees are out of work.)
This is not as bad as it looks, and it doesn't mean what you might think.
Here are the three numbers that tell us why: 7.9, 7.6, and 4.0. Those are the unemployment rates among people 25 and older for high school grads, for college dropouts, and for college graduates -- all courtesy of the Bureau of Labor Statistics.
The chart above isn't a story about a college degree no longer paying off. The chart above is a story about more people going to college, but not nearly as many more people finishing college. As my colleague Jordan Weissmann recently pointed out, only 56 percent of those who start on a bachelor's degree finish within six years. Only 29 percent of those who start on an associate's degree finish within three years. And consider that this is happening while college enrollment is at an all-time high. Too many students are getting the worst of both worlds: debt without a degree. Their finances get worse, but their job prospects don't get much better. That's how we get a world where most of the unemployed have attended at least some college.
But there's something of a chicken-and-egg problem here. More students would finish school if they could afford it. That's certainly not the only reason our college dropout rate is so high, but it's certainly one of the reasons.
In other words, the high cost of college is disguising the payoff of college. There still aren't many better long-term investments than a college degree. Graduates have lower unemployment. They earn more. And the gap between what college and high school graduates make is only growing. But you know what they say about the long run. It can be awfully hard to get there when the short-run costs are so high. That's why reining in college tuition is so critical. It will not only help young graduates struggling with the terrible economy, but also help more people become young graduates.
Of course, it's not obvious how we can do this. If we knew, we'd be doing it. But it's worth remembering: That's how you win the future.
In 12 of 16 past cases in which a rising power has confronted a ruling power, the result has been bloodshed.
When Barack Obama meets this week with Xi Jinping during the Chinese president’s first state visit to America, one item probably won’t be on their agenda: the possibility that the United States and China could find themselves at war in the next decade. In policy circles, this appears as unlikely as it would be unwise.
And yet 100 years on, World War I offers a sobering reminder of man’s capacity for folly. When we say that war is “inconceivable,” is this a statement about what is possible in the world—or only about what our limited minds can conceive? In 1914, few could imagine slaughter on a scale that demanded a new category: world war. When war ended four years later, Europe lay in ruins: the kaiser gone, the Austro-Hungarian Empire dissolved, the Russian tsar overthrown by the Bolsheviks, France bled for a generation, and England shorn of its youth and treasure. A millennium in which Europe had been the political center of the world came to a crashing halt.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
A child psychologist argues punishment is a waste of time when trying to eliminate problem behavior. Try this instead.
Say you have a problem child. If it’s a toddler, maybe he smacks his siblings. Or she refuses to put on her shoes as the clock ticks down to your morning meeting at work. If it’s a teenager, maybe he peppers you with obscenities during your all-too-frequent arguments. The answer is to punish them, right?
Not so, says Alan Kazdin, director of the Yale Parenting Center. Punishment might make you feel better, but it won’t change the kid’s behavior. Instead, he advocates for a radical technique in which parents positively reinforce the behavior they do want to see until the negative behavior eventually goes away.
As I was reporting my recent series about child abuse, I came to realize that parents fall roughly into three categories. There’s a small number who seem intuitively to do everything perfectly: Moms and dads with chore charts that actually work and snack-sized bags of organic baby carrots at the ready. There’s an even smaller number who are horrifically abusive to their kids. But the biggest chunk by far are parents in the middle. They’re far from abusive, but they aren’t super-parents, either. They’re busy and stressed, so they’re too lenient one day and too harsh the next. They have outdated or no knowledge of child psychology, and they’re scrambling to figure it all out.
Life in Ohio's proud but economically abandoned small towns
Just over a decade ago, Matt Eich started photographing rural Ohio. He found himself drawn to the proud but economically abandoned small towns of Appalachia, largely inhabited by what is now known as the “Forgotten Class” of white, blue-collar workers. Thanks to grants from the Economic Hardship Reporting Project and Getty Images, Eich was able to capture the family life, drug abuse, poverty, and listlessness of these communities. “Long before Trump was a player on the political scene, long before he was a Republican, these people existed and these problems existed,” Eich said. His new book, Carry Me Ohio, published by Sturm and Drang, is a collection of these images and the first of four books he plans to publish as part of The Invisible Yoke, a photographic meditation on the American condition. Even with a deep knowledge of the region, Eich was unprepared for the fury and energy that surrounded the election this year. “The anger is overpowering,” he said. “I knew what was going on, and I’m still surprised. I should have listened to the pictures.”
Universities themselves may be contributing to burnout.
With half of all doctoral students leaving graduate school without finishing, something significant and overwhelming must be happening for at least some of them during the process of obtaining that degree. Mental illness is often offered as the standard rationale to explain why some graduate students burn out. Some research has suggested a link between intelligence and conditions such as bipolar disorder, leading some observers to believe many graduate students struggle with mental-health problems that predispose them to burning out.
But such research is debatable, and surely not every student who drops out has a history of mental illness. So, what compels students to abandon their path to a Ph.D.? Could there be other underlying factors, perhaps environmental, that can cause an otherwise-mentally-healthy graduate student to become anxious, depressed, suicidal, or, in rare cases, violent?
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
Updated on December 2 at 7:49 p.m.
It’s hardly remembered now, having been overshadowed a few months later on September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed; the American plane was forced to land, and its crew was detained for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call to the president of Taiwan, Tsai Ing-wen. It’s a sharp breach of protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
Switching pastas and breads is a small decision that could save lives.
Multigrain is a genius approach to selling both white bread and righteousness. The term quietly crept under the umbrella of health. It wasn’t clear why, exactly. (The grain part? Or the multi?) At least it wasn’t white bread, right?
As many eaters of bread came to understand that white bread is the nutritional equivalent of Pixy Stix—the nutritious, fibrous shell of the wheat having been removed, leaving us with only the inner starch, which our bodies almost instantly turn into sugar—it needed some rebranding.
Multigrain is now often used to imply wholesomeness, a virtue to which it often has no claim. Containing the flour of multiple grains does not mean containing whole grains. When millers leave the grain intact before milling, the result is whole-grain flour. It contains fiber, appeasing the pancreas and microbes that demand it for optimal performance. So, the term to look for is 100 percent whole wheat. (Or whole grain, though the grain is usually wheat.)
Comedy-drama series like Fleabag and Transparent show how vulnerability is as important as unlikeability and strength when it comes to portraying fictional women.
In the first episode of the HBO series Enlightened, the show’s heroine, Amy Jellicoe, learns that she’s been fired. She does not take the news well. Within minutes, she goes from pitiable victim, sobbing abjectly in a bathroom stall, to mascara-streaked fury. “Go back to your sad, fucking, little desk,” she sneers at her assistant before tracking her ex-lover and presumed betrayer to the office lobby. “I will destroy you—I will bury you—I will kill you, motherfucker!” she screams at him through the elevator doors that she somehow, in a feat of desperation, manages to pry open.
Though the scene aired five years ago, it’s still a pretty radical few minutes of television, and not just because of the ferocity of Laura Dern’s performance. What feels most striking is the series’ willingness to dramatize an extended scene of female distress for something other than a moralizing end. In this sense, Enlightened anticipates the Amazon series Fleabag, which evinces a similar empathy toward a female character in the grip of powerfully negative emotions: anger, sadness, grief, self-doubt, shame. It’s probably no accident the two shows have almost identical promotional stills—close-ups of their protagonists’ makeup-smudged faces, staring directly into the camera. Like a number of other female-centric, female-created tragicomedies to have emerged on TV in recent years—Transparent, Girls, Catastrophe, Insecure—the series also share a commitment to more compassionate portrayals of dysfunctional heroines, by suspending judgment even (or especially) when they’re at their worst.
A few weeks ago, I was trying to call Cuba. I got an error message—which, okay, international telephone codes are long and my fingers are clumsy—but the phone oddly started dialing again before I could hang up. A voice answered. It had a British accent and it was reading: “...the moon was shining brightly. The Martians had taken away the excavating-machine…”
Apparently, I had somehow called into an audiobook of The War of the Worlds. Suspicious of my clumsy fingers, I double-checked the number. It was correct (weird), but I tried the number again, figuring that at worst, I’d learn what happened after the Martians took away the excavating machine. This time, I got the initial error message and the call disconnected. No Martians.
A century ago, millions of Americans banded together in defense of white, Christian America and traditional morality—and most of their compatriots turned a blind eye to the Ku Klux Klan.
On August 8, 1925, more than 50,000 members of the Ku Klux Klan paraded through Washington, D.C. Some walked in lines as wide as 20 abreast, while others created formations of the letter K or a Christian cross. A few rode on horseback. Many held American flags. Men and women alike, the marchers carried banners emblazoned with the names of their home states or local chapters, and their procession lasted for more than three hours down a Pennsylvania Avenue lined with spectators. National leaders of the organization were resplendent in colorful satin robes and the rank and file wore white, their regalia adorned with a circular red patch containing a cross with a drop of blood at its center.
Nearly all of the marchers wore pointed hoods, but their faces were clearly visible. In part, that was because officials would sanction the parade only if participants agreed to walk unmasked. But a mask was not really necessary, as most members of the Klan saw little reason to hide their faces. After all, there were millions of them in the United States.