Yes, this is a striking stat. But it doesn't tell us that college is losing its value. It tells us that more people are going to college -- and not enough are finishing.
Everybody is looking for the next big "bubble". Maybe it's bonds. Or tech stocks. Or ... college? With tuition soaring and job prospects not, a growing chorus thinks higher education might just be too big not to fail. The calculus is simple. If college costs keep rising, but job prospects don't improve, eventually higher education won't be worth it. Pop goes the campus bubble -- or so the story goes.
That brings us to one of the more inauspicious recent headlines. For the first time ever, the majority of the unemployed have attended some college. Does this mark some kind of inflection point? Is it time to ditch the classroom for the office? Not exactly.
First, the gory details. The chart below from Business Insider shows the twenty-year educational trend among the jobless. (Remember: This shows what percentage of the jobless have ever set foot on a college campus. It doesn't show what percentage of high school grads or college enrollees are out of work.)
This is not as bad as it looks, and it doesn't mean what you might think.
Here are the three numbers that tell us why: 7.9, 7.6, and 4.0. Those are the unemployment rates among people 25 and older for high school grads, for college dropouts, and for college graduates -- all courtesy of the Bureau of Labor Statistics.
The chart above isn't a story about a college degree no longer paying off. The chart above is a story about more people going to college, but not nearly as many more people finishing college. As my colleague Jordan Weissmann recently pointed out, only 56 percent of those who start on a bachelor's degree finish within six years. Only 29 percent of those who start on an associate's degree finish within three years. And consider that this is happening while college enrollment is at an all-time high. Too many students are getting the worst of both worlds: debt without a degree. Their finances get worse, but their job prospects don't get much better. That's how we get a world where most of the unemployed have attended at least some college.
But there's something of a chicken-and-egg problem here. More students would finish school if they could afford it. That's certainly not the only reason our college dropout rate is so high, but it's certainly one of the reasons.
In other words, the high cost of college is disguising the payoff of college. There still aren't many better long-term investments than a college degree. Graduates have lower unemployment. They earn more. And the gap between what college and high school graduates make is only growing. But you know what they say about the long run. It can be awfully hard to get there when the short-run costs are so high. That's why reining in college tuition is so critical. It will both help young graduates struggling with the terrible economy and help more people become young graduates.
Of course, it's not obvious how we can do this. If we knew, we'd be doing it. But it's worth remembering: That's how you win the future.
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
ASPEN, Colo.—At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
As he prepares for a presidential run, the governor’s labor legacy deserves inspection. Are his state’s “hardworking taxpayers” any better off?
This past February, at the Conservative Political Action Conference (CPAC) outside Washington, D.C., Wisconsin Governor Scott Walker rolled up his sleeves, clipped on a lavalier microphone, and without the aid of a teleprompter gave the speech of his life. He emerged from that early GOP cattle call as a front-runner for his party’s nomination for president. Numerous polls this spring placed him several points ahead of former Florida Governor Jeb Bush, the preferred candidate of the Republican establishment, in Iowa and New Hampshire. Those same polls showed him with an even more substantial lead over movement conservative favorites such as Ted Cruz, Rand Paul, and Mike Huckabee. In late April, the Koch brothers hinted that Walker would be the likely recipient of the nearly $900 million they plan to spend on the 2016 election cycle.
The untold story of the improbable campaign that finally tipped the U.S. Supreme Court.
On May 18, 1970, Jack Baker and Michael McConnell walked into a courthouse in Minneapolis, paid $10, and applied for a marriage license. The county clerk, Gerald Nelson, refused to give it to them. Obviously, he told them, marriage was for people of the opposite sex; it was silly to think otherwise.
Baker, a law student, didn’t agree. He and McConnell, a librarian, had met at a Halloween party in Oklahoma in 1966, shortly after Baker was pushed out of the Air Force for his sexuality. From the beginning, the men were committed to one another. In 1967, Baker proposed that they move in together. McConnell replied that he wanted to get married—really, legally married. The idea struck even Baker as odd at first, but he promised to find a way and decided to go to law school to figure it out.
Many authors have been tempted into writing revisionist histories of the 37th U.S. president, but these counterintuitive takes often do not hold up under closer scrutiny.
Every once in a while someone writes a book arguing that Richard Nixon has been misunderstood. These authors tend to focus on some particular aspect of his presidency that, the argument goes, is more important than that Watergate business. They’ve focused on his domestic policy or his foreign policy as achievements that override his flaws and his presidency’s denouement. Nixon’s highly complex persona also has led to books that probe his psyche—a hazardous and widely debunked practice, though that hasn’t discouraged further attempts.
And, as with other major figures, but all the more so given the drama of his time on the national stage, Nixon’s complexity and essentially low repute tempts some authors to offer revisionist approaches to his place in history. Such approaches have to be assessed on their own merits, not accepted merely because they’re counterintuitive or receive a lot of attention, as new assessments of the controversial and fascinating Nixon tend to do. Two major revisionist books about Nixon argued that his domestic policy was so expansive, humane, and innovative that it overrides his unfortunate behavior; their accounts relegate Watergate to a far less important role. The problem with these books is that they don’t stand up to close scrutiny.
Mike Huckabee and Ted Cruz are suggesting there might be ways for states and cities to nullify the justices’ ruling. They’re wrong.
The Supreme Court’s decision last week did make gay marriage legal around the nation. Unfortunately for social conservatives, it did not make nullification legal around the nation.
Nullification is the historical idea that states can ignore federal laws, or pass laws that supersede them. This concept has a long but not especially honorable pedigree in U.S. history. Its origins date back to antebellum America, when Southern states tried to nullify tariffs and Northern states tried to nullify fugitive-slave laws. In the 1950s, after Brown v. Board of Education, some Southern states tried to pass laws to avoid integrating schools. It didn’t work, because nullification is not constitutional.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The social network learns more about its users than they might realize.
Facebook, you may have noticed, turned into a rainbow-drenched spectacle following the Supreme Court’s decision Friday that same-sex marriage is a Constitutional right.
By overlaying their profile photos with a rainbow filter, Facebook users began celebrating in a way we haven't seen since March 2013, when 3 million people changed their profile images to a red equals sign—the logo of the Human Rights Campaign—as a way to support marriage equality. This time, Facebook provided a simple way to turn profile photos rainbow-colored. More than 1 million people changed their profile in the first few hours, according to the Facebook spokesperson William Nevius, and the number continues to grow.
“This is probably a Facebook experiment!” joked the MIT network scientist Cesar Hidalgo on Facebook yesterday. “This is one Facebook study I want to be included in!” wrote Stacy Blasiola, a communications Ph.D. candidate at the University of Illinois, when she changed her profile.
Was the Concorde a triumph of modern engineering, a metaphor for misplaced 20th-century values, or both?
The box sat untouched in his bottom desk drawer. For weeks we discussed opening it, and one January morning he was ready. I set the box on his white bedsheets and removed the stack of passports, which could have belonged to a family with dual citizenship. But all nine—from 1956 to a valid update issued in 2014—belong to my 89-year-old grandfather.
Lying in bed, he unfolded a stamp-covered page like an accordion and held it open above his chest. “Oh my,” he kept repeating. He paused, and pointed.
London. March 22, 1976. My then-50-year-old grandfather, Raymond Pearlson, the inventor of Syncrolift, was traveling the world selling his shiplift system. Concorde had launched commercially that January. He knew exactly what this stamp represented: Washington Dulles to London Heathrow in 3.5 hours—the first of at least 150 supersonic flights he took on the legendary aircraft.
Engineers at IBM and Google claim they're closer than ever to making computers that could process data in days that would take millions of years to flow through today's machines.
One of the first electronic, programmable computers in the world is remembered today mostly by its nickname: Colossus. The fact that this moniker evokes one of the seven wonders of the ancient world is fitting both physically and conceptually. Colossus, which filled an entire room and included dinner-plate-sized pulleys that had to be loaded with tape, was built in World War II to help crack Nazi codes. Ten versions of the mammoth computer would decrypt tens of millions of characters of German messages before the war ended.
Colossus was a marvel at a time when “computers” still referred to people—women, usually—rather than machines. And it is practically unrecognizable by today's computing standards, made up of thousands of vacuum tubes that contained glowing hot filaments. The machine was programmable, but not based on stored memory. Operators used switches and plugs to modify wires when they wanted to run different programs. Colossus was a beast, and a capricious one at that.
I spent a year in Tromsø, Norway, where the “Polar Night” lasts all winter—and where rates of seasonal depression are remarkably low. Here’s what I learned about happiness and the wintertime blues.
Located over 200 miles north of the Arctic Circle, Tromsø, Norway, is home to extreme light variation between seasons. During the Polar Night, which lasts from November to January, the sun doesn’t rise at all. Then the days get progressively longer until the Midnight Sun period, from May to July, when it never sets. After the midnight sun, the days get shorter and shorter again until the Polar Night, and the yearly cycle repeats.
So, perhaps understandably, many people had a hard time relating when I told them I was moving there.
“I could never live there,” was the most common response I heard. “That winter would make me so depressed,” many added, or “I just get so tired when it’s dark out.”
But the Polar Night was what drew me to Tromsø in the first place.