Too much demand for liberal arts didn't kill the job market. Too little aggregate demand did.
Is our college students learning?
Rarely is the question not asked nowadays. Graduates now face a tough labor market and even tougher debt burdens, which have left many struggling to find work that pays enough to cover what they owe. Today, as my colleague Jordan Weissmann points out, young alums aren't stuck in dead-end jobs much more than usual (despite the scare stories you may have heard). But that's cold comfort for grads who borrowed a lot to cover the high cost of their degrees.
There are two, well, schools of thought about why freshly-minted grads have had such a tough time recently. You can blame the smarty-pants majors or blame the economy. In other words, students can't get good jobs either because they aren't learning (at least not the right things) in college, or because there aren't enough good jobs, period.
This is far from an academic debate. If recent grads can't find good work because they didn't learn any marketable skills, there's little the government can do to help, besides "nudging" current students to be more practical. And that's exactly what conservative governors in Florida and North Carolina are considering: proposals to charge humanities majors higher tuition than, say, science majors at state schools.
But there's an obvious question. If liberal arts majors "didn't learn much in school," as Jane Shaw put it in the Wall Street Journal, why haven't they always had trouble finding work? Are there just more of them now, or is this lack of learning just a recent phenomenon? Well, as you can see in the chart below, there's no correlation over the past decade between the share of grads in the most maligned majors and the unemployment rate for college grads (which has been inverted here). It's hard to see how the nonexistent rise of liberal arts explains the decline of job prospects.
(Note: I compiled data from the National Center for Education Statistics to come up with the percentage of students in "squishy" majors, which includes gender and cultural studies, English and foreign language literature, liberal arts, philosophy, and theater and visual arts. I multiplied the unemployment rate by -1, so employment falls when the line does).
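For readers who want to reproduce that calculation, here is a minimal sketch of the steps described in the note above. The file names and column names are hypothetical stand-ins for the NCES degree counts and the college-grad unemployment series; only the arithmetic (the share, the inversion, the correlation check) comes from the text.

```python
# Minimal sketch of the chart's calculation. Assumes two hypothetical CSVs:
# degrees.csv      - one row per year: total degrees and "squishy"-major
#                    degrees (gender/cultural studies, English and foreign-
#                    language literature, liberal arts, philosophy,
#                    theater and visual arts)
# unemployment.csv - one row per year: college-grad unemployment rate
import pandas as pd

degrees = pd.read_csv("degrees.csv")       # columns: year, total, squishy
unemp = pd.read_csv("unemployment.csv")    # columns: year, rate

df = degrees.merge(unemp, on="year")
df["squishy_share"] = df["squishy"] / df["total"] * 100

# Invert the unemployment rate as in the chart, so "employment falls
# when the line does."
df["inverted_rate"] = df["rate"] * -1

# The claim in the text: essentially no correlation between the share of
# "squishy" majors and grad unemployment over the decade.
print(df["squishy_share"].corr(df["rate"]))
```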
Now, maybe liberal arts majors stopped learning things circa 2008 ... or maybe something else was happening then. Something like a global financial crisis. Indeed, there's no mystery when it comes to college grad unemployment; it moves in tandem with private non-residential fixed investment (that is, the state of the economy).
In other words, too much demand for liberal arts didn't kill the job market. Too little aggregate demand did. Now, if our policymakers could just learn that....
In late 2015, in the Chilean desert, astronomers pointed a telescope at a faint, nearby star known as a red dwarf. Amid the star’s dim infrared glow, they spotted periodic dips, a telltale sign that something was passing in front of it, blocking its light every so often. Last summer, the astronomers concluded the mysterious dimming came from three Earth-sized planets—and that they were orbiting in the star’s temperate zone, where temperatures are not too hot, and not too cold, but just right for liquid water, and maybe even life.
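To make the transit method concrete, here is a toy sketch (not the team’s actual pipeline) that flags periodic dips in a simulated brightness series and recovers their repeat interval; every number in it is invented for illustration.

```python
# Toy illustration of transit detection (not the TRAPPIST team's pipeline):
# a planet crossing its star carves periodic, short-lived dips into the
# measured brightness. All numbers here are invented for demonstration.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(0, 500, 0.5)                  # observation timestamps
flux = 1.0 + rng.normal(0, 0.001, hours.size)   # baseline brightness + noise

period, depth, width = 36.0, 0.01, 1.0          # hypothetical transit values
flux[(hours % period) < width] -= depth         # star dims during each transit

# Flag samples well below baseline, group them into dips, and recover the
# repeat interval from the spacing between dip starts.
dip_times = hours[flux < 1.0 - depth / 2]
starts = dip_times[np.insert(np.diff(dip_times) > width, 0, True)]
print(f"estimated period: {np.diff(starts).mean():.1f} hours")  # ~36.0
```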
This was an important find. For years, scientists had focused on stars like our sun in their search for potentially habitable planets outside our solar system. Red dwarfs, smaller and cooler than the sun, were thought to create inhospitable conditions. They’re also harder to see, detectable by infrared rather than visible light. But the astronomers aimed hundreds of hours’ worth of observations at this dwarf, known as TRAPPIST-1, anyway, using ground-based telescopes around the world and NASA’s Spitzer Space Telescope.
Long after research contradicts common medical practices, patients continue to demand them and physicians continue to deliver. The result is an epidemic of unnecessary and unhelpful treatments.
First, listen to the story with the happy ending: At 61, the executive was in excellent health. His blood pressure was a bit high, but everything else looked good, and he exercised regularly. Then he had a scare. He went for a brisk post-lunch walk on a cool winter day, and his chest began to hurt. Back inside his office, he sat down, and the pain disappeared as quickly as it had come.
That night, he thought more about it: middle-aged man, high blood pressure, stressful job, chest discomfort. The next day, he went to a local emergency department. Doctors determined that the man had not suffered a heart attack and that the electrical activity of his heart was completely normal. All signs suggested that the executive had stable angina—chest pain that occurs when the heart muscle is getting less blood-borne oxygen than it needs, often because an artery is partially blocked.
Rod Dreher makes a powerful argument for communal religious life in his book, The Benedict Option. But he has not wrestled with how to live side by side with people unlike him.
Donald Trump was elected president with the help of 81 percent of white evangelical voters. Mike Pence, the champion of Indiana’s controversial 2015 religious-freedom law, is his deputy. Neil Gorsuch, a judge deeply sympathetic to religious litigants, will likely be appointed to the Supreme Court. And Republicans hold both chambers of Congress and statehouses across the country. Right now, conservative Christians enjoy more influence on American politics than they have in decades.
And yet, Rod Dreher is terrified.
“Don’t be fooled,” he tells fellow Christians in his new book, The Benedict Option. “The upset presidential victory of Donald Trump has at best given us a bit more time to prepare for the inevitable.”
Plagues, revolutions, massive wars, collapsed states—these are what reliably reduce economic disparities.
Calls to make America great again hark back to a time when income inequality receded even as the economy boomed and the middle class expanded. Yet it is all too easy to forget just how deeply this newfound equality was rooted in the cataclysm of the world wars.
The pressures of total war became a uniquely powerful catalyst of equalizing reform, spurring unionization, extensions of voting rights, and the creation of the welfare state. During and after wartime, aggressive government intervention in the private sector and disruptions to capital holdings wiped out upper-class wealth and funneled resources to workers; even in countries that escaped physical devastation and crippling inflation, marginal tax rates surged upward. Concentrated for the most part between 1914 and 1945, this “Great Compression” of inequality (as economists call it) took several more decades to run its course fully across the developed world, lasting until the 1970s and 1980s, when it stalled and began to go into reverse.
A $100 million gangster epic starring Robert De Niro, Al Pacino, and Joe Pesci has become too risky a proposition for major studios.
Martin Scorsese’s next project, The Irishman, is as close as you can get to a box-office guarantee for the famed director. It’s a gangster film based on a best-selling book about a mob hitman who claimed to have had a part in the legendary disappearance of the union boss Jimmy Hoffa. Robert De Niro is attached to play the hitman, Al Pacino will star as Hoffa, and Scorsese favorites Joe Pesci and Harvey Keitel are also on board. After Scorsese branched into more esoteric territory this year with Silence, a meditative exploration of faith and Catholicism, The Irishman sounds like a highly bankable project—the kind studios love. And yet, the film is going to Netflix, which will bankroll its $100 million budget and distribute it around the world on the company’s streaming service.
Neither truck drivers nor bankers would put up with a system like the one that influences medical residents’ schedules.
The path to becoming a doctor is notoriously difficult. Following pre-med studies and four years of medical school, freshly minted M.D.s must spend anywhere from three to seven years (depending on their chosen specialty) training as “residents” at an established teaching hospital. Medical residencies are institutional apprenticeships—and are therefore structured to serve the dual, often dueling, aims of training the profession’s next generation and minding the hospital’s labor needs.
How to manage this tension between “education and service” is a perennial question of residency training, according to Janis Orlowski, the chief health-care officer of the Association of American Medical Colleges (AAMC). Orlowski says that the amount of menial labor residents are required to perform, known in the profession as “scut work,” has decreased “tremendously” since she was a resident in the 1980s. But she acknowledges that even “institutions that are committed to education … constantly struggle with this,” trying to stay on the right side of the boundary between training and taking advantage of residents.
You can tell a lot about a person from how they react to something.
That’s why Facebook’s various “Like” buttons are so powerful. Clicking a reaction icon isn’t just a way to register an emotional response; it’s also a way for Facebook to refine its sense of who you are. So when you “Love” a photo of a friend’s baby, and click “Angry” on an article about the New England Patriots winning the Super Bowl, you’re training Facebook to see you a certain way: You are a person who seems to love babies and hate Tom Brady.
The more you click, the more sophisticated Facebook’s idea of who you are becomes. (Remember: Although the reaction choices seem limited now—Like, Love, Haha, Wow, Sad, or Angry—up until around this time last year, there was only a “Like” button.)
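As a toy illustration of the mechanism described here (not Facebook’s actual system), a crude profile can be built from nothing more than a running tally of which reaction was clicked on which topic; every name below is hypothetical.

```python
# Toy sketch of reaction-based profiling: keep a per-user tally of which
# reaction was clicked on which topic. Not Facebook's real pipeline.
from collections import defaultdict

profile = defaultdict(lambda: defaultdict(int))  # topic -> reaction -> count

def record_reaction(profile, topic, reaction):
    """Register one click; over time the tallies sketch the user's tastes."""
    profile[topic][reaction] += 1

record_reaction(profile, "babies", "Love")
record_reaction(profile, "New England Patriots", "Angry")

# A crude read of the profile: the dominant reaction per topic.
for topic, counts in profile.items():
    print(topic, "->", max(counts, key=counts.get))
```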
Yet another failed drug trial has prompted soul-searching about the “amyloid hypothesis.”
Last week, the pharmaceutical company Merck pulled the plug on a closely watched Alzheimer’s drug trial. The drug verubecestat, an outside committee concluded, had “virtually no chance” of benefit for patients with the disease.
The failure of one drug is of course disappointing, but verubecestat is only the latest in a string of failed trials all attempting the same strategy to battle Alzheimer’s. That pattern of failure has provoked some rather public soul-searching about the basic hypothesis that has guided Alzheimer’s research for the past quarter century.
The “amyloid hypothesis” began with a simple observation: Alzheimer’s patients have an unusual buildup of the protein amyloid in their brains. If that buildup drives the disease, then drugs that prevent or remove the amyloid should slow the onset of dementia. Yet no drug targeting amyloid—including solanezumab from Eli Lilly and bapineuzumab from Pfizer and Johnson & Johnson, to name two more high-profile flameouts—has worked so far.
The Italian philosopher Julius Evola is an unlikely hero for defenders of the “Judeo-Christian West.”
In the summer of 2014, years before he became the White House chief strategist, Steve Bannon gave a lecture via Skype at a conference held inside the Vatican. He spoke about the need to defend the values of the “Judeo-Christian West”—a term he used 11 times—against crony capitalism and libertarian capitalism, secularization, and Islam. He also mentioned the late Julius Evola, a far-right Italian philosopher popular with the American alt-right movement. What he did not mention is that Evola hated not only Jews, but Christianity, too.
References to Evola abounded on websites such as Breitbart News, The Daily Stormer, and AltRight.com well before The New York Times noted the Bannon-Evola connection earlier this month. But few have discussed the fundamental oddity of Evola serving as an intellectual inspiration for the alt-right. Yes, the thinker was a virulent anti-Semite and Nazi sympathizer who influenced far-right movements in Italy from the 1950s until his death in 1974, but shouldn’t his contempt for Christianity make him an unlikely hero for those purporting to defend “Judeo-Christian” values?
By excusing Donald Trump’s behavior, some evangelical leaders enabled the internet provocateur’s ascent.
The Conservative Political Action Conference (CPAC) takes place this week near Washington, D.C., the first such gathering since Donald Trump took office. The conference purports to be a gathering for like-minded folks who believe, generally, in the well-established principles of the conservative movement, as enunciated by the American Conservative Union.
This year, no speaker aside from President Trump himself drew more attention than the activist Milo Yiannopoulos, who was briefly granted a featured slot, a decision that set off plenty of disruption, garment-rending, gnashing of teeth, and infighting on the right.
Yiannopoulos, who prefers to go by MILO (yes, capitalized), is a controversial figure with dubious conservative credentials, most famous for being outrageous during speeches on his college campus tour, soberly called the “Dangerous Faggot” tour. Throughout the 2016 election, Yiannopoulos seemed to enjoy nothing quite so much as the crass, antagonistic side of candidate Trump. He didn’t just celebrate it; he rode it like a wave to greater stardom.