Too much demand for liberal arts didn't kill the job market. Too little aggregate demand did.
Is our college students learning?
Rarely is the question not asked nowadays. Graduates now face a tough labor market and even tougher debt burdens, which have left many struggling to find work that pays enough to pay back what they owe. Today, as my colleague Jordan Weissmann points out, young alums aren't stuck in dead-end jobs much more than usual (despite the scare stories you may have heard). But that's cold comfort for grads who borrowed a lot to cover the high cost of their degrees.
There are two, well, schools of thought about why freshly minted grads have had such a tough time recently. You can blame the smarty-pants majors or blame the economy. In other words, students can't get good jobs either because they aren't learning (at least not the right things) in college, or because there aren't enough good jobs, period.
This is far from an academic debate. If recent grads can't find good work because they didn't learn any marketable skills, there's little the government can do to help, besides "nudging" current students to be more practical. And that's exactly what conservative governors in Florida and North Carolina are considering with proposals to charge humanities majors higher tuition than, say, science majors at state schools.
But there's an obvious question. If liberal arts majors "didn't learn much in school," as Jane Shaw put it in the Wall Street Journal, why haven't they always had trouble finding work? Are there just more of them now, or is this lack of learning just a recent phenomenon? Well, as you can see in the chart below, there's no correlation over the past decade between the share of grads in the most maligned majors and the unemployment rate for college grads (which has been inverted here). It's hard to see how the nonexistent rise of the liberal arts explains the decline of job prospects.
(Note: I compiled data from the National Center for Education Statistics to come up with the percentage of students in "squishy" majors, a category that includes gender and cultural studies, English and foreign-language literature, liberal arts, philosophy, and theater and visual arts. I multiplied the unemployment rate by -1, so employment falls when the line does.)
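Purely as an illustration of the note above, here is a minimal sketch of that calculation in Python, assuming the NCES degree counts and the college-grad unemployment series have already been exported to CSV. The file names and column labels (degrees.csv, unemployment.csv, squishy, college_grad_rate) are hypothetical stand-ins, not the actual dataset behind the chart.

```python
# Sketch of the chart's two series, under assumed file and column names.
import pandas as pd

# Hypothetical inputs: degrees.csv has one row per year with total
# bachelor's degrees and degrees in the "squishy" majors listed above;
# unemployment.csv has the unemployment rate for college grads by year.
degrees = pd.read_csv("degrees.csv")      # columns: year, total, squishy
unemp = pd.read_csv("unemployment.csv")   # columns: year, college_grad_rate

# Share of graduates in the maligned majors, per year.
degrees["squishy_share"] = degrees["squishy"] / degrees["total"]

# Invert the unemployment rate so the line falls when employment falls.
unemp["inverted_rate"] = unemp["college_grad_rate"] * -1

# Merge on year and check the (non-)relationship the article describes.
merged = degrees.merge(unemp, on="year")
print(merged[["squishy_share", "inverted_rate"]].corr())
```

A correlation near zero in that printout would correspond to the flat relationship described above; the point of the inversion is only cosmetic, so that both lines read "down is bad" on the same chart.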
Now, maybe liberal arts majors stopped learning things circa 2008 ... or maybe something else was happening then. Something like a global financial crisis. Indeed, there's no mystery when it comes to college grad unemployment; it moves in tandem with private non-residential fixed investment (that is, the state of the economy).
In other words, too much demand for liberal arts didn't kill the job market. Too little aggregate demand did. Now, if our policymakers could just learn that....
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
The results of the referendum are, in theory, not legally binding.
Lest we think the Euroskepticism displayed this week by British voters is new, let me present a scene from the BBC’s Yes, Minister, a comedy about the U.K. civil service’s relationship with a minister. The series ran from 1980 to ’84 (and, yes, it was funny), at a time when the European Union was a mere glint in its founders’ eyes.
The Europe being referred to in the scene is the European Economic Community (EEC), an eventually 12-member bloc established in the mid-1950s to bring about greater economic integration among its members.
In many ways, the seeds of the U.K.’s Thursday referendum on its membership in the European Union were sown soon after the country joined the now-defunct EEC in 1973. Then, as now, the ruling Conservative Party and opposition Labour, along with the rest of the country, were deeply divided over the issue. In the run-up to the general election the following year, Labour promised in its manifesto to put the U.K.’s EEC membership to a public referendum. Labour eventually came to power and Parliament passed the Referendum Act in 1975, fulfilling that campaign promise. The vote was held on June 5, 1975, and the result was what the political establishment had hoped for: an overwhelming 67 percent of voters supported the country’s EEC membership.
Unexpected discoveries in the quest to cure an extraordinary skeletal condition show how medically relevant rare diseases can be.
When Jeannie Peeper was born in 1958, there was only one thing amiss: her big toes were short and crooked. Doctors fitted her with toe braces and sent her home. Two months later, a bulbous swelling appeared on the back of Peeper’s head. Her parents didn’t know why: she hadn’t hit her head on the side of her crib; she didn’t have an infected scratch. After a few days, the swelling vanished as quickly as it had arrived.
When Peeper’s mother noticed that the baby couldn’t open her mouth as wide as her sisters and brothers, she took her to the first of many doctors, seeking an explanation for her seemingly random assortment of symptoms. Peeper was 4 when the Mayo Clinic confirmed a diagnosis: she had a disorder known as fibrodysplasia ossificans progressiva (FOP).
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
The Republican candidate is deeply unpopular, and his Democratic rival is promoting her own version of American nationalism.
American commentators have spent the weekend pondering the similarities between Britain’s vote to leave the European Union and America’s impending vote on whether to take leave of its senses by electing Donald Trump. The similarities have been well rehearsed: The supporters of Brexit, like the supporters of Trump, are older, non-college-educated, non-urban, distrustful of elites, xenophobic, and nostalgic. Moreover, many British commentators discounted polls showing that Brexit might win, just as many American commentators, myself very much included, discounted polls showing that Trump might win the Republican nomination. Brexit may even result in the installation this fall of a new British prime minister, Boris Johnson, who is entertaining, self-promoting, vaguely racist, doughy, and orange. It’s all too familiar.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
The city is riding high after the NBA Finals. But with the GOP convention looming, residents are bracing for disappointment.
Cleveland’s in a weird mood.
My son and I attended the Indians game on Father’s Day, the afternoon before Game 7 of the NBA Finals—which, in retrospect, seems like it should be billed, blockbuster-style, simply as The Afternoon Before—when the Cavaliers would take on the Golden State Warriors and bring the city its first major-league sports championship in 52 years.
I am 52 years old. I’ve lived in Northeast Ohio all my life. I know what Cleveland feels like. And it’s not this.
In the ballpark that day, 25,269 of us sat watching a pitcher’s duel, and the place was palpably subdued. The announcer and digitized big-screen signage made no acknowledgement of the city’s excitement over the Cavaliers. There were no chants of “Let’s Go Cavs,” no special seventh-inning-stretch cheer for the Indians’ basketball brothers, who play next door in the Quicken Loans Arena, which in a few weeks will host the Republican National Convention.
There's a reason why the first thing we often ask someone when we meet them, right after we learn their name, is "where's home for you?"
My house is a shrine to my homes. There's a triptych of sunsets next to my bedroom door, dusk forever falling over the small Michigan town where I grew up, the beach next to my college dorm, and Place de la Concorde in Paris, where I spent a cliché but nonetheless happy semester. And that's only the beginning. Typographic posters of Michigan and Chicago hang above my bed, a photo of taxis zooming around Manhattan sits atop my dresser, and a postcard of my hometown's famous water tower is taped to my door. My roommate and I have an entire wall in our kitchen plastered with maps of places we've been, and twin Ferris wheels, one at Navy Pier, one at Place de la Concorde, are stacked on top of one another in my living room.
A hotly contested, supposedly ancient manuscript suggests Christ was married. But believing its origin story—a real-life Da Vinci Code, involving a Harvard professor, a onetime Florida pornographer, and an escape from East Germany—requires a big leap of faith.
On a humid afternoon this past November, I pulled off Interstate 75 into a stretch of Florida pine forest tangled with runaway vines. My GPS was homing in on the house of a man I thought might hold the master key to one of the strangest scholarly mysteries in recent decades: a 1,300-year-old scrap of papyrus that bore the phrase “Jesus said to them, My wife.” The fragment, written in the ancient language of Coptic, had set off shock waves when an eminent Harvard historian of early Christianity, Karen L. King, presented it in September 2012 at a conference in Rome.
Never before had an ancient manuscript alluded to Jesus’s being married. The papyrus’s lines were incomplete, but they seemed to describe a dialogue between Jesus and the apostles over whether his “wife”—possibly Mary Magdalene—was “worthy” of discipleship. Its main point, King argued, was that “women who are wives and mothers can be Jesus’s disciples.” She thought the passage likely figured into ancient debates over whether “marriage or celibacy [was] the ideal mode of Christian life” and, ultimately, whether a person could be both sexual and holy.
Why professors, librarians, and politicians are shunning the liberal arts in the name of STEM
I have been going to academic conferences since I was about 12 years old. Not that I am any sort of prodigy—both of my parents are, or were at one point, academics, so I was casually brought along for the ride. I spent the bulk of my time at these conferences in hotel lobbies, transfixed by my Game Boy, waiting for my mother to be done and for it to be dinnertime. As with many things that I was made to do as a child, however, I eventually came to see academic conferences as an integral part of my adult life.
So it was that, last year, I found myself hanging out at the hotel bar at the annual conference of the Modern Language Association, despite the fact that I am not directly involved with academia in any meaningful way. As I sipped my old fashioned, I listened to a conversation between several aging literature professors about the “digital humanities,” which, as far as I could tell, was a needlessly jargonized term for computers in libraries and writing on the Internet. The digital humanities were very “in” at MLA that year. They had the potential, said a white-haired man in a tweed jacket, to modernize and reinvigorate humanistic scholarship, something that all involved seemed to agree was necessary. The bespectacled scholars nodded their heads with solemn understanding, speaking in hushed tones about how they wouldn’t be making any new tenure-track hires that year.