The story we think we know is that the institution of marriage is crumbling and on the brink of oblivion. The real story is much more complicated.
National Marriage Week USA kicks off today, and for many people, a national booster movement for marriage could not come soon enough. The recession did a number on American matrimony, as you've surely heard. The collapse in marriage rates is cited as one of the most important symptoms -- or is it a cause? -- of economic malaise for the middle class. But the statistics aren't always what they seem, and the reasons behind marriage's so-called decline aren't all negative.
At first blush, the institution of marriage is crumbling. In 1960, 72% of all adults over 18 were married. By 2010, the number fell to 51%. You can fault the increase in divorces that peaked in the 1970s. Or you could just blame the twentysomethings. The share of married adults 18-29 plunged from 59% in 1960 to 20% in 2010. Twenty percent!
The simplest summary of the findings of the economists Betsey Stevenson and Justin Wolfers, who have studied marriage trends closely, is: It's really, really complicated. The full answer for the delay and decline of marriage would touch on birth control technology (which extends courtships by reducing the cost of waiting to get married), liberal divorce laws (which create "churn" in the marriage market by increasing divorces and new marriages), and even washing/drying machines (which both eliminate the need for men to marry lower-earning women to do housework and also free up women to work and study).
One important lesson from Stevenson and Wolfers is that, as much as it feels like things are changing very rapidly, a longer view on marriage trends reveals a more boring picture. If you pull back the lens, not to the 1960s but to the 1860s, the marriage rate and the divorce rate stick stubbornly to
long-term trend lines.
Marriages-per-thousand people are declining,
but slowly, after spiking in the 1940s. Divorces-per-thousand people are rising, but slowly, after spiking in the 1970s. Even in the Great Recession, which theoretically scared couples from investing in matrimony, we've seen "the same rate of decline that existed during the preceding economic boom, the previous bust and both the boom and the bust before that," Wolfers wrote.
The median age of marriage for men and women is rising slowly into the late 20s. For men, that's not so unusual compared with historical averages. What is new -- really, really new -- is the rising marriage age for women.
The education revolution for women -- one of the happiest trends of the 20th century -- carries important implications for the marriage market. First, if women are going to college, more of their 18-22 years will be taken up by history classes rather than husbands. Second, when these women start earning money, fewer need to marry for financial reasons, which means they can afford to put off marriage. Third, birth control -- a little bit of biotechnology -- has helped their cause by reducing the costs of being sexually active and single.
THE TWO MARRIAGE TRENDS
The economics of marriage suggest it's like any other investment. Women are more likely to get hitched when they see big potential gains from a union. That explains why the apparently monolithic Decline of Marriage is in fact two polar opposite trends. The first points toward the revitalization of marriage. The second points to decline.
First, for highly educated or rich women, marriage rates are actually rising. It was once the case that a college degree was the equivalent of punching your spinster card. In the late 1800s, half of all college-educated women never married. But in the last 40 years, marriage rates have increased for the top 10 percent of female earners more than for any other group, Michael Greenstone and Adam Looney found in a new report from The Hamilton Project. A 2004 American Community Survey also found college-educated women were 10 percentage points more likely to be currently married than women with less education.
Second, there is a parallel trend that really does look like the crumbling of marriage. Among less-educated and poorer women, marriage is in outright decline. The bottom half of female earners saw their marriage rates decline by 25 percentage points, Greenstone and Looney find and show in the graph above.
There is evidence that social and economic sorting in America creates clusters of people who match up by education, salary, and politics. As a result, perhaps poor or under-educated women are more likely to be matched with poor or less-educated men who offer a worse return on the marriage investment.
"Marriage has become the fault line dividing America's classes,"
Charles Murray writes in his new book on the state of white America, Coming Apart. In a Wall Street Journal
column based on the book, Murray compared those living in the country's
educated white-collar world ("Belmont") and less-educated blue-collar world ("Fishtown"). The divergence between Belmont and Fishtown begins at the altar,
he argued:
In 1960, extremely high proportions of whites in both Belmont and
Fishtown were married--94% in Belmont and 84% in Fishtown. In the 1970s,
those percentages declined about equally in both places. Then came the
great divergence. In Belmont, marriage stabilized during the mid-1980s,
standing at 83% in 2010. In Fishtown, however, marriage continued to
slide; as of 2010, a minority (just 48%) were married. The gap in
marriage between Belmont and Fishtown grew to 35 percentage points, from
just 10.
The decline of marriage in Fishtown matters, Murray says,
because we have an abiding national interest in seeing children raised in two-parent households. Murray might be overstating the importance of living with married parents, as opposed to couples in cohabitation (which is way up in the last 20 years). But if he's right, the following statistic should scare the heck out of you: In 1970, only 6% of births
to undereducated "Fishtown" women were out of wedlock; by 2008, that share had grown to a whopping 44%.
MEN AND MARRIAGE
If Pride and Prejudice were written today, it might begin, "It is a truth universally acknowledged, that a single man in possession of a good fortune is practically an endangered species." In the last two years, this magazine has published two cover stories -- on the End of Men and the rise of the Single Lady -- that posited that "a marriage regime based on men's overwhelming economic dominance may be passing into extinction."
Marriage Extinction Watch is on high alert in the poorer segments of the country. In 2007, among women without a high-school diploma, just 43 percent were married. One reason was that the men they're most likely to marry are faring so poorly. The wages for all prime-age men have fallen by 40% in the last 40 years,
after you account for the fifth of these men who have dropped out of
the labor force. As the Hamilton Project reported last week, falling male earnings track closely to declining marriage rates. "At the bottom 25th percentile of earnings ... half of men are married, compared with 86 percent
in 1970," Greenstone and Looney write.
We cannot know why millions of couples who might have been tying the knot 40 years ago aren't doing so today. Cohabitation is a factor. Divorce is a factor. But so is economics.
It might help to think of the marriage marketplace as, well, a marketplace. Historically, men and women have gone to the market to marry for the same reason that employers go to the market to hire. They are looking to hook up with a productive "partner." As women are likely to seek partners in their income class, poorer women are more likely to be surrounded by men with low and falling fortunes, and more have chosen to forgo a union that could become a financial drain.
'TIL DEATH: A SILVER LINING?
If twentysomethings are responsible for inspiring "end of marriage" talk, let's praise seniors for keeping the institution alive by staying alive. As the graph above shows, today's Americans aged 60 and older are as likely to be married as any generation before them at that age. In fact, those over 65 are now more likely to be married than those in their 20s -- a moment that is unique in
history. Adults are living longer, having fewer children, divorcing at a
higher rate, and finding new partners, Stevenson and Wolfers write.
The graph above is a deceptively simple picture that says a lot about how the institution of marriage has changed in the last 130 years. First, it shows how unusually early twentysomethings married around 1960, which suggests that comparisons to that generation imply an exaggerated collapse. Second, it shows that, at every age up to 60, today's Americans are less likely to be hitched than any generation before them. Third, it suggests that seniors are marrying partners close to their own age; the shrinking gap between the ages of husbands and wives helps to explain why couples are more likely to sort within their income group. Finally, it implies that even with rising divorces, the market for re-marriages is strong.
If somebody tells you nobody your age is getting married anymore, the optimistic rejoinder is: Just wait. Marriage isn't dead. It's just changed, and running a few years behind schedule.