A reflection on the useless taboos that surround female nudity.
The tragic story of Amanda Todd is making the rounds. In seventh grade she met a guy online who told her she was beautiful and successfully persuaded her to flash her breasts during a video chat. He contacted her months later, having somehow figured out her identity, and tried to blackmail her with a screenshot. She shared her story in a heartbreaking video, chronicling how the photograph of her breasts was circulated among peers. It prompted merciless bullying. "Between the cyber-bullying and real-life harassment, the girl had a meltdown, began drinking, doing drugs, spiraled into depression, cutting herself," Rod Dreher writes, adding that "she has a poignant line about how that one image, on the Internet, lives forever." Watch for yourself:
"Melodramatic, emotionally troubled, even suicidal teens are nothing new. What got to me about this was the role technology in the hands of a malicious person played in driving this girl to murder herself," Dreher wrote. "Do you know Nietzsche's idea of Eternal Return? That we should act as if everything we do would have to be repeated forever. These days, simply as a precaution, teenagers should be taught to act as if everything they do will be online forever. Grim, but there you are."
As a parent I'll warn my kids about the permanence of the Web, its perils and how to avoid them. I'll particularly want any child of mine to understand the potential consequences of naked images of their bodies winding up online. It's prudent to teach kids how to navigate prevailing social norms, whatever they may be. But don't stories like this one demand something more from us than cautioning? When a child is bullied to the point of suicide partly because a photo of her breasts was circulated to her friends and family, shouldn't we ask ourselves why the Anglosphere retains social norms wherein being seen topless is regarded as horrifying and shameful?
Bullying is as troublesome a culprit in this case as whatever pretext prompted it. The stigma against female nudity is nevertheless something that costs women the world over very dearly. And it benefits none of the places where it prevails. Think of earth as a great natural experiment, where certain parts of Scandinavia think nothing of co-ed naked saunas, and certain parts of the Middle East require women to cover themselves in head-to-toe burkas on the street. How many Americans, Canadians, or Brits believe societies that enforce female modesty are better off? Or that countries where immodesty is most stigmatized are more moral or functional?
Yet we stigmatize the human body.
It is appropriate to castigate the photographer who captured images of Kate Middleton, the Duchess of Cambridge, sunbathing topless. For lucre, he needlessly humiliated someone, knowing the pain that it would cause. But there's more to the story. The coverage of the episode is perfectly summed up by the cover People magazine chose:
Given prevailing social norms, perhaps this was a nightmare for Duchess Middleton. If so, that would be an understandable reaction. But what does it say about our culture that it's plausibly a "nightmare" for a physically attractive 30-year-old woman to be seen topless at a private home with her husband? I wouldn't dream of criticizing any reaction Duchess Middleton might have to this. In a similar position I might well be very upset at the invasion of privacy. What I couldn't help but imagine is how awesome it would've been had Middleton called a press conference on a nude beach, arrived topless with a thousand women, and told the assembled press, "The photographer who invaded my privacy had no right to capture those images, but I face that nightmare on a daily basis. And no one gives a damn until one of them photographs me topless? Grow up. I am unashamed of my body. In fact, I rather love it, as all these women love their bodies. That makes some immature people uncomfortable. And it is their problem, not mine. If you're sitting at home obsessing over photos of me topless, or giggling and pointing on the streets, it's you who should feel embarrassment and shame, not me. I refuse to do it anymore."
Ours is a society where that People cover makes sense, and that speech would never happen. We're doing it wrong.
Note the subjects that are not being discussed here: sexual intercourse, hookups, abortion, religiosity, secularism, moral relativism. The impulse for many social and cultural conservatives will be to reject what I've written. I am interested in having that conversation and teasing out our assumptions. To preemptively clarify what I'm saying, permit me to remind you about Janet Jackson:
Above is the infamous Super Bowl halftime show that ended with her breast exposed for a split second. What bothered me about the ensuing controversy wasn't that some parents found the halftime show inappropriate for their kids, and complained about it through formal and informal channels. After all, the lyrics and choreography are rife with sexual innuendo and simulated sex acts.
Would I want my seven-year-old watching it?
I would not.
What boggles my mind is that most people never would've been upset if it weren't for the nipple slip. They were perfectly content sitting through five minutes of sexually suggestive content with their kids, only to freak out at a nipple, as if the exposed body part itself were the problem. I can imagine a lot of uncomfortable questions that show might prompt from a seven-year-old. "What's a nipple, daddy?" is a question I'd much rather tackle. We've all got them, after all (save our mannequins, which are less anatomically correct than in France or Spain or Argentina).
When I was twenty I spent a summer studying in Paris. I'd somehow persuaded Florida State University to let me tag along on their summer abroad program. I ate little but baguettes and pasta so that I could afford a weekend trip down to Nice and Monte Carlo with some classmates.
It's there that I set foot on my first topless beach.
At first my female classmates sunbathed in the American style. Forty-five minutes later they said to hell with it, took their tops off, and left the guys feeling slightly awkward and titillated for about five minutes, until everyone's notion of normal recalibrated. That's how fast the mental adjustment happens.
Most people have the same experience at nude beaches. It feels weird, and soon enough ... it doesn't. In places where women must wear head scarves, exposed locks can turn heads. In New York City, exactly no one thinks bare heads are sexually provocative, and New Yorkers have their heads turned on beaches in Rio until they don't. Sexual attraction is a force of nature. It is a proper function of civilization to bound it. Thou shalt not rape is a useful norm. Treat others as you'd want to be treated is a useful norm. It is shameful to let people see your breasts is a useless norm. Those who think otherwise at once give men too much and too little credit -- too little in that the sight of bare breasts is not enough to corrupt men; too much in that no matter how women dress, there is no getting around the fact that many men will lust after them.
Amanda Todd's story is a lot more complicated than an inane, pervasive taboo against exposed breasts. She felt foolish partly because a stranger she trusted betrayed her; she was bullied partly due to violating taboos against promiscuity, not just nudity (taboos that could themselves be the subject of a long critique). But it remains the case that her story wouldn't have been possible save for the flawed norms that make a big deal out of nudity, cloak it in shame and conflate it with especially transgressive promiscuity. Along with the bullying and slut-shaming that helped drive her to suicide, that norm deserves to be attacked. Yes, let's caution our kids about its existence. Let's also teach them that it's incorrect, that the human body is nothing to feel shame over, that the bullies are not merely unkind, but wrong on the merits. Let's raise kids who don't grow up to be offended by nipple slips, topless beaches, or mothers breastfeeding in public, and are therefore less vulnerable to youthful mistakes, rogue photographers, and slut-shaming.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.