A reflection on the useless taboos that surround female nudity.
The tragic story of Amanda Todd is making the rounds. In seventh grade she met a guy online who told her she was beautiful and successfully persuaded her to flash her breasts during a video chat. He contacted her months later, having somehow figured out her identity, and tried to blackmail her with a screenshot. She shared her story in a heartbreaking video, chronicling how the photograph of her breasts was circulated among peers. It prompted merciless bullying. "Between the cyber-bullying and real-life harassment, the girl had a meltdown, began drinking, doing drugs, spiraled into depression, cutting herself," Rod Dreher writes, adding that "she has a poignant line about how that one image, on the Internet, lives forever." Watch for yourself:
"Melodramatic, emotionally troubled, even suicidal teens are nothing new. What got to me about this was the role technology in the hands of a malicious person played in driving this girl to murder herself," Dreher wrote. "Do you know Nietzsche's idea of Eternal Return? That we should act as if everything we do would have to be repeated forever. These days, simply as a precaution, teenagers should be taught to act as if everything they do will be online forever. Grim, but there you are."
As a parent I'll warn my kids about the permanence of the Web, its perils and how to avoid them. I'll particularly want any child of mine to understand the potential consequences of naked images of their bodies winding up online. It's prudent to teach kids how to navigate prevailing social norms, whatever they may be. But don't stories like this one demand something more from us than cautioning? When a child is bullied to the point of suicide partly because a photo of her breasts was circulated to her friends and family, shouldn't we ask ourselves why the Anglosphere retains social norms wherein being seen topless is regarded as horrifying and shameful?
Bullying is as troublesome a culprit in this case as whatever pretext prompted it. The stigma against female nudity is nevertheless something that costs women the world over very dearly. And it benefits none of the places where it prevails. Think of earth as a great natural experiment, where certain parts of Scandinavia think nothing of co-ed naked saunas, and certain parts of the Middle East require women to cover themselves in head-to-toe burkas on the street. How many Americans, Canadians, or Brits believe societies that enforce female modesty are better off? Or that countries where immodesty is most stigmatized are more moral or functional?
Yet we stigmatize the human body.
It is appropriate to castigate the photographer who captured images of Kate Middleton, the Duchess of Cambridge, sunbathing topless. For lucre, he needlessly humiliated someone, knowing the pain that it would cause. But there's more to the story. The coverage of the episode is perfectly summed up by the cover People magazine chose:
Given prevailing social norms, perhaps this was a nightmare for Duchess Middleton. If so, that would be an understandable reaction. But what does it say about our culture that it's plausibly a "nightmare" for a physically attractive 30-year-old woman to be seen topless at a private home with her husband? I wouldn't dream of criticizing any reaction Duchess Middleton has to this. In a similar position I might well be very upset at the invasion of privacy. What I couldn't help but imagine is how awesome it would've been had Middleton called a press conference on a nude beach, arrived topless with a thousand women, and told the assembled press, "The photographer who invaded my privacy had no right to capture those images, but I face that nightmare on a daily basis. And no one gives a damn until one of them photographs me topless? Grow up. I am unashamed of my body. In fact, I rather love it, as all these women love their bodies. That makes some immature people uncomfortable. And it is their problem, not mine. If you're sitting at home obsessing over photos of me topless, or giggling and pointing on the streets, it's you who should feel embarrassment and shame, not me. I refuse to do it anymore."
Ours is a society where that People cover makes sense, and that speech would never happen. We're doing it wrong.
Note the subjects that are not being discussed here: sexual intercourse, hookups, abortion, religiosity, secularism, moral relativism. The impulse for many social and cultural conservatives will be to reject what I've written. I am interested in having that conversation and teasing out our assumptions. To preemptively clarify what I'm saying, permit me to remind you about Janet Jackson:
Above is the infamous Super Bowl halftime show that ended with her breast exposed for a split second. What bothered me about the ensuing controversy wasn't that some parents found the halftime show inappropriate for their kids, and complained about it through formal and informal channels. After all, the lyrics and choreography are rife with sexual innuendo and simulated sex acts.
Would I want my seven-year-old watching it?
I would not.
What boggles my mind is that most people never would've been upset if it weren't for the nipple slip. They were perfectly content sitting through five minutes of sexually suggestive content with their kids, only to freak out at a nipple, as if the exposed body part itself was the problem. I can imagine a lot of uncomfortable questions that show might prompt from a seven-year-old. "What's a nipple, daddy?" is a question I'd much rather tackle. We've all got them, after all (save our mannequins, which are less anatomically correct than in France or Spain or Argentina).
When I was twenty I spent a summer studying in Paris. I'd somehow persuaded Florida State University to let me tag along on their summer abroad program. I ate little but baguettes and pasta so that I could afford a weekend trip down to Nice and Monte Carlo with some classmates.
It's there that I set foot on my first topless beach.
At first my female classmates sunbathed in the American style. Forty-five minutes later they said to hell with it, took their tops off, and left the guys feeling slightly awkward and titillated for about five minutes, when everyone's notion of normal recalibrated. That's how fast the mental adjustment happens.
Most people have the same experience at nude beaches. It feels weird, and soon enough ... it doesn't. In places where women must wear head scarves, exposed locks can turn heads. In New York City, exactly no one thinks bare heads are sexually provocative, and New Yorkers have their heads turned on beaches in Rio until they don't. Sexual attraction is a force of nature. It is a proper function of civilization to bound it. Thou shalt not rape is a useful norm. Treat others as you'd want to be treated is a useful norm. It is shameful to let people see your breasts is a useless norm. Those who think otherwise at once give men too much and too little credit -- too little in that the sight of bare breasts is not enough to corrupt men; too much in that no matter how women dress, there is no getting around the fact that many men will lust after them.
Amanda Todd's story is a lot more complicated than an inane, pervasive taboo against exposed breasts. She felt foolish partly because a stranger she trusted betrayed her; she was bullied partly due to violating taboos against promiscuity, not just nudity (taboos that could themselves be the subject of a long critique). But it remains the case that her story wouldn't have been possible save for the flawed norms that make a big deal out of nudity, cloak it in shame and conflate it with especially transgressive promiscuity. Along with the bullying and slut-shaming that helped drive her to suicide, that norm deserves to be attacked. Yes, let's caution our kids about its existence. Let's also teach them that it's incorrect, that the human body is nothing to feel shame over, that the bullies are not merely unkind, but wrong on the merits. Let's raise kids who don't grow up to be offended by nipple slips, topless beaches, or mothers breastfeeding in public, and are therefore less vulnerable to youthful mistakes, rogue photographers, and slut-shaming.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.