The controversy reveals that we wrongly assume, with dangerous implications for public health, that women who get breast implants must be rich.
A Venezuelan woman sits next to the two PIP breast implants that she had removed. The yellow implant at left, made by Poly Implant Prothese, had broken. / AP
Over the past few months, the leak-prone breast
implants of the French company Poly Implant Prothese have turned into an international furor. At issue
is the company's use of non-medical-grade silicone, which carries an
increased risk of rupture. The gel inside the implant, once released,
can inflame the surrounding tissue. Though there's limited evidence for this, fears
persist about the irritation leading to an increased risk for
cancer. The scope of the potential impact is tremendous. As of last
week, the estimate cited by
the BBC was that "up to 400,000 women in 65 countries are believed to
have been given implants" from the company. PIP founder Jean-Claude Mas is now facing
charges of involuntary injury, while policy professionals, journalists,
and the public are asking how the implants could have made it past
safety inspectors to have reached so many women.
If there were ever a time to move beyond our dangerously facile debate about
cosmetic surgery, it's now. European media have been hammering the point that it's time to take implant safety as seriously as
drug safety, and to take cosmetic procedures as seriously as any other operations, which, after all, is what they are.
To answer the question of how the
implants could have made it to so many women, though, one has to ask how
and why it is that so many women are getting breast implants at all. The Paris-based plastic-surgery and dermatology
organization IMCAS recently released new numbers that help explain. Cosmetic surgeries not only rose by 10.1 percent in 2011 but are expected to
rise by another 11.12 percent in 2012, despite the scare.
A significant portion of the debate touches on a longstanding theme of how people think about cosmetic surgery: who, if anyone, should help cover the costs of removing and replacing defective implants? Though insurers and national
governments have already declared themselves willing to foot the bill for the
faulty implants' removal, that doesn't come without caveats. The implicit moral question the responsible officials seem to be asking themselves is, Should governments compensate for losses in botched vanity
projects? And herein lies the need
for a more careful look at the phenomenon of plastic surgery.
In France, the government will only pay for new implants if the originals
were for reconstructive surgery. Politicians in Germany have been urging
similar policies. Elizabeth Niejahr neatly summarized this thinking in Die Zeit as "one
shouldn't make cosmetic surgeries even more popular. ... Those who, out
of vanity, decide to undergo the knife, should be aware of the
consequences." The SPD's Carola Reimann, Niejahr pointed out, has also argued
that "It's about the beauty ideal and the pressure to conform."
These politicians have a point, namely about moral hazard. But behind these sentiments lies a deep confusion about plastic surgery that's worth
surfacing. The idea that implants are for "vanity" seems to imply selfishness and, with it, an exercise
of will. But the charge that implants are about a "pressure to conform" implies the opposite. Which is driving
the trend towards plastic surgery? A projected growth in surgeries,
despite the dire stories of the past year, begins to look like a
pathology not just in individual women and men, but in society itself;
if that's the case, how helpful is it to blame individuals for
succumbing to what appears to be a mass psychosis?
Niejahr, criticizing the French and potential German positions, makes an
important related point: many politicians are assuming that the women
paying for non-reconstructive implants must be rich, and are adjusting
their rhetoric accordingly. But a mere glance at "trash talk shows,"
Niejahr notes, suggests this is "a false picture." How? "There may be many
women who save up for new breasts, or who, with what little credit they have,
choose a larger chest over a new car."
And it's not just an inaccurate image: the suggestion that women who get breast
implants must be rich is a dangerous misconception with real implications. The enormous black market in cosmetic surgery, as
well as the apparently flourishing cosmetic surgery tourism trade -- with
terrifying stories of incompetently executed, dangerous procedures -- should be
evidence enough, even without Niejahr's trashy TV.
This isn't to say that governments should pay
for implant replacements (though Niejahr does make that argument): it's
questionable fiscal policy to pay for implant replacements in the
current European economic climate, even before you get to the possible
moral hazard argument. But in the debate over the appropriate policy
position, European politicians do need to be careful about the
assumptions they convey in their rhetoric.
As these few critics in the current Continental debate show, this may be the
perfect time to probe the dark undercurrents of plastic surgery trends.
Tighter regulations may reduce dangers within the European Union, but they don't
change the fact that these surgeries still carry risks -- and they're
definitely not going to help the women who head to Mexico. Double-D
millionaires aren't a public health problem -- but they are a disturbingly convenient fiction.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
Shedding pounds is usually a losing battle—research suggests it’s better to just focus on building a healthy lifestyle.
“My own history of yo-yo dieting started when I was 15 and lasted about three decades,” said Sandra Aamodt, a neuroscientist and the author of Why Diets Make Us Fat, at the Aspen Ideas Festival on Saturday. “I lost the same 15 pounds pretty much every year during that same period, and gained it back regular as clockwork.”
This is a classic tale—the diet that doesn’t take, the weight loss that comes right back. The most recent, extreme, highly publicized case was that of the study done on contestants from the reality show The Biggest Loser, most of whom, six years after losing 100 to 200 pounds, had gained most of it back, and had significantly slowed metabolisms.
The study provided a dramatic example of how the body fights against weight loss. And sheer force of will is rarely sufficient to fight back.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
The kerfuffle over Kim Kardashian's drug-promoting Instagram selfie is nothing new: As long as the FDA has existed, it's had to figure out how to regulate drug advertisements in new forms of communication technology.
Last month, celebrity-news and health-policy bloggers had a rare moment of overlap after the Food and Drug Administration issued a warning letter to the pharmaceutical company Duchesnay, which manufactures Diclegis, a prescription-only anti-nausea pill. At stake: a single selfie with pill bottle.
The image that attracted the censure of the FDA was an Instagram posted on July 20 by Kim Kardashian. The image featured her upper torso, right hand, and face, with a bottle of Diclegis prominently displayed in her grasp. “OMG,” the caption began:
Have you heard about this? As you guys know my #morningsickness has been pretty bad. I tried changing things about my lifestyle and my diet, but nothing helped, so I talked to my doctor. He prescribed me Diclegis, I felt better, and most importantly it’s been studied and there is no increased risk to the baby.
Patrick Griffin, Bill Clinton’s chief congressional-affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten since it expired in 2004 and, in part, because the provision was embedded in the larger crime bill.
The U.K.’s vote to leave the European Union betrays a failure of empathy and imagination among its leaders. Will America’s political establishment fare any better?
If there is a regnant consensus among the men and women who steer the Western world, it is this: The globe is flattening. Borders are crumbling. Identities are fluid. Commerce and communications form the warp and woof, weaving nations into the tight fabric of a global economy. People are free to pursue opportunity, enriching their new homes culturally and economically. There may be painful dislocations along the way, but the benefits of globalization heavily outweigh its costs. And those who cannot see this, those who would resist it, those who would undo it—they are ignorant of their own interests, bigoted, xenophobic, and backward.
So entrenched is this consensus that, for decades, in most Western democracies, few mainstream political parties have thought to challenge it. They have left it to the politicians on the margins of the left and the right to give voice to such sentiments—and voicing such sentiments relegated politicians to the margins of political life.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
Thoughts on the first episode of ESPN’s five-part documentary
Every fall Sunday, when I was a kid, half an hour before the pre-game shows and an hour before the games themselves, I would tune into the latest offering from NFL Films. This was the pre-pre-game show—an assembly of short films derived from the massive archive of professional football. Steve Sabol, whose father founded NFL Films, would preside. He’d offer an introduction, then throw it to John Facenda or Jefferson Kaye, who would narrate the career highlights of players like Gale Sayers, Earl Campbell, or Dick “Night Train” Lane.
“Highlights” understates what NFL Films was actually doing. The shorts were drawn from some of the most beautifully shot footage in all of sports. It wasn’t unheard of for NFL Films to go high concept—this piece on football and ballet, with cameos from Allen Ginsberg and George Will, may be the definitive example. Great football plays would be injected not with the normal hurrahs, but with poetry. When Facenda, for instance, wanted to introduce a spectacular touchdown run by Marcus Allen, he did so in the omniscient third person: “On came Marcus Allen—running with the night.”