Slashing taxes for the rich and benefits for the poor will only widen the already large gap between the have-a-lots and the have-nots
Do you have a microwave? Maybe a cell phone? Or even--gasp!--air conditioning? Congratulations, you're not really poor, at least according to Romney adviser Kevin Hassett. And that's why we shouldn't worry about inequality -- or so the story goes.
The Occupy movement made inequality a political football over the past year, but it's been a policy football for at least a few years. The debate between inequality skeptics and worriers has gone something like this. Income inequality hasn't gotten worse. Yes, it has. Well, what really matters is consumption inequality, and it hasn't gotten worse. Yes, it has. Well, what really, really matters is social mobility, and it hasn't gotten worse. Yes, it has. Well, social mobility might be overrated because rich people have better genes. Really.
That sound you hear is the goalposts getting moved again and again. And now it's Kevin Hassett's turn to try moving them. He starts by denying that income inequality is quite as bad as economists Thomas Piketty and Emmanuel Saez have measured, because they didn't include transfer payments. That might be damning, if the Congressional Budget Office (CBO) hadn't corroborated Piketty and Saez's results even after taxes and transfers. According to the CBO, incomes for the top 1 percent increased 275 percent since 1979, while incomes for the middle 60 percent increased only about 40 percent.
Next up in the inequality denial two-step is the claim that consumption inequality hasn't increased as much as income inequality -- not that Hassett thinks income inequality has increased! In other words, the rich getting richer hasn't translated into them buying more stuff than middle- and low-income households. There were actually a few studies arguing this back in the mid-aughts -- see this Dirk Krueger and Fabrizio Perri paper -- but recent research has challenged this. A 2012 paper by Orazio Attanasio, Erik Hurst and Luigi Pistaferri corrected for measurement problems in the consumption survey economists use, and found that consumption and income inequality have more or less tracked each other.
Okay, here's the big question -- so what? So what if both consumption and income inequality have shot up over the past few decades? What does it matter for Romney's economic agenda? Nothing, except for the fact that Romney's tax plan is a huge tax cut for the rich and Romney's budget is a huge cut for the poor. The latter gets less attention, but Romney would cut Medicaid by about $1.7 trillion, and his spending cap couldn't help but cut programs like food stamps, unemployment insurance and education by roughly 30 percent. In other words, it would pour some gasoline on inequality, and then shoot a rocket at it.
Even if you think inequality isn't an economic problem -- which it very well may be -- it's certainly a political and social one. An even wider chasm between the have-a-lots, the haves, and the have-nots risks a backlash against all sorts of policies, like free trade, that politicians and economists from both sides of the aisle agree on.
Inequality is real, and it has real consequences. Romney's agenda would make inequality bad enough that even he and his advisers could no longer deny it. And that's saying something.
The kerfuffle over Kim Kardashian's drug-promoting Instagram selfie is nothing new: As long as the FDA has existed, it's had to figure out how to regulate drug advertisements in new forms of communication technology.
Last month, celebrity-news and health-policy bloggers had a rare moment of overlap after the Food and Drug Administration issued a warning letter to the pharmaceutical company Duchesnay, which manufactures Diclegis, a prescription-only anti-nausea pill. At stake: a single selfie with a pill bottle.
The image that attracted the censure of the FDA was an Instagram post published on July 20 by Kim Kardashian. The image featured her upper torso, right hand, and face, with a bottle of Diclegis prominently displayed in her grasp. “OMG,” the caption began:
Have you heard about this? As you guys know my #morningsickness has been pretty bad. I tried changing things about my lifestyle and my diet, but nothing helped, so I talked to my doctor. He prescribed me Diclegis, I felt better, and most importantly it’s been studied and there is no increased risk to the baby.
It happened gradually—and until the U.S. figures out how to treat the problem, it will only get worse.
It’s 2020, four years from now. The campaign is under way to succeed the president, who is retiring after a single wretched term. Voters are angrier than ever—at politicians, at compromisers, at the establishment. Congress and the White House seem incapable of working together on anything, even when their interests align. With lawmaking at a standstill, the president’s use of executive orders and regulatory discretion has reached a level that Congress views as dictatorial—not that Congress can do anything about it, except file lawsuits that the divided Supreme Court, its three vacancies unfilled, has been unable to resolve.
On Capitol Hill, Speaker Paul Ryan resigned after proving unable to pass a budget, or much else. The House burned through two more speakers and one “acting” speaker, a job invented following four speakerless months. The Senate, meanwhile, is tied in knots by wannabe presidents and aspiring talk-show hosts, who use the chamber as a social-media platform to build their brands by obstructing—well, everything. The Defense Department is among hundreds of agencies that have not been reauthorized, the government has shut down three times, and, yes, it finally happened: The United States briefly defaulted on the national debt, precipitating a market collapse and an economic downturn. No one wanted that outcome, but no one was able to prevent it.
American society increasingly mistakes intelligence for human worth.
As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”
The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.
Shedding pounds is usually a losing battle—research suggests it’s better to just focus on building a healthy lifestyle.
“My own history of yo-yo dieting started when I was 15 and lasted about three decades,” said Sandra Aamodt, a neuroscientist and the author of Why Diets Make Us Fat, at the Aspen Ideas Festival on Saturday. “I lost the same 15 pounds pretty much every year during that same period, and gained it back regular as clockwork.”
This is a classic tale—the diet that doesn’t take, the weight loss that comes right back. The most recent, extreme, and highly publicized case was a study of contestants from the reality show The Biggest Loser, most of whom, six years after losing 100 to 200 pounds, had gained most of the weight back and had significantly slowed metabolisms.
The study provided a dramatic example of how the body fights against weight loss. And sheer force of will is rarely sufficient to fight back.
The June 23 vote represents a huge popular rebellion against a future in which British people feel increasingly crowded within—and even crowded out of—their own country.
I said goodnight to a gloomy party of Leave-minded Londoners a few minutes after midnight. The paper ballots were still being counted by hand. Only the British overseas territory of Gibraltar had reported final results. Yet the assumption of a Remain victory filled the room—and depressed my hosts. One important journalist had received a detailed briefing earlier that evening of the results of the government’s exit polling: 57 percent for Remain.
The polling industry will be one victim of the Brexit vote. A few days before the vote, I met with a pollster who had departed from the cheap and dirty methods of his peers to perform a much more costly survey for a major financial firm. His results showed a comfortable margin for Remain. Ten days later, anyone who heeded his expensive advice suffered the biggest percentage losses since the 2008 financial crisis.
Patrick Griffin, Bill Clinton’s chief congressional-affairs lobbyist, recalls the lead-up to the bill’s passage in 1994—and the steep political price that followed.
For those who question whether anything will ever be done to curb the use of military-grade weaponry for mass shootings in the United States, history provides some good news—and some bad. The good news is that there is, within the recent past, an example of a president—namely Bill Clinton—who successfully wielded the powers of the White House to institute a partial ban of assault weapons from the nation’s streets. The bad news, however, is that Clinton’s victory proved to be so costly to him and to his party that it stands as an enduring cautionary tale in Washington about the political dangers of taking on the issue of gun control.
In 1994, Clinton signed into law the Public Safety and Recreational Firearms Use Protection Act, placing restrictions on the number of military features a gun could have and banning large-capacity magazines for consumer use. Given the potent dynamics of Second Amendment politics, it was a signal accomplishment. Yet the story behind the ban has been largely forgotten, in part because it expired in 2004 and in part because the provision was embedded in the larger crime bill.
Demographic data shows that a Briton’s education level may be the strongest indication of how he or she voted.
Britain has voted to leave the European Union. The news surprised many people, including the British, who have learned that while brushing off early statistical warnings is tempting, it doesn’t make it any easier when those warnings turn out to be right. Give yourselves a break, I say: Polls are fickle, anecdote is limited, and prevailing wisdom is sometimes impossible to shake. (Though these remorseful Brexit voters don’t have an excuse.)
There’s a silver lining for statistics, however. With the close of Britain’s referendum, political analysts now have a concrete dataset to examine: the actual vote totals in the United Kingdom. This data, when matched with regional demographic information from the U.K. Census, gives insight into who actually voted to leave or remain.
How the Brexit vote activated some of the most politically destabilizing forces threatening the U.K.
Among the uncertainties unleashed by the Brexit referendum, which early Friday morning heralded the United Kingdom’s coming breakup with the European Union, was what happens to the “union” of the United Kingdom itself. Ahead of the vote, marquee campaign themes included, on the “leave” side, the question of the U.K.’s sovereignty within the European Union—specifically its ability to control migration—and, on the “remain” side, the economic benefits of belonging to the world’s largest trading bloc, as well as the potentially catastrophic consequences of withdrawing from it. Many of the key arguments on either side concerned the contours of the U.K.-EU relationship, and quite sensibly so. “Should the United Kingdom remain a member of the European Union or leave the European Union?” was, after all, the precise question people were voting on.
Thoughts on the first episode of ESPN’s five-part documentary
Every fall Sunday, when I was a kid, half an hour before the pre-game shows and an hour before the games themselves, I would tune into the latest offering from NFL Films. This was the pre-pre-game show—an assembly of short films derived from the massive archive of professional football. Steve Sabol, whose father founded NFL Films, would preside. He’d offer an introduction and then throw it to John Facenda or Jefferson Kaye, who would narrate the career highlights of players like Gale Sayers, Earl Campbell, or Dick “Night Train” Lane.
“Highlights” understates what NFL Films was actually doing. The shorts were drawn from some of the most beautifully shot footage in all of sports. It wasn’t unheard of for NFL Films to go high concept—this piece on football and ballet, with cameos from Allen Ginsberg and George Will, may be the definitive example. Great football plays would be injected not with the normal hurrahs, but with poetry. When Facenda, for instance, wanted to introduce a spectacular touchdown run by Marcus Allen, he did so in the omniscient third person: “On came Marcus Allen—running with the night.”