Romney's plan only works if you assume he has a different plan or use a magic growth asterisk
Paul Ryan finally had enough time to go through the math of the Romney tax plan during the vice-presidential debate. He didn't use it. Ryan filibustered instead. About the most specific he got was citing "six studies" he said vindicate the plan's mathematical plausibility.
Except they don't.
Romney's tax plan is a three-legged stool that doesn't stand. Here's how it works -- or doesn't. Romney wants to 1) cut tax rates across the board by 20 percent, 2) cut tax expenditures to pay for these tax cuts, and 3) maintain progressivity. The problem, as the Tax Policy Center pointed out, is there aren't enough tax expenditures for the rich to pay for all the tax cuts for the rich. Romney's plan only works if he cuts out the tax cuts for the rich, raises taxes on the middle class, or explodes the deficit. In other words, Romney can pick two, and only two, of his tax goals -- what Matt Yglesias of Slate calls the "Romney Trilemma".
That sound you hear is the three-legged stool falling down.
All this hasn't stopped a fight against the tyranny of arithmetic. The defenses of the Romney tax plan generally fall into three broad categories. The first assumes the plan will set off magic growth of the monster variety; the second assumes Romney defines "middle-class" differently than he does; and the third assumes Romney would eliminate tax expenditures he has indicated he would not eliminate. Let's briefly consider the six such "studies" that Ryan cited -- most are actually blog posts -- in turn.
1. Harvey Rosen paper. Rosen, a professor at Princeton, assumed Romney's lower tax rates would kickstart enough growth to pay for the revenue hole those lower rates would create. This seems dubious. Alan Viard and Alex Brill of the conservative American Enterprise Institute (AEI) have argued that revenue-neutral tax reform is unlikely to have big growth effects -- incentives don't change much if overall tax burdens don't, even if tax rates do. And besides, the Tax Policy Center used aggressive growth estimates from Romney adviser Greg Mankiw's work to test Romney's plan. It still didn't add up.
2. Marty Feldstein Wall Street Journal op-ed. Former Reagan adviser and current Harvard professor Feldstein argued Romney's plan works if you assume growth would be much stronger and if you define middle class as households making less than $100,000 rather than households making less than $200,000. This latter figure is the one Romney has used when he has said his plan would not raise taxes on the middle class.
3. Marty Feldstein blog post. Feldstein was less aggressive with his growth estimates this time, but he stuck with his definition of middle class as households making less than $100,000. He also assumed Romney might cut tax preferences for employer-provided health insurance, make municipal bond interest taxable, and eliminate the child tax credit for households making more than $100,000.
4. Matt Jensen blog post at AEI. He argued Romney might cut tax preferences for municipal bonds and life insurance buildups. But this might go against Romney's promise not to cut tax preferences for savings and investment -- and would only pay for half of Romney's revenue hole, according to the Tax Policy Center.
5. Curtis Dubay blog post at Heritage. He argued Romney might cut tax preferences for municipal bonds and life insurance buildups -- yes, again -- and that Romney might tax inheritances on a "carryover basis" after eliminating the estate tax. In plain English, heirs would have to pay capital-gains tax based on the price an asset was originally bought for, rather than its value at the time they inherited it. But as Suzy Khimm of the Washington Post notes, Dubay overestimates how much revenue this change -- which, remember, is just a guess about what Romney would do -- would generate.
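To make the carryover-basis mechanism concrete, here is a minimal sketch of the arithmetic. The dollar amounts and the 15 percent capital-gains rate are hypothetical, chosen only to illustrate the difference between today's stepped-up basis and the carryover basis Dubay describes:

```python
# Hypothetical example: an heir sells an asset immediately after inheriting it.
purchase_price = 100_000   # what the deceased originally paid (the "basis")
value_at_death = 500_000   # market value when the heir inherits it
sale_price = 500_000       # heir sells at the inherited value
cap_gains_rate = 0.15      # assumed long-term capital-gains rate

# Current law ("stepped-up" basis): the heir's basis resets to value at death.
gain_step_up = sale_price - value_at_death

# Carryover basis: the heir keeps the original purchase price as the basis.
gain_carryover = sale_price - purchase_price

print(gain_step_up * cap_gains_rate)     # tax owed under step-up: 0.0
print(gain_carryover * cap_gains_rate)   # tax owed under carryover: 60000.0
```

Under step-up the unrealized gain escapes capital-gains tax entirely; under carryover the heir eventually pays tax on the full $400,000 appreciation, which is where Dubay's projected revenue comes from.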
In other words, Romney's plan only works if you assume he has a different plan or use a magic growth asterisk. And that means we have no idea what he would do if he wins. Does he care more about his tax rate cuts, about not hiking taxes on the middle class, or not increasing the deficit? His adviser Kevin Hassett suggested they would back off the high-end tax rate cuts if it would increase the deficit, but Romney quickly denied that. He's also denied reality, by relying on studies that only prove his critics' point.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Over the span of a lifetime, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, about one in 9,395 Americans dies in a crash; that translates to roughly a 0.01 percent chance per year. But that chance compounds over the course of a lifetime: across a full lifespan, about one in 120 Americans dies in a car accident.
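The compounding step above can be checked directly. This sketch assumes a lifespan of roughly 79 years (an approximate U.S. life expectancy, not a figure from the report) and treats each year's risk as independent:

```python
# How a ~0.01% annual risk compounds into a ~1-in-120 lifetime risk.
annual_p = 1 / 9395          # annual chance of dying in a car crash
years = 79                   # assumed lifespan (approximate US life expectancy)

# Probability of dying in a crash at some point over the whole lifespan:
# 1 minus the probability of surviving every single year.
lifetime_p = 1 - (1 - annual_p) ** years

print(f"Lifetime probability: {lifetime_p:.4f}")   # about 0.0084
print(f"Roughly 1 in {round(1 / lifetime_p)}")     # roughly 1 in 119
```

Because the annual probability is tiny, the compounded result is close to simply multiplying 79 by 1/9,395, which is how a 0.01-percent-per-year risk ends up near the article's 1-in-120 figure.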
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
That's not a harsh assessment. It's just a fair description.
Millennial politics is simple, really. Young people support big government, unless it costs any more money. They're for smaller government, unless budget cuts scratch a program they've heard of. They'd like Washington to fix everything, just so long as it doesn't run anything.
That's all from a new Reason Foundation poll surveying 2,000 young adults between the ages of 18 and 29. Millennials' political views are, at best, in a stage of constant metamorphosis and, at worst, "totally incoherent," as Dylan Matthews puts it.
It's not just the Reason Foundation. In March, Pew came out with a similar survey of Millennial attitudes that offered another smorgasbord of paradoxes:
Millennials hate the political parties more than everyone else, but they have the highest opinion of Congress.
Young people are the most likely to be single parents and the least likely to approve of single parenthood.
Young people voted overwhelmingly for Obama when he promised universal health care, but they oppose his universal health care law as much as the rest of the country ... even though they still pledge high support for universal health care. (Like other groups, but more so: They seem allergic to the term Obamacare.)
Garry Marshall's patronizing 'holiday anthology' film boasts a star-studded ensemble, but its characters seem barely human.
It’s hard to know where to begin with Mother’s Day, a misshapen Frankenstein of a movie that feels like it escaped the Hallmark headquarters halfway through its creation and rampaged into theaters, trying to teach audiences how to love. The third in Garry Marshall’s increasingly strange “holiday anthology” series, Mother’s Day isn’t the rom-com hodge-podge that Valentine’s Day was, or the bizarre morass of his follow-up New Year’s Eve. But it does inspire the kind of holy terror that you feel all the way down to your bones, or the revolted tingling that strikes one at a karaoke performance gone tragically wrong.
While it’s aiming for frothiness and fun, Mother’s Day is a patronizing and sickly sweet endeavor that widely misses the mark for its entire 118-minute running time (it feels much longer). The audience gets the sense that there are many Big Truths to be learned: that family harmony is important, that it’s good to accept different lifestyles without judgment, that loss is a natural part of the circle of life. But its overall construction—as a work of cinema—always feels a little off. One character gets a life lesson from a clown at a children’s party, and departs with a hearty “Thanks, clown!” Extras wander in the background and deliver halting bits of expositional dialogue like malfunctioning robots. Half of the lines seem to have been recorded post-production and are practically shouted from off-screen to patch over a narrative that makes little sense. Mother’s Day is bad in the regular ways (e.g. the acting and writing), but also in that peculiar way, where it feels as though the film’s creator has never met actual humans before.
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
Borrowing from other cultures isn’t just inevitable, it’s potentially positive.
Sometime during the early 2000s, big, gold, “door-knocker” hoop earrings started to appeal to me, after I’d admired them on girls at school. It didn’t faze me that most of the girls who wore these earrings at my high school in St. Louis were black, unlike me. And while it certainly may have occurred to me that I—a semi-preppy dresser—couldn’t pull them off, it never occurred to me that I shouldn’t.
The Fair Credit Reporting Act was intended to protect privacy, but its provisions have not kept pace with the radical changes wrought by the information age.
In America, surveillance has always played an outsized role in the relationship between creditors and debtors. In the 19th century, credit bureaus pioneered mass-surveillance techniques. Today the American debtor faces remote kill switches in their devices, GPS tracking on their leased cars, and surreptitious webcam recordings from their rent-to-own laptops. And where our buying and borrowing habits were once tracked by shopkeepers, our computers score our creditworthiness without us knowing.
The most egregious privacy violations have been punished either by the Federal Trade Commission, or answered with massive class-action lawsuits. But surveillance, tracking, and data collection continue to proliferate. The law has not yet met the challenge of protecting consumers. The capabilities of today’s technology might be unprecedented, but the quandary is an old one. The ways our financial data gets collected and used today are reminiscent of the state of affairs that led to the Fair Credit Reporting Act of 1970.