Repackaging the Bush agenda, just with austerity, is not the path to prosperity.
Romney economic adviser Glenn Hubbard apparently has a very short memory.
In a Wall Street Journal op-ed making the case for Romney's economic agenda, Hubbard presents a strikingly ahistorical account of the past few years -- not to mention sprinkling in one big questionable assumption. Let's take a tour of some of the lowlights.
"We are currently in the most anemic economic recovery in the memory of most Americans."
Does the memory of most Americans go back a decade? If it does, then they can remember a more anemic recovery -- at least when it comes to jobs. The post-2001 recovery had the slowest job growth of any postwar recovery. It also had the slowest private-sector growth of any postwar recovery. It's puzzling that Hubbard doesn't remember this, considering that he was the chair of President George W. Bush's Council of Economic Advisers from 2001 to 2003.
Now, the economy did grow faster then than it has now. But that's because government spending was growing then; it's shrinking now. Really. So why does this weak recovery feel weaker than that weak recovery? Well, the tech bubble recession was much milder than the housing bubble recession -- in other words, we're in a deeper hole this time around. All else equal, we would expect a better recovery from a worse recession, but all else is not equal. As Harvard professor Kenneth Rogoff has shown with Carmen Reinhart, using over 800 years of data, recoveries from financial crises are long, slow slogs. It's doubtful that recycling Bush-era policies will get us out of this ditch faster. It didn't ten years ago.
"[U]ncertainty over policy--particularly over tax and regulatory policy--slowed the recovery and limited job creation. One recent study by Scott Baker and Nicholas Bloom of Stanford University and Steven Davis of the University of Chicago found that this uncertainty reduced GDP by 1.4% in 2011 alone."
Well, that certainly sounds bad. When did all of this uncertainty peak? Let's look at the paper. August of 2011. Hmmm. What happened in August of 2011? Oh, that's right. The debt ceiling debacle. Why don't we let the authors speak for themselves? Here's why they said uncertainty was so elevated in 2011:
A series of later developments and policy fights - including the debt-ceiling dispute between Republicans and Democrats in the summer of 2011, and ongoing banking and sovereign debt crises in the Eurozone area - kept economic policy uncertainty at very high levels throughout 2011.
In other words, a debt crisis the Republicans manufactured and a debt crisis the Europeans manufactured drove uncertainty in 2011. Granted, tax uncertainty has been bad -- but so has monetary policy uncertainty. And have you noticed what we haven't talked about yet? The authors conclude that uncertainty over healthcare and financial regulation was "much less pronounced" than all of the above.
And according to the Congressional Budget Office, the large deficits codified in the president's budget would reduce GDP during 2018-2022 by between 0.5% and 2.2% compared to what would occur under current law. [...]
The governor's plan would reduce federal spending as a share of GDP to 20%--its pre-crisis average--by 2016. This would dramatically reduce policy uncertainty over the need for future tax increases, thus increasing business and consumer confidence. [...]
The Romney plan would reduce individual marginal income tax rates across the board by 20%, while keeping current low tax rates on dividends and capital gains. The governor would also reduce the corporate income tax rate--the highest in the world--to 25%. In addition, he would broaden the tax base to ensure that tax reform is revenue-neutral.
Hubbard says that 1) Medium-run deficits are bad for medium-run growth, 2) Romney will cut public spending, which will increase private spending, and 3) Romney will lower tax rates and eliminate tax loopholes while keeping tax revenues the same. Individually, these might make sense. Together, they're the economic equivalent of saying two plus two equals five.
Let's unpack this fiscal mess. Romney wants to cut taxes, but he wants to cut medium-run deficits too. That's a problem. His answer: he won't cut taxes, only tax rates -- while cutting spending too. But this creates new problems. For one, it means his tax plan will raise taxes on the bottom 95 percent, while cutting them for the top 5 percent. For another, it leaves Romney stuck embracing spending cuts that will hurt the economy.
Expansionary austerity is a myth, at least in the short term. That was the conclusion the IMF reached in a 2011 paper that examined 173 cases of fiscal retrenchment over the past 30 years. On average, cutting the deficit by 1 percent of GDP led to a 0.5 percentage point increase in unemployment -- with private spending falling in tandem with public spending. Austerity can work over the longer term, as long as falling interest rates or a falling currency offset the drop in government spending. But interest rates are already at zero, and Republicans aren't too keen on quantitative easing or that whole "dollar depreciation" thing.

That leaves the Romney camp with one final reason why cutting government spending would lead to more spending overall: Ricardian equivalence. It's the idea that the private sector spends less when the public sector borrows more, because households know that the government will eventually have to raise taxes to pay for that borrowing. The empirical evidence on this is mixed -- after all, few households 1) know enough about the deficit to predict what will happen to their taxes, or 2) have enough disposable income or access to borrowing to smooth their lifetime spending. That's not to say there isn't something to it, but it's a flimsy hope for the catch-up growth we need.
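To make the IMF's averages concrete, here's a minimal back-of-the-envelope sketch in Python. The 1-percent-of-GDP and 0.5-percentage-point figures are the averages quoted above; the starting unemployment rate and the size of the hypothetical consolidation are illustrative assumptions, not numbers from the paper.

```python
# Back-of-the-envelope sketch of the IMF averages cited above:
# fiscal consolidation worth 1% of GDP was associated, on average, with
# roughly a 0.5 percentage-point rise in unemployment in the short run.
UNEMPLOYMENT_PP_PER_PCT_OF_GDP = 0.5  # IMF average quoted above

def unemployment_after_consolidation(starting_rate_pct: float,
                                     consolidation_pct_of_gdp: float) -> float:
    """Apply the average short-run relationship to a hypothetical deficit cut.

    Both the starting unemployment rate and the consolidation size are
    illustrative assumptions, not figures taken from the paper.
    """
    return starting_rate_pct + UNEMPLOYMENT_PP_PER_PCT_OF_GDP * consolidation_pct_of_gdp

# Illustrative assumption: unemployment at 8% and a deficit cut worth 2% of GDP.
print(unemployment_after_consolidation(8.0, 2.0))  # -> 9.0
```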
I don't mean to pick on Glenn Hubbard. He has plenty of good ideas about how to get the economy moving again -- like mass refinancing for mortgages owned by Fannie and Freddie. But repackaging the Bush agenda, just updated with austerity, is not the path to prosperity.
Is there anything inherently “doggy” about the word “dog”? Obviously not—to the French, a dog is a chien, to Russians a sobaka, to Mandarin Chinese-speakers a gǒu. These words have nothing in common, and none seem any more connected to the canine essence than any other. One runs up against that wall with pretty much any word.
Except some. The word for “mother” seems often either to be mama or to have a nasal sound similar to m, like nana. The word for “father” seems often either to be papa or to have a sound similar to p, like b, in it—such that you get something like baba. The word for “dad” may also have either d or t, t being a variation on saying d, just as p is on b. People say mama or nana, and then papa, baba, dada, or tata, worldwide.
Before it became the New World, the Western Hemisphere was vastly more populous and sophisticated than has been thought—an altogether more salubrious place to live at the time than, say, Europe. New evidence of both the extent of the population and its agricultural advancement leads to a remarkable conjecture: the Amazon rain forest may be largely a human artifact
The plane took off in weather that was surprisingly cool for north-central Bolivia and flew east, toward the Brazilian border. In a few minutes the roads and houses disappeared, and the only evidence of human settlement was the cattle scattered over the savannah like jimmies on ice cream. Then they, too, disappeared. By that time the archaeologists had their cameras out and were clicking away in delight.
Below us was the Beni, a Bolivian province about the size of Illinois and Indiana put together, and nearly as flat. For almost half the year rain and snowmelt from the mountains to the south and west cover the land with an irregular, slowly moving skin of water that eventually ends up in the province's northern rivers, which are sub-subtributaries of the Amazon. The rest of the year the water dries up and the bright-green vastness turns into something that resembles a desert. This peculiar, remote, watery plain was what had drawn the researchers' attention, and not just because it was one of the few places on earth inhabited by people who might never have seen Westerners with cameras.
Science says lasting relationships come down to—you guessed it—kindness and generosity.
Every day in June, the most popular wedding month of the year, about 13,000 American couples will say “I do,” committing to a lifelong relationship that will be full of friendship, joy, and love that will carry them forward to their final days on this earth.
Except, of course, it doesn’t work out that way for most people. The majority of marriages fail, either ending in divorce and separation or devolving into bitterness and dysfunction. Of all the people who get married, only three in ten remain in healthy, happy marriages, as psychologist Ty Tashiro points out in his book The Science of Happily Ever After, which was published earlier this year.
Social scientists first started studying marriages by observing them in action in the 1970s in response to a crisis: Married couples were divorcing at unprecedented rates. Worried about the impact these divorces would have on the children of the broken marriages, psychologists decided to cast their scientific net on couples, bringing them into the lab to observe them and determine what the ingredients of a healthy, lasting relationship were. Was each unhappy family unhappy in its own way, as Tolstoy claimed, or did the miserable marriages all share something toxic in common?
No defensible moral framework regards foreigners as less deserving of rights than people born in the right place at the right time.
To paraphrase Rousseau, man is born free, yet everywhere he is caged. Barbed wire, concrete walls, and gun-toting guards confine people to the nation-state of their birth. But why? The argument for open borders is both economic and moral. All people should be free to move about the earth, uncaged by the arbitrary lines known as borders.
Not every place in the world is equally well-suited to mass economic activity. Nature’s bounty is divided unevenly. Variations in wealth and income created by these differences are magnified by governments that suppress entrepreneurship and promote religious intolerance, gender discrimination, or other bigotry. Closed borders compound these injustices, cementing inequality into place and sentencing their victims to a life of penury.
The standard conception of the disorder is based on studies of "hyperactive young white boys." For females, it comes on later, and has different symptoms.
When you live in total squalor—cookies in your pants drawer, pants in your cookies drawer, and nickels, dresses, old New Yorkers, and apple seeds in your bed—it’s hard to know where to look when you lose your keys. The other day, after two weeks of fruitless searching, I found my keys in the refrigerator on top of the roasted garlic hummus. I can’t say I was surprised. I was surprised when my psychiatrist diagnosed me with ADHD two years ago, when I was a junior at Yale.
In editorials and in waiting rooms, concerns of too-liberal diagnoses and over-medication dominate our discussions of attention deficit hyperactivity disorder, or ADHD. The New York Times recently reported, with great alarm, the findings of a new Centers for Disease Control and Prevention study: 11 percent of school-age children have received an ADHD diagnosis, a 16 percent increase since 2007. And rising diagnoses mean rising treatments—drugs like Adderall and Ritalin are more accessible than ever, whether prescribed by a physician or purchased in a library. The consequences of misuse and abuse of these drugs are dangerous, sometimes fatal.
The Islamic State has made enemies of most of the world. So how is it still winning?
Nearly two millennia ago, the Romans built the Arch of Triumph in Palmyra, Syria. According to Picturesque Palestine, Sinai, and Egypt, published in 1881, “The wonder in these ancient ruins is not that so much has fallen, but that anything remains.” Last week, ISIS blew the Arch of Triumph, which the group considers idolatrous, to pieces. Such acts of aggression and barbarism have mobilized a vast enemy coalition, which includes almost every regional power and virtually every great power (and notably the United States, often compared to the Roman Empire in its hegemonic strength). Yet, incredibly, this alliance seems incapable of rolling back the Islamic State. How can a group of insurgents declare war on humanity—and win?
Even in big cities like Tokyo, small children take the subway and run errands by themselves. The reason has a lot to do with group dynamics.
It’s a common sight on Japanese mass transit: Children troop through train cars, singly or in small groups, looking for seats.
They wear knee socks, polished patent-leather shoes, and plaid jumpers, with wide-brimmed hats fastened under the chin and train passes pinned to their backpacks. The kids are as young as 6 or 7, on their way to and from school, and there is nary a guardian in sight.
A popular television show called Hajimete no Otsukai, or My First Errand, features children as young as two or three being sent out to do a task for their family. As they tentatively make their way to the greengrocer or bakery, their progress is secretly filmed by a camera crew. The show has been running for more than 25 years.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Forget the Common Core: Finland’s youngsters are in charge of determining what happens in the classroom.
“The changes to kindergarten make me sick,” a veteran teacher in Arkansas recently admitted to me. “Think about what you did in first grade—that’s what my 5-year-old babies are expected to do.”
The difference between first grade and kindergarten may not seem like much, but what I remember about my first-grade experience in the mid-90s doesn’t match the kindergarten she described in her email: three and a half hours of daily literacy instruction, an hour and a half of daily math instruction, 20 minutes of daily “physical activity time” (officially banned from being called “recess”) and two 56-question standardized tests in literacy and math—on the fourth week of school.
That Arkansas teacher—who teaches 20 students without an aide—has fought to integrate 30 minutes of “station time” into the literacy block, which includes “blocks, science, magnetic letters, play dough with letter stamps to practice words, books, and storytelling.” But the most controversial area of her classroom isn’t the blocks or the stamps. Rather, it’s the “house station with dolls and toy food”—items her district tried to remove last year. The implication was clear: There’s no time for play in kindergarten anymore.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.