Repackaging the Bush agenda, just with austerity, is not the path to prosperity.
Romney economic adviser Glenn Hubbard apparently has a very short memory.
In a Wall Street Journal op-ed making the case for Romney's economic agenda, Hubbard presents a strikingly ahistorical account of the past few years -- not to mention sprinkling in one big questionable assumption. Let's take a tour of some of the lowlights.
"We are currently in the most anemic economic recovery in the memory of most Americans."
Does the memory of most Americans go back a decade? If it does, then they can remember a more anemic recovery -- at least when it comes to jobs. The post-2001 recovery had the slowest job growth of any postwar recovery. It also had the slowest private-sector growth of any postwar recovery. It's puzzling that Hubbard doesn't remember this, considering that he was the chair of President George W. Bush's Council of Economic Advisers from 2001 to 2003.
Now, the economy did grow faster then than it has now. But that's largely because government spending grew during that recovery; it's shrinking now. Really. So why does this weak recovery feel weaker than that weak recovery? Well, the tech-bubble recession was much milder than the housing-bubble recession -- in other words, we're in a deeper hole this time around. All else equal, we would expect a better recovery from a worse recession, but all else is not equal. As the economists Carmen Reinhart and Kenneth Rogoff have shown with over 800 years of data, recoveries from financial crises are long, slow slogs. It's doubtful that recycling Bush-era policies will get us out of this ditch faster. It didn't ten years ago.
"[U]ncertainty over policy--particularly over tax and regulatory policy--slowed the recovery and limited job creation. One recent study by Scott Baker and Nicholas Bloom of Stanford University and Steven Davis of the University of Chicago found that this uncertainty reduced GDP by 1.4% in 2011 alone."
Well, that certainly sounds bad. When did all of this uncertainty peak? Let's look at the paper. August of 2011. Hmmm. What happened in August of 2011? Oh, that's right. The debt-ceiling debacle. Why don't we let the authors speak for themselves? Here's why they said uncertainty was so elevated in 2011:
A series of later developments and policy fights - including the debt-ceiling dispute between Republicans and Democrats in the summer of 2011, and ongoing banking and sovereign debt crises in the Eurozone area - kept economic policy uncertainty at very high levels throughout 2011.
In other words, a debt crisis the Republicans manufactured and a debt crisis the Europeans manufactured drove uncertainty in 2011. Granted, tax uncertainty has been bad -- but so has monetary policy uncertainty. And have you noticed what we haven't talked about yet? The authors conclude that healthcare and financial regulation uncertainty were "much less pronounced" than any of the issues above. Hubbard then turns from uncertainty to deficits and the Romney plan itself:
And according to the Congressional Budget Office, the large deficits codified in the president's budget would reduce GDP during 2018-2022 by between 0.5% and 2.2% compared to what would occur under current law. [...]
The governor's plan would reduce federal spending as a share of GDP to 20%--its pre-crisis average--by 2016. This would dramatically reduce policy uncertainty over the need for future tax increases, thus increasing business and consumer confidence. [...]
The Romney plan would reduce individual marginal income tax rates across the board by 20%, while keeping current low tax rates on dividends and capital gains. The governor would also reduce the corporate income tax rate--the highest in the world--to 25%. In addition, he would broaden the tax base to ensure that tax reform is revenue-neutral.
Hubbard says that 1) medium-run deficits are bad for medium-run growth, 2) Romney will cut public spending, which will increase private spending, and 3) Romney will lower tax rates and eliminate tax loopholes while keeping tax revenues the same. Individually, these might make sense. Together, they're the economic equivalent of saying two plus two equals five.
Let's unpack this fiscal mess. Romney wants to cut taxes, but he also wants to cut medium-run deficits. That's a problem. His answer: he won't cut taxes, just tax rates -- broadening the base so revenue stays the same -- while cutting spending too. But this creates new problems. For one, it means his tax plan will raise taxes on the bottom 95 percent while cutting them for the top 5 percent. For another, it leaves Romney stuck embracing spending cuts that will hurt the economy.
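To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it -- the income, the marginal rate, the deduction total -- is hypothetical and chosen purely for illustration; this is not a score of the actual plan. It simply shows the bind described above: if a 20 percent rate cut saves a high earner more than every deduction base-broadening could take away, a revenue-neutral plan has to recover the difference from other taxpayers.

```python
# Hypothetical, illustrative numbers only -- not an official score of any tax plan.
# The bind: an across-the-board 20% rate cut loses revenue; if a high earner's
# savings from the rate cut exceed everything they could give back through
# base broadening, revenue neutrality pushes the remainder onto other taxpayers.

RATE_CUT = 0.20  # 20% across-the-board cut in marginal rates

# Assumed (made-up) profile: taxable income, current marginal rate,
# and the total deductions that base broadening could take away.
top_earner = {"income": 1_000_000, "rate": 0.35, "deductions": 40_000}

def rate_cut_savings(person):
    """Tax saved by cutting this person's marginal rate by RATE_CUT."""
    return person["rate"] * RATE_CUT * person["income"]

def max_base_broadening(person):
    """Most extra revenue recoverable by eliminating all of this person's
    deductions, taxed at the new, lower rate."""
    return person["rate"] * (1 - RATE_CUT) * person["deductions"]

saved = rate_cut_savings(top_earner)
recovered = max_base_broadening(top_earner)
shortfall = saved - recovered

print(f"Rate cut saves the top earner:        ${saved:,.0f}")
print(f"Wiping out their deductions recovers: ${recovered:,.0f}")
print(f"Shortfall someone else must cover:    ${shortfall:,.0f}")
```

Under these made-up numbers, the rate cut is worth roughly $70,000 to the top earner, while eliminating all of their deductions claws back only about $11,000; holding revenue constant means the remaining $59,000 or so has to come from the rest of the tax base -- the bottom-95-percent problem in miniature.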
Expansionary austerity is a myth, at least in the short term. That was the conclusion the IMF reached in a 2011 paper that examined 173 cases of fiscal retrenchment over the past 30 years. On average, cutting the deficit by 1 percent of GDP led to a 0.5 percentage point increase in unemployment -- with private spending falling in tandem with public spending.

Austerity can work over the longer term, as long as interest rates or the currency falls to offset the fall in government spending. But interest rates are already at zero, and Republicans aren't too keen on quantitative easing or that whole "dollar depreciation" thing.

That leaves the Romney camp with one final reason why cutting government spending would lead to more spending overall: Ricardian equivalence. It's the idea that the private sector spends less when the public sector borrows more, because households know that eventually the government will have to raise taxes to pay for that borrowing. The empirical evidence on this is mixed -- after all, few households 1) know enough about the deficit to predict what will happen to their taxes, or 2) have enough disposable income or access to borrowing to smooth their lifetime spending. That's not to say that there isn't something to it, but that it's a flimsy hope for the catch-up growth we need.
I don't mean to pick on Glenn Hubbard. He has plenty of good ideas about how to get the economy moving again -- like mass refinancing for mortgages owned by Fannie and Freddie. But repackaging the Bush agenda, just updated with austerity, is not the path to prosperity.
Escalating tensions between North Korea and the United States point toward the threat of a "Juche bird" test in the Pacific.
The schoolyard-level taunting at the UN General Assembly sparked by President Trump’s threat to “totally destroy North Korea” if the U.S. is forced to defend itself or its allies was taken up a notch Friday, as Kim Jong Un responded with what is perhaps the first such personal reply from a North Korean leader. The crux of Kim’s response: “I will surely and definitely tame the mentally deranged U.S. dotard with fire.”
Hyperbolic rhetoric from North Korea about the United States, South Korea, and Japan is not unusual, but Kim’s personal response could indicate the international tensions caused by North Korea’s nuclear and missile programs are at their highest levels yet. The North Korean leader’s response, which was carried by KCNA, the state-run news agency, came a day after the U.S. tightened financial sanctions on his country, and China, the North’s closest ally, reportedly ordered its banks to stop working with North Korean financial institutions—a move that could have a devastating impact on Kim’s regime.
What J.R.R. Tolkien’s classic The Hobbit still has to offer, 80 years after its publication
“In a hole in the ground there lived a hobbit.” So began the legendarium that dominated a genre, changed Western literature and the field of linguistics, created a tapestry of characters and mythology that endured four generations, built an anti-war ethos that endured a World War and a Cold War, and spawned a multibillion-dollar media franchise. J.R.R. Tolkien’s work is probably best remembered today by the sword-and-sandal epic scale of The Lord of The Rings films, but it started in the quiet, fictionalized English countryside of the Shire. It started, 80 years ago in a hobbit-hole, with Bilbo Baggins.
Although Tolkien created the complicated cosmological sprawl of The Silmarillion and stories like the incestuous saga of Túrin Turambar told in The Children of Húrin, Middle-earth itself is mostly remembered today as something akin to little Bilbo in his Hobbit-hole: quaint, virtuous, and tidy. Nowadays, George R.R. Martin’s got the market cornered on heavily initialed fantasy writers, and his hand guides the field. High and epic fantasy are often expected to dip heavily into the medieval muck of realism, to contain heavy doses of sex and curses, gore and grime, sickness and believable motives and set pieces. Characters like Martin’s mercenary Bronn of the Blackwater are expected to say “fuck.” Modern stories, even when set in lands like A Song of Ice and Fire’s Essos that are filled with competing faiths, tend toward the nihilist, and mostly atheist. Heavenly beings are denuded of potency and purity; while the gods may not be dead, divinity certainly is.
Girls in the Middle East do better than boys in school by a greater margin than almost anywhere else in the world: a case study in motivation, mixed messages, and the condition of boys everywhere.
Jordan has never had a female minister of education, women make up less than a fifth of its workforce, and women hold just 4 percent of board seats at public companies there. But, in school, Jordanian girls are crushing their male peers. The nation’s girls outperform its boys in just about every subject and at every age level. At the University of Jordan, the country’s largest university, women outnumber men by a ratio of two to one—and earn higher grades in math, engineering, computer-information systems, and a range of other subjects.
In fact, across the Arab world, women now earn more science degrees on a percentage basis than women in the United States. In Saudi Arabia alone, women earn half of all science degrees. And yet, most of those women are unlikely to put their degrees to paid use for very long.
A new film details the reason the star postponed her recent tour—and will test cultural attitudes about gender, pain, and pop.
“Pain without a cause is pain we can’t trust,” the author Leslie Jamison wrote in 2014. “We assume it’s been chosen or fabricated.”
Jamison’s essay “Grand Unified Theory of Female Pain” unpacked the suffering-woman archetype, which encompasses literature’s broken hearts (Anna Karenina, Miss Havisham) and society’s sad girls—the depressed, the anorexic, and in the 19th century, the tubercular. Wariness about being defined by suffering, she argued, had led many modern women to adopt a new pose. She wrote, “The post-wounded woman conducts herself as if preempting certain accusations: Don’t cry too loud; don’t play victim.” Jamison questioned whether this was an overcorrection. “The possibility of fetishizing pain is no reason to stop representing it,” she wrote. “Pain that gets performed is still pain.”
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
Its faith-based 12-step program dominates treatment in the United States. But researchers have debunked central tenets of AA doctrine and found dozens of other treatments more effective.
J.G. is a lawyer in his early 30s. He’s a fast talker and has the lean, sinewy build of a distance runner. His choice of profession seems preordained, as he speaks in fully formed paragraphs, his thoughts organized by topic sentences. He’s also a worrier—a big one—who for years used alcohol to soothe his anxiety.
J.G. started drinking at 15, when he and a friend experimented in his parents’ liquor cabinet. He favored gin and whiskey but drank whatever he thought his parents would miss the least. He discovered beer, too, and loved the earthy, bitter taste on his tongue when he took his first cold sip.
His drinking increased through college and into law school. He could, and occasionally did, pull back, going cold turkey for weeks at a time. But nothing quieted his anxious mind like booze, and when he didn’t drink, he didn’t sleep. After four to six weeks dry, he’d be back at the liquor store.
What feels like information overload reveals how little the public actually knows about the probe's findings.
Robert Mueller stayed busy with his special-counsel investigation all summer, but the rest of Washington took a vacation. And since most information about Mueller’s actions seems to come from leaks originating outside his team, that meant a stretch of relative silence.
But the lull is over now. The month of September, and particularly the last week, has seen a torrent of new revelations about Mueller’s investigation. The fresh information gives the most complete view yet of what Mueller is up to and where he might be focusing, particularly on Paul Manafort, who chaired Donald Trump’s presidential campaign during the summer of 2016. Yet even as these revelations suggest the direction in which the probe is headed at the moment, they don’t offer much insight into the ultimate questions of when Mueller might wrap up and what charges, if any, he might bring or recommend. So where does that leave things?
In policy, as in military strategy, the first two epistemological categories (the known knowns and the known unknowns) are acceptable: People either know exactly how a policy will work, or they can make educated guesses based on data parameters they can’t quite know for certain. The unknown unknowns—what we don’t know we don’t know—are the problems, the things that could derail an entire policy and, in the process, ruin lives. Traditionally, the goal in lawmaking has been to eliminate the mystery from legislation so that there are as few unknown unknowns as possible. But, as it turns out, tradition can be easily broken.
Two new books explore America’s changing romantic landscape.
C.S. Lewis’s wife, Joy Davidman, died of bone cancer on July 13, 1960. The next day, the famous author wrote a letter to Peter Bide, the reverend who had married them, to tell him the news.
“I’d like to meet,” Lewis writes, suggesting the two grab lunch sometime soon. “For I am—oh God that I were not—very free now. One doesn’t realize in early life that the price of freedom is loneliness. To be happy is to be tied.”
When it comes to romance, Americans are freer than they’ve ever been. Freer to marry, freer to divorce, freer to have sex when and with whom they like with fewer consequences, freer to cohabitate without getting married, freer to remain single, freer to pursue open relationships or polyamory.