Celebrity historian Niall Ferguson doesn't like President Obama, and doesn't think you should either.
That's perfectly fine. There are plenty of legitimate reasons to disapprove of the president. Here's the big one: 8.3 percent. That's the current unemployment rate, fully three years on from the official end of the Great Recession. But rather than make this straightforward case against the current administration, Ferguson delves into a fantasy world of incorrect and tendentious facts. He simply gets things wrong, again and again and again. (A point my colleague James Fallows makes as well in a must-read.)
Here's a tour of some of the more factually challenged sections of Ferguson's piece.
"Certainly, the stock market is well up (by 74 percent) relative to the close on Inauguration Day 2009. But the total number of private-sector jobs is still 4.3 million below the January 2008 peak."
Did you catch that little switcheroo? Ferguson concedes that stocks have done very well since January 2009, but then says that private-sector payrolls have not recovered since January 2008. Notice that? Ferguson blames Obama for job losses that happened a full year before he took office. The private sector has actually added jobs since Obama was sworn in -- 427,000 of them, to be exact. For context, remember that the private sector lost 170,000 jobs during George W. Bush's eight years.
Of course, it's not really fair to blame Obama -- or Bush -- for jobs lost in their first few months in office, before their policies took effect. If we more sensibly look at private-sector payrolls after each president's first six months in office, then Obama has created 3.1 million jobs and Bush created 967,000.
"Meanwhile real median annual household income has dropped more than 5 percent since June 2009."
I can't replicate this result. It's difficult, because Ferguson does not cite his source. The Census Bureau only has data on real median household incomes through 2010 -- and it shows them falling 2.28 percent from 2009. The Bureau of Labor Statistics has numbers on real median weekly earnings that go through 2012, but those only show a 3.7 percent decrease from June 2009.
"Welcome to Obama's America: nearly half the population is not represented on a taxable return--almost exactly the same proportion that lives in a household where at least one member receives some type of government benefit. We are becoming the 50-50 nation--half of us paying the taxes, the other half receiving the benefits."
It is true that 46 percent of households did not pay federal income tax in 2011. It is not true that they pay no taxes. Federal income taxes barely account for half of federal taxes, and much less of total taxes if you count the state and local levels. Many of those other taxes can be regressive. If you take all taxes into account, our system is barely progressive at all.
But why do almost half of all households pay no federal income tax? Because they don't have much money to tax. Here's the breakdown from the nonpartisan Tax Policy Center. Half of these households are simply too poor -- they make under $20,000 -- to have any liability. Another quarter are retirees on tax-exempt Social Security benefits. The remaining households have no liability because of tax expenditures like the earned-income tax credit or the child credit.
In other words, the poor, the old, and children. Not exactly the "50-50 nation" of makers and takers -- or "lucky duckies" -- that Ferguson imagines.
"By the end of this year, according to the Congressional Budget Office (CBO), [debt-to-GDP ratio] will reach 70 percent of GDP. These figures significantly understate the debt problem, however. The ratio that matters is debt to revenue. That number has leapt upward from 165 percent in 2008 to 262 percent this year, according to figures from the International Monetary Fund."
This is incorrect. Ferguson had it right the first time -- the number that matters is debt-to-GDP, not debt-to-revenue. The former reflects our capacity to pay; the latter our willingness to pay right now. Moving on.
"Not only did the initial fiscal stimulus fade after the sugar rush of 2009, but the president has done absolutely nothing to close the long-term gap between spending and revenue."
Ferguson wasn't always a critic of the stimulus. Back in August 2009, he wrote that "the stimulus clearly made a significant contribution to stabilizing the U.S. economy." Perhaps he thinks the stimulus should have been bigger, so the "sugar rush" would have lasted longer? It's not clear. What is clear is that Obama has tried to close long-term deficits -- several times! And the sequester scheduled for next January is his deal with Republicans to rein in spending. More on that in a bit.
"The most recent estimate for the difference between the net present value of federal government liabilities and the net present value of future federal revenues--what economist Larry Kotlikoff calls the true "fiscal gap"--is $222 trillion."
That's a lot of trillions! But if our fiscal gap is "really" this many trillions, why can we borrow for 30 years at a real rate of 0.64 percent? It's because this number is meaningless. First of all, it projects many decades of growth figures and budget decisions that we simply don't know will happen. It assumes the Bush tax cuts never ever expire and that the healthcare cost curve never ever bends. This is like projecting, in 1942, that the Empire of Japan would rule the entire Asian continent for 70 years based on a few years of battle outcomes. It's an interesting prediction, but it's not an empirical vision of the future.
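A toy present-value calculation shows just how sensitive long-horizon figures like this are to their assumptions. Every number below is made up for illustration -- this is not Kotlikoff's model -- but the point holds: discounting a fixed annual shortfall over 75 years, a one-point change in the assumed discount rate moves the "gap" by roughly $9 trillion.

```python
# Toy illustration of why long-horizon "fiscal gap" figures swing so much
# with their assumptions. We discount a hypothetical fixed annual shortfall
# over 75 years at two discount rates; all inputs are invented.

def present_value(annual_shortfall, rate, years):
    """Sum of shortfall / (1 + rate)**t for t = 1..years."""
    return sum(annual_shortfall / (1 + rate) ** t for t in range(1, years + 1))

shortfall = 1.0e12  # pretend the annual gap is $1 trillion, every year

low = present_value(shortfall, 0.03, 75)   # discounted at 3 percent
high = present_value(shortfall, 0.02, 75)  # discounted at 2 percent

print(f"PV at 3%: ${low / 1e12:.1f} trillion")
print(f"PV at 2%: ${high / 1e12:.1f} trillion")
```

The same stream of deficits is "worth" wildly different amounts depending on a single parameter nobody can know in advance -- which is the problem with treating such a figure as a hard fact.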
"The country's largest banks are at least $50 billion short of meeting new capital requirements under the new "Basel III" accords governing bank capital adequacy."
This would be damning if we had already fully implemented the Basel III bank rules. We have not. As this handy timeline from Deloitte shows, the bank capital ratios don't take effect until January 2013. And even if they had -- which, again, they have not -- it would be a bad idea to change risk-weighted capital too much too soon. Europe's banks have done just that, and the results have left something to be desired. The IMF projects their banks will deleverage some $2.6 trillion over the next year and a half -- starving their economies of credit when they most need it. In other words, Ferguson not only gets the facts wrong; he gets the economics wrong too.
"The Patient Protection and Affordable Care Act (ACA) of 2010 did nothing to address the core defects of the system: the long-run explosion of Medicare costs as the baby boomers retire, the "fee for service" model that drives health-care inflation, the link from employment to insurance that explains why so many Americans lack coverage, and the excessive costs of the liability insurance that our doctors need to protect them from our lawyers."
There are reasons to think the ACA will fail to address the core defects of the health care system. But it's wrong to say it does nothing to address them. Here's a partial list of the things Obamacare does. It tackles the long-run explosion of Medicare costs. It tries to move away from the fee-for-service model that drives healthcare inflation. And it cuts the link between employment and insurance. In other words, Obamacare does everything Ferguson says it doesn't do, with the exception of tort reform. Matt Yglesias of Slate has a good explainer on how Obamacare tries to do these things -- everything from IPAB to Accountable Care Organizations to guaranteed issue. Read it.
"The president pledged that health-care reform would not add a cent to the deficit. But the CBO and the Joint Committee on Taxation now estimate that the insurance-coverage provisions of the ACA will have a net cost of close to $1.2 trillion over the 2012-22 period."
Maybe Ferguson doesn't understand the meaning of the word "deficit"? The only other explanation is that he is deliberately misleading his readers. The CBO is quite clear about Obamacare's budgetary implications. It reduces the deficit. Here's what the CBO said exactly:
[T]he effects of the two laws on direct spending and revenues related to health care will reduce federal deficits by $210 billion over the 2012-2021 period.
In other words, the law is more than paid for. As Paul Krugman pointed out, it does spend $1.042 trillion covering people, but it pays for this coverage by finding savings in Medicare and levying a surtax on investment income for high-earners. That Ferguson looked up the CBO's estimate of the bill's cost and didn't notice that those costs are paid for is peculiar indeed. Even more peculiar is that he is apparently doubling down on this falsehood. And yes, it is a very deliberate falsehood.
"Having set up a bipartisan National Commission on Fiscal Responsibility and Reform, headed by retired Wyoming Republican senator Alan Simpson and former Clinton chief of staff Erskine Bowles, Obama effectively sidelined its recommendations of approximately $3 trillion in cuts and $1 trillion in added revenues over the coming decade. As a result there was no "grand bargain" with the House Republicans--which means that, barring some miracle, the country will hit a fiscal cliff on Jan. 1 as the Bush tax cuts expire and the first of $1.2 trillion of automatic, across-the-board spending cuts are imposed. The CBO estimates the net effect could be a 4 percent reduction in output."
Now, Obama did not push Congress to adopt Simpson-Bowles, but neither did Congress adopt it. Among those who voted against it? Paul Ryan, whom Ferguson later lauds for his fiscal courage. But that wasn't the last attempt at a so-called "grand bargain." That came during the debt-ceiling standoff the Republicans forced. Obama offered a long-term deal heavily tilted toward Republican priorities -- read: spending cuts -- that the Republicans spurned. Among those who pushed the Republicans to reject it? Paul Ryan, who worried that a deal would burnish Obama's bipartisan credentials and make his re-election a foregone conclusion.
And then there's the cognitive dissonance of it all. Noah Smith points out that Ferguson reproaches Obama for both running big deficits and for closing them.
"The failures of leadership on economic and fiscal policy over the past four years have had geopolitical consequences. The World Bank expects the U.S. to grow by just 2 percent in 2012. China will grow four times faster than that; India three times faster. By 2017, the International Monetary Fund predicts, the GDP of China will overtake that of the United States."
China has 1.3 billion people. The United States has 300 million. China's GDP will pass ours while the average Chinese person is still only about a quarter as rich as the average American. That might happen in 2017; it might happen later if China's current slowdown is more than a blip. It doesn't really matter if and when it happens. There's nothing Obama can do to prevent China from catching up -- nor should he want to! Economics isn't zero-sum. The more money China has, the more it has to buy things from us and other countries. This is good news, and yet Ferguson treats it like a modern-day equivalent of "losing China".
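The population arithmetic is easy to check with a back-of-the-envelope sketch. The GDP figure below is an illustrative round number, not an official statistic; the point is that when two countries' total GDPs are equal, their per-person incomes differ by exactly the population ratio.

```python
# Back-of-the-envelope check: with ~4.3x the population, China's total GDP
# equals America's while its per-person income is still far behind.
# The GDP figure is an illustrative round number, not official data.

us_population = 300e6      # ~300 million people
china_population = 1.3e9   # ~1.3 billion people

us_gdp = 15e12             # ~$15 trillion, a rough 2012-era figure
china_gdp = us_gdp         # the hypothetical moment total GDPs are equal

us_per_capita = us_gdp / us_population           # income per American
china_per_capita = china_gdp / china_population  # income per Chinese person

ratio = us_per_capita / china_per_capita
print(f"At equal total GDP, the average American is {ratio:.1f}x richer.")
```

Note that the GDP figure cancels out entirely: the per-person gap at the crossover point is just the ratio of the two populations, about 4.3 to 1.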
"In his notorious "you didn't build that" speech, Obama listed what he considers the greatest achievements of big government: the Internet, the GI Bill, the Golden Gate Bridge, the Hoover Dam, the Apollo moon landing, and even (bizarrely) the creation of the middle class. Sadly, he couldn't mention anything comparable that his administration has achieved."
It's bizarre that Ferguson thinks government policies didn't help create America's middle class. America was the first country to make high school compulsory. It was also the first country to make college widely accessible, with the G.I. Bill. This democratization of education went a long way toward laying the foundation for broad-based prosperity. And as for big things the government has achieved lately, surely moving to near-universal healthcare coverage counts?
In the world as Ferguson describes it, Obama is a big-spending, weak-kneed liberal who can't get the economy turned around. Think Jimmy Carter on steroids. But the world is not as Ferguson describes it. Fact-checked, the past four years make for a muddier picture, one in which Obama has cast himself as a stimulator, a deficit hawk, a health-care liberal, and a conservative reformer all at once. And it's a world where the economy is getting better, albeit slowly.
It would have been worthwhile for Ferguson to explain why Obama doesn't deserve re-election in the real world we actually live in. Instead, we got an exercise in Ferguson's specialty -- counterfactual history.
“Somewhere at Google there is a database containing 25 million books and nobody is allowed to read them.”
You were going to get one-click access to the full text of nearly every book that’s ever been published. Books still in print you’d have to pay for, but everything else—a collection slated to grow larger than the holdings at the Library of Congress, Harvard, the University of Michigan, or any of the great national libraries of Europe—would have been available for free at terminals that were going to be placed in every local library that wanted one.
At the terminal you were going to be able to search tens of millions of books and read every page of any book you found. You’d be able to highlight passages and make annotations and share them; for the first time, you’d be able to pinpoint an idea somewhere inside the vastness of the printed record, and send somebody straight to it with a link. Books would become as instantly available, searchable, copy-pasteable—as alive in the digital world—as web pages.
It’s a shame that the standard way of learning how to cook is by following recipes. To be sure, they are a wonderfully effective way to approximate a dish as it appeared in a test kitchen, at a star chef’s restaurant, or on TV. And they can be an excellent inspiration for even the least ambitious home cooks to liven up a weeknight dinner. But recipes, for all their precision and completeness, are poor teachers. They tell you what to do, but they rarely tell you why to do it.
This means that for most novice cooks, kitchen wisdom—a unified understanding of how cooking works, as distinct from the notes grandma lovingly scrawled on index-card recipes passed down through the generations—comes piecemeal. Take, for instance, the basic skill of thickening a sauce. Maybe one recipe for marinara advises reserving some of the starchy pasta water, for adding later in case the sauce is looking a little thin. Another might recommend rescuing a too-watery sauce with some flour, and still another might suggest a handful of parmesan. Any one of these recipes offers a fix under specific conditions, but after cooking through enough of them, those isolated recommendations can congeal into a realization: There are many clever ways to thicken a sauce, and picking an appropriate one depends on whether there’s some leeway for the flavor to change and how much time there is until dinner needs to be on the table.
Film, television, and literature all tell stories better. So why are games still obsessed with narrative?
A longstanding dream: Video games will evolve into interactive stories, like the ones that play out fictionally on the Star Trek Holodeck. In this hypothetical future, players could interact with computerized characters as round as those in novels or films, making choices that would influence an ever-evolving plot. It would be like living in a novel, where the player’s actions would have as much of an influence on the story as they might in the real world.
It’s an almost impossible bar to reach, for cultural reasons as much as technical ones. One shortcut is an approach called environmental storytelling. Environmental stories invite players to discover and reconstruct a fixed story from the environment itself. Think of it as the novel wresting the real-time, first-person, 3-D graphics engine from the hands of the shooter game. In Disneyland’s Peter Pan’s Flight, for example, dioramas summarize the plot and setting of the film. In the 2007 game BioShock, recorded messages in an elaborate, Art Deco environment provide context for a story of a utopia’s fall. And in What Remains of Edith Finch, a new game about a girl piecing together a family curse, narration is accomplished through artifacts discovered in an old house.
Will you pay more for those shoes before 7 p.m.? Would the price tag be different if you lived in the suburbs? Standard prices and simple discounts are giving way to far more exotic strategies, designed to extract every last dollar from the consumer.
As Christmas approached in 2015, the price of pumpkin-pie spice went wild. It didn’t soar, as an economics textbook might suggest. Nor did it crash. It just started vibrating between two quantum states. Amazon’s price for a one-ounce jar was either $4.49 or $8.99, depending on when you looked. Nearly a year later, as Thanksgiving 2016 approached, the price again began whipsawing between two different points, this time $3.36 and $4.69.
We live in the age of the variable airfare, the surge-priced ride, the pay-what-you-want Radiohead album, and other novel price developments. But what was this? Some weird computer glitch? More like a deliberate glitch, it seems. “It’s most likely a strategy to get more data and test the right price,” Guru Hariharan explained, after I had sketched the pattern on a whiteboard.
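Hariharan's explanation, using price changes to gather data and home in on the right price, can be sketched as a bare-bones two-price experiment. Everything below is hypothetical (the conversion rates, the visitor counts, the revenue rule); it illustrates the general idea, not Amazon's actual system.

```python
import random

# A minimal sketch of two-point price testing: offer one of two prices at
# random, record whether the shopper buys, and keep whichever price earns
# more revenue per visit. All numbers and names here are hypothetical.

random.seed(0)  # make the simulation repeatable

PRICES = [4.49, 8.99]                      # the two observed price points
TRUE_BUY_RATE = {4.49: 0.30, 8.99: 0.12}   # assumed demand, unknown to the seller

revenue = {p: 0.0 for p in PRICES}
visits = {p: 0 for p in PRICES}

for _ in range(10_000):
    price = random.choice(PRICES)          # flip prices across visitors
    visits[price] += 1
    if random.random() < TRUE_BUY_RATE[price]:
        revenue[price] += price            # record a sale at that price

revenue_per_visit = {p: revenue[p] / visits[p] for p in PRICES}
best = max(revenue_per_visit, key=revenue_per_visit.get)
print(f"Best price so far: ${best}")
```

The vibrating price tag a shopper sees is the experiment in progress: each visit at each price point is another data point on how demand responds.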
They’re stuck between corporations trying to extract maximum profits from each flight and passengers who can broadcast their frustration on social media.
Two weeks ago, a man was violently dragged off a United Airlines flight after being told it was overbooked. And late last week, American Airlines suspended a flight attendant after a fight nearly broke out between a passenger and the crew, over a stroller. What did the two incidents have in common? Both stories went viral after passengers’ videos showcased the rotten conditions of flying in coach today. But also, in both cases, it’s not particularly clear that the airline employees caught on camera had many better options.
On the infamous United flight, employees, following protocol, had to call security agents to remove a passenger in Chicago, due to a last-minute need to transport crew to fly out of Louisville the following day. United’s contract of carriage gives employees broad latitude to deny boarding to passengers. On the other hand, it is terrible to force a sitting passenger to get up and de-board a plane. So, the attendants were stuck: Either four people already seated had to leave the plane, or a flight scheduled the next day would have been grounded due to the lack of crew—which would have punished even more paying customers.
Those who speak Toki Pona say linguistic simplicity can enable a more profound form of communication.
In Chinese, the word computer translates directly as electric brain.
In Icelandic, a compass is a direction-shower, and a microscope a small-watcher.
In Lakota, horse is literally dog of wonder.
These neologisms demonstrate the cumulative quality of language, in which we use the known to describe the unknown.
“It is by metaphor that language grows,” writes the psychologist Julian Jaynes. “The common reply to the question ‘What is it?’ is, when the reply is difficult or the experience unique, ‘Well, it is like —.’”
That metaphorical process is at the heart of Toki Pona, the world’s smallest language. While the Oxford English Dictionary contains a quarter of a million entries, and even Koko the gorilla communicates with over 1,000 gestures in American Sign Language, the total vocabulary of Toki Pona is a mere 123 words. Yet, as the creator Sonja Lang and many other Toki Pona speakers insist, it is enough to express almost any idea. This economy of form is accomplished by reducing symbolic thought to its most basic elements, merging related concepts, and having single words perform multiple functions of speech.
The Hulu show has created a world that’s visually and psychologically unlike anything in film or television.
Call it luck, call it fate, call it the world’s most ridiculous viral marketing campaign, but the first television adaptation of The Handmaid’s Tale is debuting on Wednesday to audiences who are hyper-ready for it. The 1985 speculative fiction work by Margaret Atwood has featured on library waitlists and Amazon’s top 20 for months now—partly in anticipation of the new Hulu show, and partly in response to the strange new landscape that emerged after November 9, wherein women in the millions felt compelled to take to the streets to assert their attachment to reproductive freedom. (When the release date for The Handmaid’s Tale was announced in December, people joked that it would likely be a documentary by the time it arrived on TV screens.)
A lab has successfully gestated premature lambs in artificial wombs. Are humans next?
When babies are born at 24 weeks’ gestation, “it is very clear they are not ready to be here,” says Emily Partridge, a research fellow at the Children’s Hospital of Philadelphia.
Doctors dress the hand-sized beings in miniature diapers and cradle them in plastic incubators, where they are fed through tubes. In many cases, IV lines deliver sedatives to help them cope with the ventilators strapped to their faces.
Each year, about 30,000 American babies are born this early—considered “critically preterm,” or younger than 26 weeks. Before 24 weeks, only about half survive, and those who live are likely to endure long-term medical complications. “Among those that survive, the challenges are things we all take for granted, like walking, talking, seeing, hearing,” says Kevin Dysart, a neonatologist at the Children’s Hospital.
The early results out of a Boston nonprofit are positive.
You saw the pictures in science class—a profile view of the human brain, sectioned by function. The piece at the very front, right behind where a forehead would be if the brain were actually in someone’s head, is the prefrontal cortex. It handles problem-solving, goal-setting, and task execution. And it works with the limbic system, which is connected and sits closer to the center of the brain. The limbic system processes emotions and triggers emotional responses, in part because of its storage of long-term memory.
When a person lives in poverty, a growing body of research suggests the limbic system is constantly sending fear and stress messages to the prefrontal cortex, which overloads its ability to solve problems, set goals, and complete tasks in the most efficient ways.
From Anaïs to Zizek, a brief list of "shibboleth names"
In October 1937, the president of the Dominican Republic, Rafael Trujillo, devised a simple way to identify the Haitian immigrants living along the border of his country. Dominican soldiers would hold up a sprig of parsley—perejil in Spanish—and ask people to identify it. Those who spoke Spanish would pronounce the word's central "r" with that language's characteristic trill; the Haitians, on the other hand, would bury the "r" sound in the throaty way of the French. To be on the receiving end of the parsley test would be to seal, either way, one's fate: The Spanish-speaking Dominicans were left to live, and the Haitians were slaughtered. It was a state-sponsored genocide that would be remembered, in one of history's greatest understatements, as the Parsley Massacre.