It is safe to say that Paul Krugman is much smarter than I am, and that he understands more economics than I do. He generates a great deal of incisive analysis about the economy, and has often had a gift for stabbing straight through to the one underlying piece of data that gives the lie to an otherwise plausible economic theory.
I want to get that out of the way, because otherwise my readers (left and right) might assume that this post is a "libertarian economics blogger makes fun of liberal economist's poor reasoning skills" special, and that's not at all why I'm writing it. Paul Krugman is a brilliant and interesting analyst. He also, like everyone else, can be wrong.
There's an interesting phenomenon that often happens when I blog something critical of Paul Krugman: some of his bigger fans turn up in my comments to argue that I am not worthy to talk, because Paul Krugman is a brilliant, insightful analyst who has forgotten more economics than I will ever learn--all undoubtedly true. Over and over, they say, Paul Krugman gets it right when other commentators get it wrong. And as proof of this rare perspicacity, they offer the fact that . . . Paul Krugman called the housing bubble in May 2005.
There is rich irony in the belief that Paul Krugman must be right, and I must be wrong, because he had the foresight to call the housing bubble. That's because I saw it in 2002. As you can see, I blogged quite a bit about it before Paul Krugman wrote his first column on the topic. Neither of us, as far as I can tell, understood what that meant for the financial system. But both of us saw it coming, me a little sooner.
This is not that surprising, actually. Lots of people saw it coming. You hear people asking a lot where the financial journalists were--how they could have missed the housing bubble--and the answer is that they didn't! The Economist was writing about it even before I did, thanks to Pam Woodall, the brilliant economics editor who really may have been the first commentator to identify the global phenomenon. Housing bubble stories and op-eds regularly appeared in newspapers like, well, The New York Times. But most people weren't reading the financial press (or this blog) in 2005, and so when they discover that Paul Krugman was writing about the housing bubble way back then, it seems like amazing foresight.
Meanwhile, today I stumbled across another example of Paul Krugman's "foresight", via David Henderson. Chris Alden, a co-founder of Red Herring, blogs about an article Krugman wrote for them back in the 1990s:
He went on to make some specific predictions, all of which were either mostly or completely wrong:
"Productivity will drop sharply this year."
Nope - didn't happen. In fact productivity continued to improve, as this chart shows:
"Inflation will be back. ...In 1999 inflation will probably be more than 3 percent; with only moderate bad luck--say, a drop in the dollar--it could easily top 4 percent."
"Within two or three years, the current mood of American triumphalism--our belief that we have pulled economically and technologically ahead of the rest of the world--will evaporate."
Nope -- that didn't happen, either. Though September 11th, which happened more than three years after this article, and the Lehman Brothers collapse, which happened more than 10 years after this article was written, have certainly reduced American triumphalism. Here is where I think Krugman may have been the most right, albeit way too early.
"The growth of the Internet will slow drastically, as the flaw in 'Metcalfe's law'--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other!
By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's."
"As the rate of technological change in computing slows, the number of jobs for IT specialists will decelerate, then actually turn down; ten years from now, the phrase information economy will sound silly."
"Sometime in the next 20 years, maybe sooner, there will be another '70s-style raw-material crunch: a disruption of oil supplies, a sharp run-up in agricultural prices, or both."
Meh. While we have seen oil prices spike (although they have yet to reach the annual peak we saw in 1980), this was not due to a crunch, a disruption, or running out of oil, but rather to growth in demand.
I'm inclined to be more charitable than Alden on a couple of these, but there's no question that Krugman got some things really, really wrong.
But it doesn't follow that Krugman is an idiot who should get no respect--any more than calling the housing bubble made him an infallible genius. Krugman remains a giant intellect who is well worth reading on virtually any economic topic. He is also capable of being badly wrong about things.
You often hear people complain that pundits or analysts aren't punished for getting things wrong. But this is why they aren't: everyone gets things wrong. The question "How can you expect us to listen to Pundit Y when he got everything wrong, and our guy called things correctly?" only reveals that the person asking it has managed to forget all the blunders "our guy" made.
What pundits give you is not a perfect map of the future--the only people who succeed in that are characters in historical novels written by an author who already knows what happened. What's important is their thought process--do they point you to arguments you hadn't considered? Do they find data you ought to know about? Do they force you to challenge your own decisions?
Paul Krugman succeeds on that score, even if his crystal ball is a little cloudy.
Is there anything inherently “doggy” about the word “dog”? Obviously not—to the French, a dog is a chien, to Russians a sobaka, to Mandarin Chinese-speakers a gǒu. These words have nothing in common, and none seem any more connected to the canine essence than any other. One runs up against that wall with pretty much any word.
Except some. The word for “mother” seems often either to be mama or have a nasal sound similar to m, like nana. The word for “father” seems often either to be papa or have a sound similar to p, like b, in it—such that you get something like baba. The word for “dad” may also have either d or t, which is a variation on saying d, just as p is on b. People say mama or nana, and then papa, baba, dada, or tata, worldwide.
Before it became the New World, the Western Hemisphere was vastly more populous and sophisticated than has been thought—an altogether more salubrious place to live at the time than, say, Europe. New evidence of both the extent of the population and its agricultural advancement leads to a remarkable conjecture: the Amazon rain forest may be largely a human artifact.
The plane took off in weather that was surprisingly cool for north-central Bolivia and flew east, toward the Brazilian border. In a few minutes the roads and houses disappeared, and the only evidence of human settlement was the cattle scattered over the savannah like jimmies on ice cream. Then they, too, disappeared. By that time the archaeologists had their cameras out and were clicking away in delight.
Below us was the Beni, a Bolivian province about the size of Illinois and Indiana put together, and nearly as flat. For almost half the year rain and snowmelt from the mountains to the south and west cover the land with an irregular, slowly moving skin of water that eventually ends up in the province's northern rivers, which are sub-subtributaries of the Amazon. The rest of the year the water dries up and the bright-green vastness turns into something that resembles a desert. This peculiar, remote, watery plain was what had drawn the researchers' attention, and not just because it was one of the few places on earth inhabited by people who might never have seen Westerners with cameras.
Science says lasting relationships come down to—you guessed it—kindness and generosity.
Every day in June, the most popular wedding month of the year, about 13,000 American couples will say “I do,” committing to a lifelong relationship that will be full of friendship, joy, and love that will carry them forward to their final days on this earth.
Except, of course, it doesn’t work out that way for most people. The majority of marriages fail, either ending in divorce and separation or devolving into bitterness and dysfunction. Of all the people who get married, only three in ten remain in healthy, happy marriages, as psychologist Ty Tashiro points out in his book The Science of Happily Ever After, which was published earlier this year.
Social scientists first started studying marriages by observing them in action in the 1970s in response to a crisis: Married couples were divorcing at unprecedented rates. Worried about the impact these divorces would have on the children of the broken marriages, psychologists decided to cast their scientific net on couples, bringing them into the lab to observe them and determine what the ingredients of a healthy, lasting relationship were. Was each unhappy family unhappy in its own way, as Tolstoy claimed, or did the miserable marriages all share something toxic in common?
No defensible moral framework regards foreigners as less deserving of rights than people born in the right place at the right time.
To paraphrase Rousseau, man is born free, yet everywhere he is caged. Barbed wire, concrete walls, and gun-toting guards confine people to the nation-state of their birth. But why? The argument for open borders is both economic and moral. All people should be free to move about the earth, uncaged by the arbitrary lines known as borders.
Not every place in the world is equally well-suited to mass economic activity. Nature’s bounty is divided unevenly. Variations in wealth and income created by these differences are magnified by governments that suppress entrepreneurship and promote religious intolerance, gender discrimination, or other bigotry. Closed borders compound these injustices, cementing inequality into place and sentencing their victims to a life of penury.
The standard conception of the disorder is based on studies of "hyperactive young white boys." For females, it comes on later, and has different symptoms.
When you live in total squalor—cookies in your pants drawer, pants in your cookies drawer, and nickels, dresses, old New Yorkers, and apple seeds in your bed—it’s hard to know where to look when you lose your keys. The other day, after two weeks of fruitless searching, I found my keys in the refrigerator on top of the roasted garlic hummus. I can’t say I was surprised. I was surprised when my psychiatrist diagnosed me with ADHD two years ago, when I was a junior at Yale.
In editorials and in waiting rooms, concerns of too-liberal diagnoses and over-medication dominate our discussions of attention deficit hyperactivity disorder, or ADHD. The New York Times recently reported, with great alarm, the findings of a new Centers for Disease Control and Prevention study: 11 percent of school-age children have received an ADHD diagnosis, a 16 percent increase since 2007. And rising diagnoses mean rising treatments—drugs like Adderall and Ritalin are more accessible than ever, whether prescribed by a physician or purchased in a library. The consequences of misuse and abuse of these drugs are dangerous, sometimes fatal.
Even in big cities like Tokyo, small children take the subway and run errands by themselves. The reason has a lot to do with group dynamics.
It’s a common sight on Japanese mass transit: Children troop through train cars, singly or in small groups, looking for seats.
They wear knee socks, polished patent-leather shoes, and plaid jumpers, with wide-brimmed hats fastened under the chin and train passes pinned to their backpacks. The kids are as young as 6 or 7, on their way to and from school, and there is nary a guardian in sight.
A popular television show called Hajimete no Otsukai, or My First Errand, features children as young as two or three being sent out to do a task for their family. As they tentatively make their way to the greengrocer or bakery, their progress is secretly filmed by a camera crew. The show has been running for more than 25 years.
The Islamic State has made enemies of most of the world. So how is it still winning?
Nearly two millennia ago, the Romans built the Arch of Triumph in Palmyra, Syria. According to Picturesque Palestine, Sinai, and Egypt, published in 1881, “The wonder in these ancient ruins is not that so much has fallen, but that anything remains.” Last week, ISIS blew the Arch of Triumph, which the group considers idolatrous, to pieces. Such acts of aggression and barbarism have mobilized a vast enemy coalition, which includes almost every regional power and virtually every great power (and notably the United States, often compared to the Roman Empire in its hegemonic strength). Yet, incredibly, this alliance seems incapable of rolling back the Islamic State. How can a group of insurgents declare war on humanity—and win?
American politicians are now eager to disown a failed criminal-justice system that’s left the U.S. with the largest incarcerated population in the world. But they've failed to reckon with history. Fifty years after Daniel Patrick Moynihan’s report “The Negro Family” tragically helped create this system, it's time to reclaim his original intent.
By his own lights, Daniel Patrick Moynihan, ambassador, senator, sociologist, and itinerant American intellectual, was the product of a broken home and a pathological family. He was born in 1927 in Tulsa, Oklahoma, but raised mostly in New York City. When Moynihan was 10 years old, his father, John, left the family, plunging it into poverty. Moynihan’s mother, Margaret, remarried, had another child, divorced, moved to Indiana to stay with relatives, then returned to New York, where she worked as a nurse. Moynihan’s childhood—a tangle of poverty, remarriage, relocation, and single motherhood—contrasted starkly with the idyllic American family life he would later extol.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.