It is safe to say that Paul Krugman is much smarter than I am, and that he understands more economics than I do. He generates a great deal of incisive analysis about the economy, and has often had a gift for stabbing straight through to the one underlying piece of data that gives lie to an otherwise plausible economic theory.
I want to get that out of the way, because otherwise my readers (left and right) might assume that this post is a "libertarian economics blogger makes fun of liberal economist's poor reasoning skills" special, and that's not at all why I'm writing it. Paul Krugman is a brilliant and interesting analyst. He also, like everyone else, can be wrong.
There's an interesting phenomenon that often happens when I blog something critical of Paul Krugman: some of his bigger fans turn up in my comments to argue that I am not worthy to talk, because Paul Krugman is a brilliant insightful analyst who has forgotten more economics than I will ever learn--all undoubtedly true. Over and over, they say, Paul Krugman gets it right when other commentators get it wrong. And as proof of this rare perspicacity, they offer the fact that . . . Paul Krugman called the housing bubble in May 2005.
There is rich irony in the belief that Paul Krugman must be right, and I must be wrong, because he had the foresight to call the housing bubble. That's because I saw it in 2002. As you can see, I blogged quite a bit about it before Paul Krugman wrote his first column on the topic. Neither of us, as far as I can tell, understood what that meant for the financial system. But both of us saw it coming, me a little sooner.
This is not that surprising, actually. Lots of people saw it coming. You hear people asking a lot where the financial journalists were--how they could have missed the housing bubble--and the answer is that they didn't! The Economist was writing about it even before I did, thanks to Pam Woodall, the brilliant economics editor who really may have been the first commentator to identify the global phenomenon. Housing bubble stories and op-eds regularly appeared in newspapers like, well, The New York Times. But most people weren't reading the financial press (or this blog) in 2005, and so when they discover that Paul Krugman was writing about the housing bubble way back then, it seems like amazing foresight.
Meanwhile, today I stumbled across another example of Paul Krugman's "foresight", via David Henderson. Chris Alden, a co-founder of Red Herring, blogs about an article Krugman wrote for them back in the 1990s:
He went on to make some specific predictions, all of which were either mostly or completely wrong:
"Productivity will drop sharply this year."
Nope - didn't happen. In fact productivity continued to improve, as this chart shows:
"Inflation will be back. ...In 1999 inflation will probably be more than 3 percent; with only moderate bad luck--say, a drop in the dollar--it could easily top 4 percent."
"Within two or three years, the current mood of American triumphalism--our belief that we have pulled economically and technologically ahead of the rest of the world--will evaporate."
Nope -- that didn't happen, either. Though September 11th, which happened more than three years after this article, and the Lehman Brothers collapse, which happened more than 10 years after it was written, have certainly reduced American triumphalism. Here is where I think Krugman may have been the most right, albeit way too early.
"The growth of the Internet will slow drastically, as the flaw in 'Metcalfe's law'--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other!
"By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's."
"As the rate of technological change in computing slows, the number of jobs for IT specialists will decelerate, then actually turn down; ten years from now, the phrase information economy will sound silly."
"Sometime in the next 20 years, maybe sooner, there will be another '70s-style raw-material crunch: a disruption of oil supplies, a sharp run-up in agricultural prices, or both."
Meh. While we have seen oil prices spike (although they have yet to reach the annual peak we saw in 1980), this was not due to a crunch, a disruption, or running out of oil, but rather to growth in demand.
I'm inclined to be more charitable than Alden on a couple of these, but there's no question that Krugman got some things really, really wrong.
But it doesn't follow that Krugman is an idiot who should get no respect--any more than calling the housing bubble made him an infallible genius. Krugman remains a giant intellect who is well worth reading on virtually any economic topic. He is also capable of being badly wrong about things.
You often hear people complain that pundits or analysts aren't punished for getting things wrong. But this is why they aren't: everyone gets things wrong. The question "How can you expect us to listen to Pundit Y when he got everything wrong, and our guy called things correctly" only reveals that the person asking it has managed to forget all the blunders "our guy" made.
What pundits give you is not a perfect map of the future--the only people who succeed in that are characters in historical novels written by an author who already knows what happened. What's important is their thought process--do they point you to arguments you hadn't considered? Do they find data you ought to know about? Do they force you to challenge your own decisions?
Paul Krugman succeeds on that score, even if his crystal ball is a little cloudy.
On “Back to Back Freestyle” and “Charged Up,” the rapper forgoes the high road in his beef with Meek Mill.
Once upon a time, Drake made a vow of silence. “Diss me, you'll never hear a reply for it,” he said on “Successful,” the 2009 song in which the Toronto rapper correctly predicted he’d soon be superwealthy. This week, Drake has broken his vow twice over, a fact about which he seems conflicted. “When I look back,” he says on the new track “Back to Back Freestyle,” “I might be mad that I gave this attention.”
“This” is the beef started by the 28-year-old Philadelphia rapper Meek Mill, who recently tweeted accusations that Drake doesn’t write his own material. Depending on who you talk to or how you look at it, this is either a big deal or no deal at all. On Instagram, Lupe Fiasco had a good take: “Ghostwriting, or borrowing lines, or taking suggestions from the room has always been in rap and will always be in rap. It is nothing to go crazy over or be offended about unless you are someone who postures him or herself on the importance of authenticity and tries to portray that quality to your fans or the public at large. Then we might have a problem.”
Even when they’re adopted, the children of the wealthy grow up to be just as well-off as their parents.
Lately, it seems that every new study about social mobility further corrodes the story Americans tell themselves about meritocracy; each one provides more evidence that comfortable lives are reserved for the winners of what sociologists call the birth lottery. But, recently, there have been suggestions that the birth lottery’s outcomes can be manipulated even after the fluttering ping-pong balls of inequality have been drawn.
What appears to matter—a lot—is environment, and that’s something that can be controlled. For example, one study out of Harvard found that moving poor families into better neighborhoods greatly increased the chances that children would escape poverty when they grew up.
While it’s well documented that the children of the wealthy tend to grow up to be wealthy, researchers are still at work on how and why that happens. Perhaps they grow up to be rich because they genetically inherit certain skills and preferences, such as a tendency to tuck away money into savings. Or perhaps it’s mostly because wealthier parents invest more in their children’s education and help them get well-paid jobs. Is it more nature, or more nurture?
The Vermont senator’s revolutionary zeal has met its moment.
There’s no way this man could be president, right? Just look at him: rumpled and scowling, bald pate topped by an entropic nimbus of white hair. Just listen to him: ranting, in his gravelly Brooklyn accent, about socialism. Socialism!
And yet here we are: In the biggest surprise of the race for the Democratic presidential nomination, this thoroughly implausible man, Bernie Sanders, is a sensation.
He is drawing enormous crowds—11,000 in Phoenix, 8,000 in Dallas, 2,500 in Council Bluffs, Iowa—the largest turnout of any candidate from any party in the first-to-vote primary state. He has raised $15 million in mostly small donations, to Hillary Clinton’s $45 million—and unlike her, he did it without holding a single fundraiser. Shocking the political establishment, it is Sanders—not Martin O’Malley, the fresh-faced former two-term governor of Maryland; not Joe Biden, the sitting vice president—to whom discontented Democratic voters looking for an alternative to Clinton have turned.
Today's cities may be more diverse overall, but people of different races still don’t live near each other.
Nearly 50 years ago, after a string of race-related riots in cities across America, President Lyndon B. Johnson commissioned a panel of civic leaders to investigate the underlying causes of racial tension in the country.
The result was the Kerner Report, a document that castigated white society for fleeing to suburbs, where they excluded blacks from employment, housing, and educational opportunities. The report’s famous conclusion: “Our nation is moving toward two societies, one black, one white—separate and unequal.”
Much of America would like to believe the nation has changed since then. The election of a black President was said to usher in a “post-racial era.” Cheerios commercials now feature interracial couples. As both suburbs and cities grew more diverse, more than one academic study trumpeted the end of segregation in American neighborhoods.
Samuel DuBose’s death at the hands of a university police officer points to problems with piecemeal approaches to reform.
During a news conference Wednesday, discussing the killing of Samuel DuBose, Hamilton County, Ohio, prosecutor Joe Deters said several remarkable things.
“This is without question a murder,” he said, adding that Ray Tensing, who killed DuBose—an unarmed black man pulled over for a missing front license plate—“should never have been a police officer.” Deters said, “This is the most asinine act I’ve ever seen a police officer make.”
Amid a string of cases where police have killed black men, what makes this case different, as Robinson Meyer notes, is body-cam footage that captured the incident, and helped bring about Tensing’s indictment for murder. But the case is also interesting because Tensing wasn't a Cincinnati police officer. He was employed by the police department of the University of Cincinnati—a fact the prosecutor lamented.
During the multi-country press tour for Mission Impossible: Rogue Nation, not even Jon Stewart has dared ask Tom Cruise about Scientology.
During the media blitz for Mission Impossible: Rogue Nation over the past two weeks, Tom Cruise has seemingly been everywhere. In London, he participated in a live interview at the British Film Institute with the presenter Alex Zane, the movie’s director, Christopher McQuarrie, and a handful of his fellow cast members. In New York, he faced off with Jimmy Fallon in a lip-sync battle on The Tonight Show and attended the Monday night premiere in Times Square. And, on Tuesday afternoon, the actor recorded an appearance on The Daily Show With Jon Stewart, where he discussed his exercise regimen, the importance of a healthy diet, and how he still has all his own hair at 53.
Stewart, who during his career has won two Peabody Awards for public service and the Orwell Award for “distinguished contribution to honesty and clarity in public language,” represented the most challenging interviewer Cruise has faced on the tour, during a challenging year for the actor. In April, HBO broadcast Alex Gibney’s documentary Going Clear, a film based on the book of the same title by Lawrence Wright exploring the Church of Scientology, of which Cruise is a high-profile member. The movie alleges, among other things, that the actor personally profited from slave labor (church members who were paid 40 cents an hour to outfit the star’s airplane hangar and motorcycle), and that his former girlfriend, the actress Nazanin Boniadi, was punished by the Church by being forced to do menial work after telling a friend about her relationship troubles with Cruise. For Cruise “not to address the allegations of abuse,” Gibney said in January, “seems to me palpably irresponsible.” But in The Daily Show interview, as with all of Cruise’s other appearances, Scientology wasn’t mentioned.
The authors in the running for Britain's most prestigious literary award come from seven countries and include seven women writers.
The longlist for the Man Booker Prize, one of the most prestigious literary awards, was announced Wednesday. For the second year, the prize was open to writers of any nationality who publish books in English in the U.K., and this year five American writers made the list of 13 contenders, chosen by five judges from a pool of 156 total works.
The U.S. is, in fact, the most well-represented country, with other entrants hailing from Great Britain, Jamaica, New Zealand, Nigeria, Ireland, and India. There are three debut novelists and one former winner on the list, and women writers outnumber men seven to six. From dystopian and political novels to a multitude of iterations on the family drama, the selections capture the ever-changing human experience in very different ways.
“If nobody respected the Taliban leadership anymore,” said one analyst, “then you have no one to talk to.”
Reports on Wednesday that reclusive Taliban leader Mullah Omar had died will be rightly hailed by some as the demise of an American nemesis. But the death of the one-eyed Afghan commander may also scuttle the most promising peace talks in Afghanistan in a decade.
Omar’s direct role in day-to-day Taliban operations had been declining for years, according to Western diplomats in Afghanistan. Even if he is alive, the former leader of Afghanistan is believed to be severely ill.
But the myth that surrounds Omar is a key element in determining whether peace talks can succeed. With the Islamic State and other jihadist groups vying for the loyalty of young Taliban fighters, it is unclear whether any leader except Omar can hold the movement together and then get its members to accept a peace settlement.
His press conference announcing murder charges had just one flaw: He understated how often police officers shoot unarmed people in traffic stops.
On Wednesday, as officials in Hamilton County, Ohio, released video footage of University of Cincinnati Police Officer Ray Tensing shooting unarmed motorist Samuel DuBose in the head during a traffic stop, prosecutor Joe Deters conducted himself as professionally and appropriately as any prosecutor I’ve ever seen in a similar situation.
The 30-year veteran, who announced that officer Tensing was being indicted for murder, took immediate care to affirmatively state that the victim in the case was not responsible for his fate. “This is the most asinine act I’ve ever seen a police officer make,” he told reporters. “People want to believe that Mr. DuBose had done something violent toward the officer; he did not. He did not at all. And I feel so sorry for his family and what they lost. And I feel sorry for the community, too.”
I agree: It’s why I wrote about how poorly iTunes performs for classical music listeners and, really, for anyone with a large music library.
But it’s worth spending time on iTunes’s specific design problems, which go beyond those of managing a music library or listening to a particular genre. Toxic hellstew though it may be, each new version of iTunes points to what kinds of technology are allowed to come out of Apple. Apple is the most valuable company in the world and an organization hailed for its good design. Why does iTunes fail at what it sets out to do?