It is safe to say that Paul Krugman is much smarter than I am, and that he understands more economics than I do. He generates a great deal of incisive analysis about the economy, and has often had a gift for stabbing straight through to the one underlying piece of data that gives the lie to an otherwise plausible economic theory.
I want to get that out of the way, because otherwise my readers (left and right) might assume that this post is a "libertarian economics blogger makes fun of liberal economist's poor reasoning skills" special, and that's not at all why I'm writing it. Paul Krugman is a brilliant and interesting analyst. He also, like everyone else, can be wrong.
There's an interesting phenomenon that often happens when I blog something critical of Paul Krugman: some of his bigger fans turn up in my comments to argue that I am not worthy to talk, because Paul Krugman is a brilliant, insightful analyst who has forgotten more economics than I will ever learn--all undoubtedly true. Over and over, they say, Paul Krugman gets it right when other commentators get it wrong. And as proof of this rare perspicacity, they offer the fact that . . . Paul Krugman called the housing bubble in May 2005.
There is rich irony in the belief that Paul Krugman must be right, and I must be wrong, because he had the foresight to call the housing bubble. That's because I saw it in 2002. As you can see, I blogged quite a bit about it before Paul Krugman wrote his first column on the topic. Neither of us, as far as I can tell, understood what that meant for the financial system. But both of us saw it coming, me a little sooner.
This is not that surprising, actually. Lots of people saw it coming. You hear people asking a lot where the financial journalists were--how they could have missed the housing bubble--and the answer is that they didn't! The Economist was writing about it even before I did, thanks to Pam Woodall, the brilliant economics editor who really may have been the first commentator to identify the global phenomenon. Housing bubble stories and op-eds regularly appeared in newspapers like, well, The New York Times. But most people weren't reading the financial press (or this blog) in 2005, and so when they discover that Paul Krugman was writing about the housing bubble way back then, it seems like amazing foresight.
Meanwhile, today I stumbled across another example of Paul Krugman's "foresight", via David Henderson. Chris Alden, a co-founder of Red Herring, blogs about an article Krugman wrote for them back in the 1990s:
He went on to make some specific predictions, all of which were either mostly or completely wrong:
"Productivity will drop sharply this year."
Nope - didn't happen. In fact productivity continued to improve, as this chart shows:
"Inflation will be back. ...In 1999 inflation will probably be more than 3 percent; with only moderate bad luck--say, a drop in the dollar--it could easily top 4 percent."
"Within two or three years, the current mood of American triumphalism--our belief that we have pulled economically and technologically ahead of the rest of the world--will evaporate."
Nope -- that didn't happen, either. Though September 11th, which came more than three years after this article, and the Lehman Brothers collapse, which came more than ten years after it, have certainly reduced American triumphalism. Here is where I think Krugman may have been the most right, albeit way too early.
"The growth of the Internet will slow drastically, as the flaw in 'Metcalfe's law'--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other!
"By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's."
"As the rate of technological change in computing slows, the number of jobs for IT specialists will decelerate, then actually turn down; ten years from now, the phrase information economy will sound silly."
"Sometime in the next 20 years, maybe sooner, there will be another '70s-style raw-material crunch: a disruption of oil supplies, a sharp run-up in agricultural prices, or both."
Meh. While we have seen oil prices spike (although they have yet to reach the annual peak we saw in 1980), this was not due to a crunch, a disruption, or running out of oil, but rather to growth in demand.
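As an aside, the quadratic growth that Metcalfe's law leans on is just pairwise counting: n participants can form n(n-1)/2 potential connections. A minimal sketch (the function name is mine, purely for illustration):

```python
# Metcalfe's law: potential pairwise connections in a network of n
# participants grow quadratically, as n * (n - 1) / 2.

def potential_connections(n: int) -> int:
    """Number of distinct pairs among n participants."""
    return n * (n - 1) // 2

# Tenfold more participants means roughly a hundredfold more connections.
for n in (10, 100, 1000):
    print(n, potential_connections(n))
```

Krugman's objection was not to the arithmetic but to its premise: the formula counts *potential* connections, and he bet that most of them would be worthless because "most people have nothing to say to each other."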
I'm inclined to be more charitable than Alden on a couple of these, but there's no question that Krugman got some things really, really wrong.
But it doesn't follow that Krugman is an idiot who should get no respect--any more than calling the housing bubble made him an infallible genius. Krugman remains a giant intellect who is well worth reading on virtually any economic topic. He is also capable of being badly wrong about things.
You often hear people complain that pundits or analysts aren't punished for getting things wrong. But this is why they aren't: everyone gets things wrong. The question "How can you expect us to listen to Pundit Y when he got everything wrong, and our guy called things correctly" only reveals that the person asking it has managed to forget all the blunders "our guy" made.
What pundits give you is not a perfect map of the future--the only people who succeed in that are characters in historical novels written by an author who already knows what happened. What's important is their thought process--do they point you to arguments you hadn't considered? Do they find data you ought to know about? Do they force you to challenge your own decisions?
Paul Krugman succeeds on that score, even if his crystal ball is a little cloudy.
According to Arthur, just a few months later, all 60 members of a committee selected by the American Dialect Society voted “to google” 2002’s most useful new word. Merriam-Webster and the Oxford English Dictionary would soon note the coinage. By 2006, Google’s lawyers—fearful of seeing the company’s brand name watered down to the trademark mushiness of kleenex—wrote a post for the company blog outlining when “to google” should and should not be used.
From the “400-pound” hacker to Alicia Machado, the candidate’s denigration of fat people has a long tradition—but may be a liability.
One of the odder moments of Monday’s presidential debate came when Donald Trump speculated that the DNC had been hacked not by Russia but by “someone sitting on their bed that weighs 400 pounds.” He was trying to suggest the crime had been committed by someone unaffiliated with a government—but why bring up fatness?
Weight seems to be one of Trump’s preoccupations. The debate and its fallout highlighted how he publicly ridiculed the Miss Universe winner Alicia Machado as “Miss Piggy” and an “eating machine,” and how he called Rosie O’Donnell a “fat pig” with “a fat, ugly face” (“I think everyone would agree that she deserves it and nobody feels sorry for her,” he said onstage Monday). He also recently poked fun at his ally Chris Christie’s weight-loss struggles and called out a protestor as “seriously overweight.” And when he was host of The Apprentice, he insisted on keeping a “funny fat guy” on the show, according to one of its producers.
The biggest threat to the Republican nominee is not his poor performance in the debate, but his reaction to it: blaming microphones, insisting he won, and doubling down on gaffes.
Debates seldom make a great deal of difference to the outcome of the election. Mitt Romney’s dominating performance in the first debate four years ago? Didn’t stop Obama’s reelection. Gerald Ford’s “no Soviet domination of Eastern Europe” gaffe in 1976? He rose after it.
Sure, it’s better to win than to lose, but the historical record is a good reminder of why Hillary Clinton’s strong performance in Monday’s debate could have a limited effect on the election’s outcome. If it does have a lasting impact, however, it will likely be due not to what happened on stage at Hofstra University, but to Donald Trump’s frenetic crisis-communications strategy.
This is a pattern amply seen before in the election: Trump gets caught in a tight spot, and rather than de-escalate, he tends to take out the bellows and fan the flames as much as he can. Time and again, he has managed to overtake a news cycle (and often overshadow bad news about Clinton) thanks to bad crisis management. It’s what he did in his tiff with Khizr and Ghazala Khan, and so far it’s his post-debate strategy, too.
In North Carolina, the Democratic candidate basked in her debate victory. As for her supporters, they’re feeling better, but they’re not ready to exhale.
RALEIGH, N.C.— "Did anybody see that debate last night? Ooooh yes," Hillary Clinton said, her first words after striding confidently out on stage at Wake Technical Community College Tuesday afternoon.
As a capacity crowd cheered, she added, "One down, two to go."
Celebration and relief added to the thick humidity of late summer at Clinton’s event in North Carolina. Post-debate analysis is in that awkward in-between state, after the pundits have rendered their verdicts and before high-quality polling has measured the nation’s response. But the Democratic nominee seemed sure that she was the victor.
It was Clinton’s first event after the first presidential debate Monday evening in Hempstead, New York. One sign of her confidence coming out of that encounter: As I approached the rally, a man asked for a hand loading a heavy box into his car. He was the teleprompter man, he said, but when he arrived in Raleigh, he’d been told that Clinton had decided to do without the prompter. He was turning around and heading back to Washington, D.C.
In a unique, homespun experiment, researchers found that centripetal force could help people pass kidney stones—before they become a serious health-care cost.
East Lansing, Michigan, becomes a ghost town during spring break. Families head south, often to the theme parks in Orlando. A week later, the Midwesterners return sunburned and bereft of disposable income, and, urological surgeon David Wartinger noticed, some also come home with fewer kidney stones.
Wartinger is a professor emeritus at Michigan State, where he has dealt for decades with the scourge of kidney stones, which affect around one in 10 people at some point in life. Most are small, and they pass through us without issue. But many linger in our kidneys and grow, sending hundreds of thousands of people to emergency rooms and costing around $3.8 billion every year in treatment and extraction. The pain of passing a larger stone is often compared to childbirth.
For decades, the candidate has willfully inflicted pain and humiliation.
Donald J. Trump has a cruel streak. He willfully causes pain and distress to others. And he repeats this public behavior so frequently that it’s fair to call it a character trait. Any single example would be off-putting but forgivable. Being shown many examples across many years should make any decent person recoil in disgust.
Judge for yourself if these examples qualify.
* * *
In national politics, harsh attacks are to be expected. I certainly don’t fault Trump for calling Hillary Clinton dishonest, or wrongheaded, or possessed of bad judgment, even if it’s a jarring departure from the glowing compliments that he used to pay her.
But even in a realm where the harshest critiques are part of the civic process, Trump crossed a line this week when he declared his intention to invite Gennifer Flowers to today’s presidential debate. What kind of man invites a husband’s former mistress to an event to taunt his wife? Trump managed to launch an attack that couldn’t be less relevant to his opponent’s qualifications or more personally cruel. His campaign and his running mate later said that it was all a big joke. No matter. Whether in earnest or in jest, Trump showed his tendency to humiliate others.
The films touted for consideration this year include prestige projects like Martin Scorsese’s Silence and festival hits like Barry Jenkins’s Moonlight.
With the main film festivals of the fall (Telluride, Venice, and Toronto) now concluded, and Martin Scorsese finally confirming that his much-anticipated drama Silence will come out at the end of the year, the next three months will bring a calendar loaded with prestige releases. Among them are films that better reflect the wide range of faces and voices in America (and around the world), which have recently been severely under-represented on Oscar night. Audiences and critics will be paying especially close attention to the works and actors the Academy chooses to recognize, after the awards were condemned this year for nominating only white performers two years in a row.
The question, as always, is which films will be able to stand out once studios begin their awards campaigns in earnest. A lot can happen in a few months; after all, the season has already seen its earliest anointed front-runner practically disappear from the race. The former Best Picture favorite was the big story out of Sundance: The Birth of a Nation (October 7), a searing depiction of Nat Turner’s 1831 slave rebellion in Virginia written and directed by Nate Parker. The film won the festival’s Grand Jury Prize just as the conversation over the largely white Oscar nominations was at its loudest. The movie was acquired by Fox Searchlight for a record $17.5 million, with the studio promising a huge publicity campaign in the fall to help push it for awards contention.
Programs that should be crafted around people’s needs are instead designed to deal with a problem that doesn’t exist.
At a campaign rally in 1976, Ronald Reagan introduced the welfare queen into the public conversation about poverty: “She used 80 names, 30 addresses, 15 telephone numbers to collect food stamps, Social Security, veterans’ benefits for four nonexistent deceased veteran husbands, as well as welfare. Her tax-free cash income alone has been running $150,000 a year.”
The perception of who benefits from a policy is of material consequence to how it is designed. For the past 40 years, U.S. welfare policy has been designed around Reagan’s mythical welfare queen—with very real consequences for actual families in need of support.
Though it was Reagan who gave her the most salient identity, the welfare queen emerged from a long and deeply racialized history of suspicion of and resentment toward families receiving welfare in the United States. Today, 20 years after welfare reform was enacted, this narrative continues to inform policy design by dictating who is “deserving” of support and under what conditions. Ending the reign of the welfare queen over public policy means recognizing this lineage, identifying how these stereotypes continue to manifest, and reorienting policy design around families as they are—not who they are perceived to be.
Congress voted overwhelmingly to disregard the president’s rejection of legislation allowing 9/11 victims to sue a foreign government in U.S. court.
Updated on September 28 at 4:27 p.m.
For the first time in President Obama’s two terms in the White House, Congress has enacted legislation without his signature.
The House and Senate on Wednesday voted by a wide margin to override Obama’s veto of a bill that would allow victims of the September 11, 2001, attacks to sue a foreign government—namely, Saudi Arabia—in U.S. court, even if it had not been designated a state sponsor of terrorism. The president, in rejecting the measure, had warned that undercutting the principle of “sovereign immunity” could lead to retaliation against U.S. interests abroad, including by countries that would try to bring legal action against American soldiers and diplomats overseas.