It is safe to say that Paul Krugman is much smarter than I am, and that he understands more economics than I do. He generates a great deal of incisive analysis about the economy, and he has a gift for stabbing straight through to the one underlying piece of data that gives the lie to an otherwise plausible economic theory.
I want to get that out of the way, because otherwise my readers (left and right) might assume that this post is a "libertarian economics blogger makes fun of liberal economist's poor reasoning skills" special, and that's not at all why I'm writing it. Paul Krugman is a brilliant and interesting analyst. He also, like everyone else, can be wrong.
There's an interesting phenomenon that often happens when I blog something critical of Paul Krugman: some of his bigger fans turn up in my comments to argue that I am not worthy to talk, because Paul Krugman is a brilliant, insightful analyst who has forgotten more economics than I will ever learn--all undoubtedly true. Over and over, they say, Paul Krugman gets it right when other commentators get it wrong. And as proof of this rare perspicacity, they offer the fact that . . . Paul Krugman called the housing bubble in May 2005.
There is rich irony in the belief that Paul Krugman must be right, and I must be wrong, because he had the foresight to call the housing bubble. That's because I saw it in 2002. As you can see, I blogged quite a bit about it before Paul Krugman wrote his first column on the topic. Neither of us, as far as I can tell, understood what that meant for the financial system. But both of us saw it coming, me a little sooner.
This is not that surprising, actually. Lots of people saw it coming. You hear people asking a lot where the financial journalists were--how they could have missed the housing bubble--and the answer is that they didn't! The Economist was writing about it even before I was, thanks to Pam Woodall, the brilliant economics editor who really may have been the first commentator to identify the global phenomenon. Housing bubble stories and op-eds regularly appeared in newspapers like, well, The New York Times. But most people weren't reading the financial press (or this blog) in 2005, and so when they discover that Paul Krugman was writing about the housing bubble way back then, it seems like amazing foresight.
Meanwhile, today I stumbled across another example of Paul Krugman's "foresight", via David Henderson. Chris Alden, a co-founder of Red Herring, blogs about an article Krugman wrote for them back in the 1990s:
He went on to make some specific predictions, all of which were either mostly or completely wrong:
"Productivity will drop sharply this year."
Nope - didn't happen. In fact, productivity continued to improve, as the chart in his post shows.
"Inflation will be back. ...In 1999 inflation will probably be more than 3 percent; with only moderate bad luck--say, a drop in the dollar--it could easily top 4 percent."
"Within two or three years, the current mood of American triumphalism--our belief that we have pulled economically and technologically ahead of the rest of the world--will evaporate."
Nope -- that didn't happen, either. Though September 11th, which happened more than three years after this article, and the Lehman Brothers collapse, which happened more than 10 years after this article was written, have certainly reduced American triumphalism. Here is where I think Krugman may have been the most right, albeit way too early.
"The growth of the Internet will slow drastically, as the flaw in 'Metcalfe's law'--which states that the number of potential connections in a network is proportional to the square of the number of participants--becomes apparent: most people have nothing to say to each other!
"By 2005 or so, it will become clear that the Internet's impact on the economy has been no greater than the fax machine's."
"As the rate of technological change in computing slows, the number of jobs for IT specialists will decelerate, then actually turn down; ten years from now, the phrase information economy will sound silly."
"Sometime in the next 20 years, maybe sooner, there will be another '70s-style raw-material crunch: a disruption of oil supplies, a sharp run-up in agricultural prices, or both."
Meh. While we have seen oil prices spike (although they have yet to reach the annual peak we saw in 1980), this was not due to a crunch or disruption (or running out of oil) but rather to growth in demand.
I'm inclined to be more charitable than Alden on a couple of these, but there's no question that Krugman got some things really, really wrong.
But it doesn't follow that Krugman is an idiot who should get no respect--any more than calling the housing bubble made him an infallible genius. Krugman remains a giant intellect who is well worth reading on virtually any economic topic. He is also capable of being badly wrong about things.
You often hear people complain that pundits or analysts aren't punished for getting things wrong. But this is why they aren't: everyone gets things wrong. The question "How can you expect us to listen to Pundit Y when he got everything wrong, and our guy called things correctly?" only reveals that the person asking it has managed to forget all the blunders "our guy" made.
What pundits give you is not a perfect map of the future--the only people who succeed in that are characters in historical novels written by an author who already knows what happened. What's important is their thought process--do they point you to arguments you hadn't considered? Do they find data you ought to know about? Do they force you to challenge your own decisions?
Paul Krugman succeeds on that score, even if his crystal ball is a little cloudy.