Ten years ago today the U.S. began its invasion of Iraq. I argue that it was the worst strategic mistake since the end of World War II, and probably the biggest "unforced error" in American history.
Even as I've been ladling out the 10-years-after installments, I have very little faith or even hope that this ruinous decision will prove "instructive" in any way. Here is why:
1) Avoidance. After Pearl Harbor, after World War II, after Vietnam, after the 9/11 attacks, even after civilian disasters like the Challenger explosion or Katrina, there were official efforts, of varying seriousness and success, to find out what had gone wrong, and why, and to yield "lessons learned."
'Like infants, they live in a continuous present'
That hasn't happened this time, for a lot of reasons. For the Bush Administration, there was no "failure" to be examined and explained. For the Obama Administration, the point was to "look forward, not back."
People in the media and politics who were against the war know that it can grow tiresome to keep pointing that out. Example: Barack Obama would not be president today if he had not given a speech in Chicago in October 2002, saying that he (as a mere state senator) did not oppose all wars but was against a "dumb" and "rash" war in Iraq. Listen to how he talked in those days! He denounced "the cynical attempt by Richard Perle and Paul Wolfowitz and other armchair, weekend warriors in this administration to shove their own ideological agendas down our throats, irrespective of the costs in lives lost and in hardships borne." Because of that speech, six years later Obama could argue that his judgment had been right, and the vastly more experienced Hillary Clinton's had been wrong, about matters of war and peace. But there's no percentage for him in bringing that up now.
People in the media who were for the war have, with rare and admirable exceptions, avoided looking back. The Washington Post's editorial page was one of the most strident pro-war voices, part of a claque creating -- as I recall and noted at the time -- a kind of war frenzy in the capital. There is not a word about Iraq on its editorial page today (check it out for yourself). Say this for Paul Wolfowitz: While he didn't come close on this past week's talk shows to engaging Andrew Bacevich's challenge [which Harper's has now opened for non-subscribers], at least he recognized Iraq as a question he would have to address. George Packer was one of several influential "liberal hawks" who were making a pro-war case in the New Yorker. I view, and viewed, that era and its choices very differently from him. (For instance, he now says, "Spending a lot of time in Iraq did not make you" -- meaning himself -- "more keenly aware of America's larger strategic interests. It rendered you less likely to ask the essential questions about the inception of the war.") But I am glad he addresses the issue today.
2) The 'continuous present.' Our friend Mike Lofgren argues in the Huffington Post that all factions in politics and the media have not simply "failed" to learn. They live in a system that rewards not learning. For instance, he says:
Aside from its inordinate fiscal and human cost, deposing Saddam Hussein and installing a Shia-led government has had the effect of strengthening the regional position of Iran. But having built up the Iranian bogey through its own stupidity, the U.S. political establishment is now contemplating how to coerce Teheran. This refusal to see the consequences of one's actions, and then using the disastrous result as an excuse to do the same thing again, is a recurring pattern of American statecraft.
One can hypothesize that our leaders see world events as discrete and unconnected with anything that happened before; like infants, they live in a continuous present.
3) The recurring pattern of error. When politicians and the media were "wrong" about Iraq, what did wrongness entail? Reduced to its essence, it meant:
Exaggerating the scale and imminence of a threat from Iraq;
Growing testily impatient with any solutions other than the "kinetic" (e.g., from The New Yorker 10 years ago: "a return to a hollow pursuit of containment will be the most dangerous option of all");
Grossly underestimating the difficulty of "removing" that threat with military force;
Showing a failure of tragic imagination (different from a tragic failure of imagination, which was also true) about the ripple effects and long-term costs and consequences of taking a clear and "decisive" step now.
If we were to "learn" from mistakes, we might avoid this specific set of biases and miscalibrations when it comes to another "preventive" strike against another threatening nation in exactly the same part of the world. But we see every one of these four elements of this syndrome -- exaggeration, impatience, Pollyanna-ism about military measures, naiveté about long-term effects -- in discussions about the "need" and "moral duty" to condone military action against Iran.
Of course Iran and Iraq are different; the challenges are different; the details of military action are different. But the similarities are even greater -- and whether we can bear them in mind as we contemplate the "next war" will say a lot about whether it is ever possible to learn.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.
What would the American culture wars look like if they were less about “values” and more about Jesus?
Evangelical Christianity has long had a stranglehold on how Americans imagine public faith. Vague invocations of “religion”—whether it’s “religion vs. science” or “religious freedom”—usually really mean “conservative, Protestant, evangelical Christianity,” and this assumption inevitably frames debates about American belief. For the other three-quarters of the population—Catholics, Jews, other Protestants, Muslims, Hindus, secular Americans, Buddhists, Wiccans, etc.—this can be infuriating. For some evangelicals, it’s a sign of success, a linguistic triumph of the culture wars.
But not for Russell Moore. In 2013, the 43-year-old theologian became the head of the Ethics and Religious Liberty Commission, the political nerve center of the Southern Baptist Convention. His predecessor, Richard Land, prayed with George W. Bush, played hardball with Democrats, and helped make evangelicals a quintessentially Republican voting bloc.
The winners of the 27th annual National Geographic Traveler Photo Contest have just been announced.
Winning first prize, Anuar Patjane Floriuk of Tehuacán, Mexico, will receive an eight-day photo expedition for two to Costa Rica and the Panama Canal for a photograph of divers swimming near a humpback whale off the western coast of Mexico. Here, National Geographic has shared all of this year’s winners, gathered from four categories: Travel Portraits, Outdoor Scenes, Sense of Place, and Spontaneous Moments. Captions by the photographers.
Many psychiatrists believe that a new approach to diagnosing and treating depression—linking individual symptoms to their underlying mechanisms—is needed for research to move forward.
In his Aphorisms, Hippocrates defined melancholia, an early understanding of depression, as a state of “fears and despondencies, if they last a long time.” It was caused, he believed, by an excess of bile in the body (the word “melancholia” is ancient Greek for “black bile”).
Ever since then, doctors have struggled to create a more precise and accurate definition of an illness that still isn’t well understood. In the 1920s, the German psychiatrist Kurt Schneider argued that depression could be divided into two separate conditions, each requiring a different form of treatment: depression that resulted from changes in mood, which he called “endogenous depression,” and depression resulting from reactions to outside events, or “reactive depression.” His theory was challenged in 1926, when the British psychiatrist Edward Mapother argued in the British Medical Journal that there was no evidence for two distinct types of depression, and that the apparent differences between depression patients were just differences in the severity of the condition.
Before it became the New World, the Western Hemisphere was vastly more populous and sophisticated than has been thought—an altogether more salubrious place to live at the time than, say, Europe. New evidence of both the extent of the population and its agricultural advancement leads to a remarkable conjecture: the Amazon rain forest may be largely a human artifact.
The plane took off in weather that was surprisingly cool for north-central Bolivia and flew east, toward the Brazilian border. In a few minutes the roads and houses disappeared, and the only evidence of human settlement was the cattle scattered over the savannah like jimmies on ice cream. Then they, too, disappeared. By that time the archaeologists had their cameras out and were clicking away in delight.
Below us was the Beni, a Bolivian province about the size of Illinois and Indiana put together, and nearly as flat. For almost half the year rain and snowmelt from the mountains to the south and west cover the land with an irregular, slowly moving skin of water that eventually ends up in the province's northern rivers, which are sub-subtributaries of the Amazon. The rest of the year the water dries up and the bright-green vastness turns into something that resembles a desert. This peculiar, remote, watery plain was what had drawn the researchers' attention, and not just because it was one of the few places on earth inhabited by people who might never have seen Westerners with cameras.
Paul faced danger, Ani and Ray faced each other, and Frank faced some career decisions.
This is what happens when you devote two-thirds of a season to scene after scene after scene of Frank and Jordan’s Baby Problems, and Frank Shaking Guys Down, and Look How Fucked Up Ray and Ani Are, and Melancholy Singer in the Dive Bar Yet Again—and then you suddenly realize that with only a couple episodes left you haven’t offered even a rudimentary outline of the central plot.
What if Joe Biden is going to run for the Democratic nomination after all?
Most Democrats seem ready for Hillary Clinton—or at least appear content with her candidacy. But what about the ones who were bidin’ for Biden? There are new signs the vice president might consider running for president after all.
Biden has given little indication that he is exploring a run: There’s no super PAC, no cultivation of a network of fundraisers or grassroots organizers, few visits to early-primary states. While his boss hasn’t endorsed Clinton—and says he won’t endorse in the primary—many members of the Obama administration have gone to work for Clinton, including some close to Biden.
But Biden also hasn’t given any clear indication that he isn’t running, and a column by Maureen Dowd in Saturday’s New York Times has set off new speculation. One reason Biden didn’t get into the race was that his son Beau was dying of cancer, and the vice president was focused on being with his son. But before he died in May, Dowd reported, Beau Biden tried to get his father to promise to run. Now Joe Biden is considering the idea.
The jobs that are least vulnerable to automation tend to be held by women.
Many economists and technologists believe the world is on the brink of a new industrial revolution, in which advances in the field of artificial intelligence will render human labor obsolete at an unforgiving pace. Two Oxford researchers recently analyzed the skills required for more than 700 different occupations to determine how many of them would be susceptible to automation in the near future, and the news was not good: They concluded that machines are likely to take over 47 percent of today’s jobs within a few decades.
This is a dire prediction, but one whose consequences will not fall upon society evenly. A close look at the data reveals a surprising pattern: The jobs performed primarily by women are relatively safe, while those typically performed by men are at risk.
Put simply: Climate change poses the threat of global catastrophe. The planet isn’t just getting hotter; it’s destabilizing. Entire ecosystems are at risk. The future of humanity is at stake.
Scientists warn that extreme weather will get worse and huge swaths of coastal cities will be submerged by ever-more-acidic oceans. All of which raises a question: If climate change continues at this pace, is anywhere going to be safe?
“Switzerland would be a good guess,” said James Hansen, the director of climate science at Columbia University’s Earth Institute. Hansen’s latest climate study warns that climate change is actually happening faster than computer models previously predicted. He and more than a dozen co-authors found that sea levels could rise at least 10 feet in the next 50 years. Slate points out that although the study isn’t yet peer-reviewed, Hansen is “known for being alarmist and also right.”
Over the last few days, biologists, ecologists, and other scientists have been sharing mistakes and mishaps they’ve made in the wilderness: in other words, their #fieldworkfails. They are wonderful. I’ve posted some below, but I also emailed some of the participants to find out more about their misadventures.
“I glued my finger to the croc while attaching a transmitter with an instant glue,” Staniewicz, now a Ph.D. student at the University of Bristol, told me. “And then [I] spent a couple of minutes carefully detaching my finger from the croc and trying to keep the transmitter fastened while the local fishermen watched and laughed.”
Writing used to be a solitary profession. How did it become so interminably social?
Whether we’re behind the podium or awaiting our turn, numbing our bottoms on the chill of metal foldout chairs or trying to work some life into our terror-stricken tongues, we introverts feel the pain of the public performance. This is because there are requirements to being a writer. Other than being a writer, I mean. Firstly, there’s the need to become part of the writing “community,” which compels every writer who craves self-respect and success to attend community events, help to organize them, buzz over them, and—despite blitzed nerves and staggering bowels—present and perform at them. We get through it. We bully ourselves into it. We dose ourselves with beta blockers. We drink. We become our own worst enemies for a night of validation and participation.