Ten years ago today the U.S. began its invasion of Iraq. I argue that it was the worst strategic mistake since the end of World War II, and probably the biggest "unforced error" in American history.
Even as I've been ladling out the 10-years-after installments, I have very little faith or even hope that this ruinous decision will prove "instructive" in any way. Here is why:
1) Avoidance. After Pearl Harbor, after World War II, after Vietnam, after the 9/11 attacks, even after civilian disasters like the Challenger explosion or Katrina, there were official efforts, of varying seriousness and success, to find out what had gone wrong, and why, and to draw "lessons learned."
'Like infants, they live in a continuous present'
That hasn't happened this time, for a lot of reasons. For the Bush Administration, there was no "failure" to be examined and explained. For the Obama Administration, the point was to "look forward not back."
People in the media and politics who were against the war know that it can grow tiresome to keep pointing that out. Example: Barack Obama would not be president today if he had not given a speech in Chicago in October 2002, saying that he (as a mere state senator) did not oppose all wars but was against a "dumb" and "rash" war in Iraq. Listen to how he talked in those days! He denounced "the cynical attempt by Richard Perle and Paul Wolfowitz and other armchair, weekend warriors in this administration to shove their own ideological agendas down our throats, irrespective of the costs in lives lost and in hardships borne." Because of that speech, six years later Obama could argue that his judgment had been right, and the vastly more experienced Hillary Clinton's had been wrong, about matters of war and peace. But there's no percentage for him in bringing that up now.
People in the media who were for the war have, with rare and admirable exceptions, avoided looking back. The Washington Post's editorial page was one of the most strident pro-war voices, part of a claque creating -- as I recall and noted at the time -- a kind of war frenzy in the capital. There is not a word about Iraq on its editorial page today (check it out for yourself). Say this for Paul Wolfowitz: While he didn't come close on this past week's talk shows to engaging Andrew Bacevich's challenge [which Harper's has now opened for non-subscribers], at least he recognized Iraq as a question he would have to address. George Packer was one of several influential "liberal hawks" who were making a pro-war case in the New Yorker. I view, and viewed, that era and its choices very differently from him. (For instance, he now says, "Spending a lot of time in Iraq did not make you" -- meaning himself -- "more keenly aware of America's larger strategic interests. It rendered you less likely to ask the essential questions about the inception of the war.") But I am glad he addresses the issue today.
2) The 'continuous present.' Our friend Mike Lofgren argues in the Huffington Post that all factions in politics and the media have not simply "failed" to learn. They live in a system that rewards not learning. For instance, he says:
Aside from its inordinate fiscal and human cost, deposing Saddam Hussein and installing a Shia-led government has had the effect of strengthening the regional position of Iran. But having built up the Iranian bogey through its own stupidity, the U.S. political establishment is now contemplating how to coerce Teheran. This refusal to see the consequences of one's actions, and then using the disastrous result as an excuse to do the same thing again, is a recurring pattern of American statecraft.
One can hypothesize that our leaders see world events as discrete and unconnected with anything that happened before; like infants, they live in a continuous present.
3) The recurring pattern of error. When politicians and the media were "wrong" about Iraq, what did wrongness entail? Reduced to its essence, it meant:
Exaggerating the scale and imminence of a threat from Iraq;
Growing testily impatient with any solutions other than the "kinetic" (e.g., from The New Yorker 10 years ago, "a return to a hollow pursuit of containment will be the most dangerous option of all.");
Grossly underestimating the difficulty of "removing" that threat with military force;
Showing a failure of tragic imagination (different from a tragic failure of imagination, which was also true) about the ripple effects and long-term costs and consequences of taking a clear and "decisive" step now.
If we were to "learn" from mistakes, we might avoid this specific set of biases and miscalibrations when it comes to another "preventive" strike against another threatening nation in exactly the same part of the world. But we see every one of these four elements of this syndrome -- exaggeration, impatience, Pollyanna-ism about military measures, naivete about long-term effects -- in discussions about the "need" and "moral duty" to condone military action against Iran.
Of course Iran and Iraq are different; the challenges are different; the details of military action are different. But the similarities are even greater -- and whether we can bear them in mind as we contemplate the "next war" will say a lot about whether it is ever possible to learn.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.