Here is something other than The Sequester to think about at the beginning of March:
This month marks ten years since the U.S. launched its invasion of Iraq. In my view this was the biggest strategic error by the United States since at least the end of World War II and perhaps over a much longer period. Vietnam was costlier and more damaging, but also more understandable. As many people have chronicled, the decision to fight in Vietnam was a years-long accretion of step-by-step choices, each of which could be rationalized at the time. Invading Iraq was an unforced, unnecessary decision to risk everything on a "war of choice" whose costs we are still paying.
My reasons for bringing this up:
1) Reckoning. Anyone now age 30 or above should probably reflect on what he or she got right and wrong ten years ago.
I feel I was right in arguing, six months before the war, in "The Fifty-First State" that invading Iraq would bring on a slew of complications and ramifications that would take at least a decade to unwind.
I feel not "wrong" but regretful for having resigned myself even by that point to the certainty that war was coming. We know, now, that within a few days of the 9/11 attacks many members of the Bush Administration had resolved to "go to the source," in Iraq. Here at the magazine, it was because of our resigned certainty about the war that Cullen Murphy, then serving as editor, encouraged me in early 2002 to begin an examination of what invading and occupying Iraq would mean. The resulting article was in our November 2002 issue; we put it online in late August in hopes of influencing the debate.
My article didn't come out and say as bluntly as it could have: we are about to make a terrible mistake we will regret and should avoid. Instead I couched the argument as cautionary advice. We know this is coming, and when it does, the results are going to be costly, damaging, and self-defeating. So we should prepare and try to diminish the worst effects (for Iraq and for us). This form of argument reflected my conclusion that the wheels were turning and that there was no way to stop them. Analytically, that was correct: Tony Blair or Colin Powell might conceivably have slowed the momentum, if either of them had turned anti-war in time, but few other people could have. Still, I'd feel better now if I had pushed the argument even harder at the time.
For the record, Michael Kelly, who had been editor of the magazine and was a passionate advocate of the need for war, allowed us to undertake this project and put it on the cover even though he disagreed. Soon thereafter he was in Iraq, as an embedded reporter with the 3rd Infantry Division; in an incredible tragedy he was killed during the invasion's early phase.
2) Accountability. For a decade or more after the Vietnam War, the people who had guided the U.S. to disaster decently shrank from the public stage. Robert McNamara did worthy penance at the World Bank. Rusk, Rostow, and Westmoreland were not declaiming on what the U.S. should and should not do.
After Iraq, there has been a weird amnesty and amnesia about people's misjudgment on the most consequential decision of our times. Hillary Clinton lost the 2008 primary race largely because she had been "wrong" on Iraq and Barack Obama had been "right." But Cheney, Rumsfeld, Wolfowitz, Bremer, Rice, McCain, Abrams, and others, including the pro-war press claque, are still offering their judgments unfazed. In his post-presidential reticence, George W. Bush has been an honorable exception.
I don't say these people should never again weigh in. But there should be an asterisk on their views, like the fine print about side effects in pharmaceutical ads.
3) Honor. Say this for Al Gore: He was forthright, he was early, and he was right about Iraq.
4) Liberal hawks. Say this about the "liberal hawk" faction of 2002-2003: unlike, say, Peter Beinart, too few of them have reckoned with what they got wrong then, or with how hard many of them pushed the "justice" and "duty" of invading, not to mention its feasibility. It would be good to hear from more of them, ten years on.
5) Threat inflation. As I think about this war and others the U.S. has contemplated or entered during my conscious life, I realize how strong is the recurrent pattern of threat inflation. Exactly once in the post-WW II era has the real threat been more ominous than officially portrayed. That was during the Cuban Missile Crisis in 1962, when the world really came within moments of nuclear destruction.
Otherwise: the "missile gap." The Gulf of Tonkin. The overall scale of the Soviet menace. Iraq. In each case, the public soberly received official warnings about the imminent threat. In cold retrospect, those warnings were wrong -- or contrived, or overblown, or misperceived. Official claims about the evils of these systems were often justified. Claims about imminent threats were, most of the time, hyped.
Which brings me to:
6) Iran. Most of the people now warning stridently about the threat from Iran warned stridently about Iraq ten years ago. That doesn't prove they are wrong this time too. But it's a factor to be weighed. Most of the technical warnings we are getting about Iran's capabilities are like those we got about Saddam's. That doesn't prove they are wrong again. But it's a factor.
Purportedly authoritative inside reports, replete with technical details about "yellowcake" or aluminum tubes, had an outsized role in convincing people of the threat from Iraq. We wish now that more people had looked harder at those claims. If you'd like to see someone looking hard at similar technical claims about Iran, please check out the Bulletin of the Atomic Scientists, where Yousaf Butt argues that the latest warnings mean less than they seem. Also from the Bulletin, a previous debunking, and a proposal for a negotiated endgame with Iran.
Again: like most of humanity, I can't judge these nuclear-technology arguments myself. But the long history of crying-wolf warnings, in some cases by the same people now most alarmist about Iran, puts a major burden of proof on those claiming imminent peril.
7) Clarity. I said earlier that I regretted not being more direct and blunt in saying: Don't go into Iraq. For more than eight years, I've tried to argue very directly that a preemptive military strike on Iran would be an enormous mistake on all levels for either Israel or the United States. Strategically it could only cement in Iranian hostility for the long run. Tactically, every professional soldier -- Israeli, American, or otherwise -- who has examined the practicalities of such a mission has warned that it would be folly.
Lest the soldiers seem too gloomy, several U.S. Senators are working on a resolution committing the U.S. to lend its military and diplomatic support if PM Netanyahu decides, against the advice of most of his own military establishment, to attack. It would be bad enough if Netanyahu got his own country into this bind; there is no precedent for the U.S. delegating to any ally the decision to commit our troops to an attack. It would be different from NATO-style treaty obligations for mutual defense.
There is more ahead about Israeli, Iranian, and American negotiating strategies, but this is enough for now. It's also as much as I can manage before recovering from the flight from DC to Beijing.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
ASPEN, Colo.—At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
The untold story of the improbable campaign that finally tipped the U.S. Supreme Court.
On May 18, 1970, Jack Baker and Michael McConnell walked into a courthouse in Minneapolis, paid $10, and applied for a marriage license. The county clerk, Gerald Nelson, refused to give it to them. Obviously, he told them, marriage was for people of the opposite sex; it was silly to think otherwise.
Baker, a law student, didn’t agree. He and McConnell, a librarian, had met at a Halloween party in Oklahoma in 1966, shortly after Baker was pushed out of the Air Force for his sexuality. From the beginning, the men were committed to one another. In 1967, Baker proposed that they move in together. McConnell replied that he wanted to get married—really, legally married. The idea struck even Baker as odd at first, but he promised to find a way and decided to go to law school to figure it out.
As he prepares for a presidential run, the governor’s labor legacy deserves inspection. Are his state’s “hardworking taxpayers” any better off?
This past February, at the Conservative Political Action Conference (CPAC) outside Washington, D.C., Wisconsin Governor Scott Walker rolled up his sleeves, clipped on a lavalier microphone, and without the aid of a teleprompter gave the speech of his life. He emerged from that early GOP cattle call as a front-runner for his party’s nomination for president. Numerous polls this spring placed him several points ahead of former Florida Governor Jeb Bush, the preferred candidate of the Republican establishment, in Iowa and New Hampshire. Those same polls showed him with an even more substantial lead over movement conservative favorites such as Ted Cruz, Rand Paul, and Mike Huckabee. In late April, the Koch brothers hinted that Walker would be the likely recipient of the nearly $900 million they plan to spend on the 2016 election cycle.
Mike Huckabee and Ted Cruz are suggesting there might be ways for states and cities to nullify the justices’ ruling. They’re wrong.
The Supreme Court’s decision last week did make gay marriage legal around the nation. Unfortunately for social conservatives, it did not make nullification legal around the nation.
Nullification is the historical idea that states can ignore federal laws, or pass laws that supersede them. This concept has a long but not especially honorable pedigree in U.S. history. Its origins date back to antebellum America, when Southern states tried to nullify tariffs and Northern states tried to nullify fugitive-slave laws. In the 1950s, after Brown v. Board of Education, some Southern states tried to pass laws to avoid integrating schools. It didn’t work, because nullification is not constitutional.
Was the Concorde a triumph of modern engineering, a metaphor for misplaced 20th-century values, or both?
The box sat untouched in his bottom desk drawer. For weeks we discussed opening it, and one January morning he was ready. I set the box on his white bedsheets and removed the stack of passports, which could have belonged to a family with dual citizenship. But all nine—from 1956 to a valid update issued in 2014—belong to my 89-year-old grandfather.
Lying in bed, he unfolded a stamp-covered page like an accordion and held it open above his chest. “Oh my,” he kept repeating. He paused, and pointed.
London. March 22, 1976. My then-50-year-old grandfather, Raymond Pearlson, the inventor of Syncrolift, was traveling the world selling his shiplift system. Concorde had launched commercially that January. He knew exactly what this stamp represented: Washington Dulles to London Heathrow in 3.5 hours—the first of at least 150 supersonic flights he took on the legendary aircraft.
The social network learns more about its users than they might realize.
Facebook, you may have noticed, turned into a rainbow-drenched spectacle following the Supreme Court’s decision Friday that same-sex marriage is a constitutional right.
By overlaying their profile photos with a rainbow filter, Facebook users began celebrating in a way we haven't seen since March 2013, when 3 million people changed their profile images to a red equals sign—the logo of the Human Rights Campaign—as a way to support marriage equality. This time, Facebook provided a simple way to turn profile photos rainbow-colored. More than 1 million people changed their profile in the first few hours, according to the Facebook spokesperson William Nevius, and the number continues to grow.
“This is probably a Facebook experiment!” joked the MIT network scientist Cesar Hidalgo on Facebook yesterday. “This is one Facebook study I want to be included in!” wrote Stacy Blasiola, a communications Ph.D. candidate at the University of Illinois, when she changed her profile.
Many authors have been tempted into writing revisionist histories of the 37th U.S. president, but these counterintuitive takes often do not hold up under closer scrutiny.
Every once in a while someone writes a book arguing that Richard Nixon has been misunderstood. These authors tend to focus on some particular aspect of his presidency that, the argument goes, is more important than that Watergate business. They’ve focused on his domestic policy or his foreign policy as achievements that override his flaws and his presidency’s denouement. Nixon’s highly complex persona also has led to books that probe his psyche—a hazardous and widely debunked practice, though that hasn’t discouraged further attempts.
And, as with other major figures, but all the more so given the drama of his time on the national stage, Nixon’s complexity and essentially low repute tempt some authors to offer revisionist approaches to his place in history. Such approaches have to be assessed on their own merits, not accepted merely because they’re counterintuitive or receive a lot of attention, as new assessments of the controversial and fascinating Nixon tend to do. Two major revisionist books about Nixon argued that his domestic policy was so expansive, humane, and innovative that it overrides his unfortunate behavior; their accounts relegate Watergate to a far less important role. The problem with these books is that they don’t stand up to close scrutiny.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one that its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The commonwealth is facing a serious debt crisis that could result in default, but that’s only part of the problem.
Updated on June 30, 2015
Puerto Rico is a small island with some big financial problems. Governor Alejandro Garcia Padilla recently told The New York Times that there was no way the island, which has been struggling with about $72 billion of debt, would be able to pay, and instead would try to work out new deals and deferred payments with some of its creditors. This, of course, has led to fears that the commonwealth will default on its loans.
The admission that Puerto Rico’s finances are much worse than originally thought was spurred by a report commissioned by the Government Development Bank, an agency tasked with developing economic and financial strategies for the commonwealth, and conducted by current and former IMF staffers. The report, nicknamed The Krueger Plan for its lead author, Anne Krueger, doesn’t mince words when it comes to the outlook for the debt-laden island: “Structural problems, economic shocks and weak public finances have yielded a decade of stagnation, outmigration and debt. Financial markets once looked past these realities but have since cut off the commonwealth from normal market access. A crisis looms.”