Here is something other than The Sequester to think about at the beginning of March:
This month marks ten years since the U.S. launched its invasion of Iraq. In my view this was the biggest strategic error by the United States since at least the end of World War II and perhaps over a much longer period. Vietnam was costlier and more damaging, but also more understandable. As many people have chronicled, the decision to fight in Vietnam was a years-long accretion of step-by-step choices, each of which could be rationalized at the time. Invading Iraq was an unforced, unnecessary decision to risk everything on a "war of choice" whose costs we are still paying.
My reasons for bringing this up:
1) Reckoning. Anyone now age 30 or above should probably reflect on what he or she got right and wrong ten years ago.
I feel I was right in arguing, six months before the war, in "The Fifty-First State," that invading Iraq would bring on a slew of complications and ramifications that would take at least a decade to unwind.
I feel not "wrong" but regretful for having resigned myself even by that point to the certainty that war was coming. We know, now, that within a few days of the 9/11 attacks many members of the Bush Administration had resolved to "go to the source," in Iraq. Here at the magazine, it was because of our resigned certainty about the war that Cullen Murphy, then serving as editor, encouraged me in early 2002 to begin an examination of what invading and occupying Iraq would mean. The resulting article was in our November, 2002 issue; we put it on line in late August in hopes of influencing the debate.
My article didn't come out and say, as bluntly as it could have: We are about to make a terrible mistake that we will regret and should avoid. Instead, I couched the argument as cautionary advice: We know this is coming, and when it does, the results are going to be costly, damaging, and self-defeating, so we should prepare and try to diminish the worst effects (for Iraq and for us). This form of argument reflected my conclusion that the wheels were turning and that there was no way to stop them. Analytically, that was correct: Tony Blair or Colin Powell might conceivably have slowed the momentum, if either of them had turned anti-war in time, but few other people could have. Still, I'd feel better now if I had pushed the argument even harder at the time.
For the record, Michael Kelly, who had been editor of the magazine and was a passionate advocate of the need for war, allowed us to undertake this project and put it on the cover even though he disagreed. Soon thereafter he was in Iraq as an embedded reporter with the 3rd Infantry Division; in an incredible tragedy, he was killed during the invasion's early phase.
2) Accountability. For a decade or more after the Vietnam war, the people who had guided the U.S. to disaster decently shrank from the public stage. Robert McNamara did worthy penance at the World Bank. Rusk, Rostow, and Westmoreland were not out declaiming on what the U.S. should and should not do.
After Iraq, there has been a weird amnesty and amnesia about people's misjudgment on the most consequential decision of our times. Hillary Clinton lost the 2008 primary race largely because she had been "wrong" on Iraq and Barack Obama had been "right." But Cheney, Rumsfeld, Wolfowitz, Bremer, Rice, McCain, Abrams, and others including the pro-war press claque are still offering their judgments unfazed. In his post-presidential reticence, George W. Bush has been an honorable exception.
I don't say these people should never again weigh in. But there should be an asterisk on their views, like the fine print about side effects in pharmaceutical ads.
3) Honor. Say this for Al Gore: He was forthright, he was early, and he was right about Iraq.
4) Liberal hawks. Say this about the "liberal hawk" faction of 2002-2003: unlike, say, Peter Beinart, not enough of them have reckoned with what they got wrong then, or with how hard many of them pushed the "justice" and "duty" of invading, to say nothing of the invasion's feasibility. It would be good to hear from more of them, ten years on.
5) Threat inflation. As I think about this war and others the U.S. has contemplated or entered during my conscious life, I realize how strong the recurrent pattern of threat inflation is. Exactly once in the post-World War II era has the real threat been more ominous than officially portrayed. That was during the Cuban Missile Crisis in 1962, when the world really came within moments of nuclear destruction.
Otherwise: the "missile gap." The Gulf of Tonkin. The overall scale of the Soviet menace. Iraq. In each case, the public soberly received official warnings about the imminent threat. In cold retrospect, those warnings were wrong -- or contrived, or overblown, or misperceived. Official claims about the evils of these systems were many times justified. Claims about imminent threats were most of the times hyped.
Which brings me to:
6) Iran. Most of the people now warning stridently about the threat from Iran warned stridently about Iraq ten years ago. That doesn't prove they are wrong this time too. But it's a factor to be weighed. Most of the technical warnings we are getting about Iran's capabilities are like those we got about Saddam's. That doesn't prove they are wrong again. But it's a factor.
Purportedly authoritative inside reports, replete with technical details about "yellowcake" or aluminum tubes, had an outsized role in convincing people of the threat from Iraq. We wish now that more people had looked harder at those claims. If you'd like to see someone looking hard at similar technical claims about Iran, please check out the Bulletin of the Atomic Scientists, where Yousaf Butt argues that the latest warnings mean less than they seem. Also from the Bulletin: a previous debunking, and a proposal for a negotiated endgame with Iran.
Again: like most of humanity, I can't judge these nuclear-technology arguments myself. But the long history of hyped, crying-wolf warnings, in some cases by the same people now most alarmist about Iran, puts a major burden of proof on those claiming imminent peril.
7) Clarity. I said earlier that I regretted not being more direct and blunt in saying: Don't go into Iraq. For more than eight years, I've tried to argue very directly that a preemptive military strike on Iran would be an enormous mistake on all levels for either Israel or the United States. Strategically, it could only cement in Iranian hostility for the long run. Tactically, every professional soldier -- Israeli, American, or otherwise -- who has examined the practicalities of such a mission has warned that it would be folly.
Lest the soldiers seem too gloomy, several U.S. Senators are working on a resolution committing the U.S. to lend its military and diplomatic support if Prime Minister Netanyahu decides, against the advice of most of his own military establishment, to attack. It would be bad enough if Netanyahu got his own country into this bind; there is no precedent for the U.S. delegating to any ally the decision to commit our troops to an attack, something quite different from NATO-style treaty obligations for mutual defense.
There is more ahead about Israeli, Iranian, and American negotiating strategies, but this is enough for now. It's also as much as I can manage before recovering from the flight from DC to Beijing.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for dude, or its more incredulous cousin, “doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the ’aughts, came to be known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
By continuing to tinker with the universe she built, eight years after the series ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1, 2015, marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but the series’s creator, J.K. Rowling, dutifully took to Twitter to announce what amount to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
Encouraging a focus on white identity is a dangerous approach for a country in which white supremacy has been a toxic force.
Donald Trump and the disaffected white people who make up his base of support have got me thinking about race in America. “Trump presents a choice for the Republican Party about which path to follow,” Ben Domenech writes in an insightful piece at The Federalist: “a path toward a coalition that is broad, classically liberal, and consistent with the party’s history, or a path toward a coalition that is reduced to the narrow interests of identity politics for white people.”
When I was growing up in Republican Orange County during the Reagan and Bush Administrations, lots of white parents sat their kids in front of The Cosby Show, explained that black people are just like white people, and inveighed against judging anyone by the color of their skin rather than the content of their character. The approach didn’t convey the full reality of race as minorities experience it. But it represented a significant generational improvement in race relations.
After a lackluster summer, the famous neurosurgeon is finally surging—but his reliance on the conservative grassroots might be a burden as much as a boon.
The Ben Carson surge that everyone was waiting for is finally here.
The conservative neurosurgeon has been a source of fascination for both the Republican grassroots and the media ever since he critiqued President Obama, who was seated only a few feet away, at the National Prayer Breakfast in 2013. He’s been a steady, if middling, presence in GOP primary polls for most of the year—always earning at least 5 percent, but rarely more than 10. Yet over the last two weeks, Carson has secured a second-place spot after Donald Trump, both nationally and in the crucial opening battleground of Iowa, where he is a favorite of the state’s sizable evangelical community. A Monmouth University poll released this week even showed him tied with Trump for the lead in Iowa, at 23 percent.
According to Franklin, what mattered in business was humility, restraint, and discipline. But today’s Type-A MBAs would find him qualified for little more than a career in middle management.
When he retired from the printing business at the age of 42, Benjamin Franklin set his sights on becoming what he called a “Man of Leisure.” To modern ears, that title might suggest Franklin aimed to spend his autumn years sleeping in or stopping by the tavern, but to colonial contemporaries, it would have intimated aristocratic pretension. A “Man of Leisure” was typically a member of the landed elite, someone who spent his days fox hunting and affecting boredom. He didn’t have to work for a living, and, frankly, he wouldn’t dream of doing so.
Having worked as a successful shopkeeper with a keen eye for investments, Franklin had earned his leisure, but rather than cultivate the fine arts of indolence, he declared retirement “time for doing something useful.” Hence the many roles of Franklin’s retirement: scientist, statesman, and sage, as well as one-man civic society for the city of Philadelphia. His post-employment accomplishments earned him the sobriquet “The First American” in his own lifetime, and yet, for succeeding generations, the endeavor considered his most “useful” was the working life he left behind when he embarked on a life of leisure.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
An image of a small child evokes an unfathomably huge tragedy.
I had just dropped my son off at daycare when I opened Twitter and came across a photo that, over the next 24 hours, would become a totem of the refugee crisis in Europe and the Middle East, and of the blight that is the Syrian civil war. The picture would quickly reappear as an earnest social-media meme, inserted into scenes of dithering UN officials and gatherings of unfeeling Arab leaders: a small Syrian boy in a red shirt, blue shorts, and worn shoes, lying face down in wet sand, his head cocked to one side along a gray, glistening shoreline, his lifeless hands cupped upwards, his knees slightly bent.
My first reaction was despair. My second was: My son sleeps just like that.
The attention this photo has received has generated discomfort as well as indignation—for understandable reasons. There are important ethical questions surrounding the taking or sharing of photos of children, dead or alive, in the media, including questions about the intent of the sharers and the consent of the subject. The scale of the Syrian tragedy is orders of magnitude greater, and infinitely more variegated, than this one picture, or this one victim’s story, can possibly convey. Over the last four and a half years, an estimated 240,000 people have died in the grinding violence, including nearly 12,000 children. More than half of Syria’s pre-war population—half, the proportional equivalent of nearly 170 million Americans—has been forced to flee their homes, spawning the largest exodus of refugees in a generation. Seven hundred and fifty thousand Syrian children won’t be going back to school this fall.
By reorienting the GOP’s foreign-policy debate away from the Middle East, the flamboyant frontrunner took the pact off the front page.
Next week, Donald Trump will join Ted Cruz, Glenn Beck, and others at a rally denouncing the Iran deal. Which is ironic, because Trump is one reason the deal will pass.
Before Trump entered the campaign, foreign policy dominated the Republican presidential race. With Democrats less vulnerable on the economy, and the public growing more progressive on cultural issues like gay marriage, drugs, and crime, the GOP candidates refocused on America’s supposedly collapsing position in the world. As The New York Times reported in February, “Gruesome killings by the Islamic State, terrorist attacks in Europe and tensions with President Vladimir V. Putin of Russia are reshaping the early Republican presidential race, creating anxiety among party voters and sending potential candidates scrambling to outmuscle one another on foreign policy.”
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”