The war against al-Qaeda is over, but continuing to fight terrorism will require understanding what we did that worked - and what didn't
New York police stand near a wanted poster for Osama bin Laden, in this file photo from September 18, 2001 / Reuters
Ten years into our struggle against al-Qaeda, it's time to acknowledge that the "war" is over and recognize that the United States and its international partners overreacted to the al-Qaeda threat. Terrorism, after all, is designed to elicit such overreactions. But the confluence of the recent death of bin Laden, harsh new economic realities, the democratic movements in the Middle East, and the ten-year anniversary of the September 11 attacks provides an ideal time to take stock of what it actually takes to deal with the al-Qaeda threat.
The Failure of Al-Qaeda
The immediate physical threat posed by al-Qaeda has diminished greatly over the past ten years. The elimination of Osama bin Laden -- a long-overdue counterterrorism triumph -- and the relentless dismantling of al-Qaeda's senior leadership in their Pakistani sanctuaries and redoubts are obvious but powerful signs of the enterprise's darkening prospects. The recent death of one of al-Qaeda's most capable and influential senior leaders, Abu Abd al-Rahman Atiyyatallah, in an alleged U.S. drone attack in Pakistan, will only hasten its leadership's collapse.
More important, al-Qaeda has failed utterly in its efforts to achieve one of its paramount political objectives. From the 19th century through the present day, terrorists and insurgents -- from transatlantic anarchists to Fanonists of the tiers monde to Nepalese Maoists -- have spun insurrectionist fantasies of taking over. But the Salafist-jihadists' worldwide Islamic uprising, against perceived enemies of the faith, never materialized. The Muslim masses have refused to play their part in the al-Qaeda dramaturgy. The terrorism intended to generate widespread rebellion has failed to arouse a global Muslim community. Most damningly, al-Qaeda has been irrelevant to the popular uprisings sweeping the heartland of the Muslim world.
Rethinking How We Fight Terrorism
In recognizing al-Qaeda's failures and weaknesses, we should reevaluate the political, military, economic, and other instruments the United States wields against terrorism. Three of these methods need particular scrutiny.
The first is social and economic development. It might be useful in dealing with large-scale insurgencies, but development is unlikely to address the idiosyncratic motives of the small number of people who join terrorist groups. It's true that addressing the "root causes" of terrorism sounds like a sensible, systemic course of action, but few truly agree on what those causes are -- nor is there anything like a consensus on which measures are likely to prove most effective.
The second questionable tool is one used as part of a broader set of information operations: positive messaging about the United States. There are excellent reasons to pursue public diplomacy, but countering terrorism is not one of them. The young people who are vulnerable to al-Qaeda's recruitment pitches are likely to be impervious to positive messages about the United States. In addition, linking public diplomacy with counterterrorism risks alienating intended audiences, which can easily detect the fear and hidden agenda lurking behind the friendly American smile. The United States needs to dissuade people from attacking its citizens -- but those people do not need to like the United States in order to abandon violence.
The third tool to drop is the one with which we've had the least success: occupying the country from which a terrorist group is attempting to recruit. There might be good reasons to invade and occupy a country, but eliminating a terrorist group is not one of them. Occupation only engenders new recruits for the terrorists' cause and provides them a fertile training ground. Moreover, it plays into al-Qaeda's openly professed strategy of bleeding U.S. resources in order to force the United States to reduce its influence in the Middle East.
What Works in Counterterrorism
What's left in the counter-terrorist's toolkit? Most of the significant advances against al-Qaeda and its fellow travelers over the last ten years have come as a consequence of intelligence gathering, good policing, spreading the awful truth about al-Qaeda, and helping other governments do these same things. These are not ancillary to counterterrorism but rather its essential components.
Violent operations against al-Qaeda have garnered most of the public's attention. But, in terms of preventing terrorist attacks, the most powerful weapon has been decidedly unglamorous and much less visible: police work informed by well-placed sources inside terrorist cells. Major plots in New York, London, Stockholm, and other key urban centers have been foiled by police, often working in unison with intelligence services. Assisting foreign police forces should be a major component of the U.S. counterterrorism repertoire -- but such aid is limited by considerable restrictions from Congress and a lack of skilled police trainers able and willing to work abroad.
Eliminating terrorist networks is not enough. They also have to be discredited among the audiences they seek to influence. Although it is true that al-Qaeda has done much to discredit itself through its doctrinal and operational excesses -- killing civilians, attacking places of worship, targeting fellow Muslims -- the U.S. and its allies have done an excellent job of magnifying those excesses. Two effective techniques have been releasing private correspondence between al-Qaeda's senior leaders, which is rarely flattering, and quietly pointing the media to evidence that al-Qaeda does not represent the aspirations of the vast majority of Muslims.
Not only has the U.S. become adept at using these tools, it has also been skillful in showing others how to use them. For example, Indonesia, once a fertile ground for militant Islamist activity, is now a counterterrorism success story because of these efforts.
Given the considerable damage that "kinetic" military operations have reportedly done to al-Qaeda, military and paramilitary force should obviously remain an important part of the counterterrorist arsenal. But it should be reserved only for killing the most senior leaders and operatives in a terrorist organization -- those whose skills are most lethal and most difficult to replace -- and only when local security forces are unable or unwilling to take appropriate action. This does not require occupying a country, but rather cultivating local allies and spending money to develop intelligence networks.
The War is Over
There will inevitably one day be another large attack on American soil, and the U.S. government will inevitably overreact. That is the response terrorism is designed to elicit, and the United States, because its safety and isolation make terrorism feel so horrifying, is particularly susceptible to such a response. But if Washington can use this 10-year landmark to throw out the counterterrorism tools that haven't worked and to sharpen the ones that have, the negative consequences of that overreaction will be minimal. If not, the United States will have drawn the wrong lessons from the last ten years, obliging its terrorist enemies by repeating its worst mistakes.
A new film details the reason the star postponed her recent tour—and will test cultural attitudes about gender, pain, and pop.
“Pain without a cause is pain we can’t trust,” the author Leslie Jamison wrote in 2014. “We assume it’s been chosen or fabricated.”
Jamison’s essay “Grand Unified Theory of Female Pain” unpacked the suffering-woman archetype, which encompasses literature’s broken hearts (Anna Karenina, Miss Havisham) and society’s sad girls—the depressed, the anorexic, and in the 19th century, the tubercular. Wariness about being defined by suffering, she argued, had led many modern women to adopt a new pose. She wrote, “The post-wounded woman conducts herself as if preempting certain accusations: Don’t cry too loud; don’t play victim.” Jamison questioned whether this was an overcorrection. “The possibility of fetishizing pain is no reason to stop representing it,” she wrote. “Pain that gets performed is still pain.”
Girls in the Middle East do better than boys in school by a greater margin than almost anywhere else in the world: a case study in motivation, mixed messages, and the condition of boys everywhere.
Jordan has never had a female minister of education, women make up less than a fifth of its workforce, and women hold just 4 percent of board seats at public companies there. But, in school, Jordanian girls are crushing their male peers. The nation’s girls outperform its boys in just about every subject and at every age level. At the University of Jordan, the country’s largest university, women outnumber men by a ratio of two to one—and earn higher grades in math, engineering, computer-information systems, and a range of other subjects.
In fact, across the Arab world, women now earn more science degrees on a percentage basis than women in the United States. In Saudi Arabia alone, women earn half of all science degrees. And yet, most of those women are unlikely to put their degrees to paid use for very long.
What feels like information overload reveals how little the public actually knows about the probe's findings.
Robert Mueller has stayed busy with his special-counsel investigation all summer, but the rest of Washington took a vacation. And since most information about Mueller’s actions seems to come from leaks outside the Mueller team, that meant there was a stretch of relative silence.
But the lull is over now. The month of September, and particularly the last week, has seen a torrent of new revelations about Mueller's investigation. The fresh information gives the most complete view yet of what Mueller is up to and where he might be focusing, in particular on Paul Manafort, who chaired Donald Trump's presidential campaign during the summer of 2016. Yet even as these revelations suggest the direction in which the probe is headed at the moment, they don't offer much insight into the ultimate questions of when Mueller might wrap up and what, if any, charges he might bring or recommend. So where does that leave things?
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
Long after research contradicts common medical practices, patients continue to demand them and physicians continue to deliver. The result is an epidemic of unnecessary and unhelpful treatments.
First, listen to the story with the happy ending: At 61, the executive was in excellent health. His blood pressure was a bit high, but everything else looked good, and he exercised regularly. Then he had a scare. He went for a brisk post-lunch walk on a cool winter day, and his chest began to hurt. Back inside his office, he sat down, and the pain disappeared as quickly as it had come.
That night, he thought more about it: middle-aged man, high blood pressure, stressful job, chest discomfort. The next day, he went to a local emergency department. Doctors determined that the man had not suffered a heart attack and that the electrical activity of his heart was completely normal. All signs suggested that the executive had stable angina—chest pain that occurs when the heart muscle is getting less blood-borne oxygen than it needs, often because an artery is partially blocked.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
What J.R.R. Tolkien’s classic The Hobbit still has to offer, 80 years after its publication
“In a hole in the ground there lived a hobbit.” So began the legendarium that dominated a genre, changed Western literature and the field of linguistics, created a tapestry of characters and mythology that endured four generations, built an anti-war ethos that endured a World War and a Cold War, and spawned a multibillion-dollar media franchise. J.R.R. Tolkien’s work is probably best remembered today by the sword-and-sandal epic scale of The Lord of The Rings films, but it started in the quiet, fictionalized English countryside of the Shire. It started, 80 years ago in a hobbit-hole, with Bilbo Baggins.
Although Tolkien created the complicated cosmological sprawl of The Silmarillion and stories like the incestuous saga of Túrin Turambar told in The Children of Húrin, Middle-earth itself is mostly remembered today as something akin to little Bilbo in his Hobbit-hole: quaint, virtuous, and tidy. Nowadays, George R.R. Martin’s got the market cornered on heavily initialed fantasy writers, and his hand guides the field. High and epic fantasy are often expected to dip heavily into the medieval muck of realism, to contain heavy doses of sex and curses, gore and grime, sickness and believable motives and set pieces. Characters like Martin’s mercenary Bronn of the Blackwater are expected to say “fuck,” and to like fucking. Modern stories, even when set in lands like A Song of Ice and Fire’s Essos that are filled with competing faiths, tend toward the nihilist, and mostly atheist. Heavenly beings are denuded of potency and purity; while the gods may not be dead, divinity certainly is.
Physicians rarely agree on anything as strongly as they do that the Graham-Cassidy health-care bill is harmful.
It used to be that when a doctor gave a confident recommendation, patients trusted it. A skeptical person might seek a second opinion, or a third. When they all agreed, the best course seemed clear.
Today, America’s major physician organizations are recommending something, strongly and in unison: The latest health-care bill, known as Graham-Cassidy, would do harm to the country and should be defeated.
Coalitions of health professionals that have spoken publicly against the measure so far include the American Medical Association (“Provisions violate longstanding AMA policy”), the American Psychiatric Association (“This bill harms our most vulnerable patients”), the American Public Health Association (“Graham-Cassidy would devastate the Medicaid program, increase out-of-pocket costs, and weaken or eliminate protections for people living with preexisting conditions”), the National Institute for Reproductive Health (“the Graham-Cassidy bill preys on underserved communities ... a clear and present danger”), and the Federation of American Hospitals (“It could disrupt access to health care for millions of the more than 70 million Americans”).
I have been studying the French language, with some consistency, for three years. This field of study has been, all at once, the hardest and most rewarding of my life. I would put it above the study of writing simply because I started writing as a 6-year-old boy under my mother's tutelage. I always "felt" I could write. I did not always "feel" I could effectively study a foreign language.
But here I am, right now, in a Montreal hotel. I spoke French at the border. I spoke French when I checked in. I spoke French when I went to get lunch. I don't really believe in fluency. If there is such a thing, I don't have it. I mishear words. I confuse tenses. I can't really use the subjunctive. Yet.
Something has happened to me and the something is this—I have gotten better. I don't know when I first felt it. I didn't feel it this summer at Middlebury, despite the difference in my entrance and exit scores. I didn't feel it when I first arrived in Paris in January. I felt, as I always feel, like I was stumbling around in the dark. I still feel like that. But I also feel like I am getting better at stumbling.
Its faith-based 12-step program dominates treatment in the United States. But researchers have debunked central tenets of AA doctrine and found dozens of other treatments more effective.
J.G. is a lawyer in his early 30s. He’s a fast talker and has the lean, sinewy build of a distance runner. His choice of profession seems preordained, as he speaks in fully formed paragraphs, his thoughts organized by topic sentences. He’s also a worrier—a big one—who for years used alcohol to soothe his anxiety.
J.G. started drinking at 15, when he and a friend experimented in his parents’ liquor cabinet. He favored gin and whiskey but drank whatever he thought his parents would miss the least. He discovered beer, too, and loved the earthy, bitter taste on his tongue when he took his first cold sip.
His drinking increased through college and into law school. He could, and occasionally did, pull back, going cold turkey for weeks at a time. But nothing quieted his anxious mind like booze, and when he didn’t drink, he didn’t sleep. After four or six weeks dry, he’d be back at the liquor store.