Once our soldiers leave the theater, all that will remain is a clinical and codified policy of assassination writ large
A man carries a bag over his shoulder as he pulls a suitcase in Kabul
On Wednesday, the New York Times reported that Pakistani officials, eyeing President Obama's spurious timeline for withdrawal from Afghanistan, are "watching as the war, in their view, goes badly and are waiting for their share of the Afghan spoils." The report added that Pakistan's generals and spymasters "appear to have little incentive to bargain away their demands or to modify their side of the ledger," confident that the president lacks the political will to see the war through. In December, U.S. forces will begin withdrawing from Afghanistan. The combat troops deployed as part of the surge will come home in September 2012. If there is a strategic military reason for that particular date, David Petraeus is unaware of it. David Axelrod might have a keener insight on the matter.
Last month, Stanley McChrystal told the Council on Foreign Relations that we're just over the 50 percent mark in Afghanistan. The retired general noted that where we're providing security, "The change has been stunning. The ability to move crops around, the ability to apply governance and whatnot, has been good." But that requires boots on the ground and men with rifles. Where the Coalition footprint is light, meanwhile, the Taliban "campaign of assassination is terrifying to people, because it makes everyone feel under threat." During his recent confirmation hearings to take the helm at CIA, General Petraeus called the president's withdrawal plan "a more aggressive formulation, if you will, in terms of the timeline than what we had recommended." In Petraeus-speak, this was the equivalent of banging his shoe on the table.
Ten years ago, who would have thought that victory in Afghanistan meant luring the Taliban to the bargaining table? And who would have been surprised when the Taliban then assassinated our proxy negotiator? (There's no need to reach back ten years; in 2010, the Taliban said point blank that they intended to kill members of the High Peace Council.) With the military security option all but exhausted (and thus unavailable to support the remarkable work of civil affairs teams), and diplomacy a hopeless endeavor, the United States and Afghanistan can now look forward to an eternity of Predator drones primed with Hellfire missiles.
It would be hard to improve on essays by Jane Mayer and Conor Friedersdorf on the immorality of drone warfare. But drone warfare is what we're left with. Sherman famously said, "There's many a boy here today who looks on war as all glory but it is all hell." Small communities know that hell and reel when their sons become men, become infantrymen, and never return from third world wastelands. Military spouses know that hell when chaplains in Class A uniforms knock at the door, hats in hand. Combat veterans know that hell better than anyone. And collectively -- oftentimes tragically -- the results of war inform our culture and serve as society's most effective moderating influence. There are many good reasons to go to war, but when we don't, it's often because we know how terrible a thing it is.
Humanity can be found and understood in the best and worst of war. But drones change the equation. It's the worst kind of war, a frightening new enterprise that we've embraced, celebrate, and laugh about. But there's something dishonorable about it. It's the aerial equivalent of roadside IEDs. It's the only kind of war America seems willing to fight anymore, and that is what we're leaving behind in Afghanistan. To be clear, "fairness" should never be an objective of war. But almost by definition, this is not war. Once our soldiers leave the theater, all that will remain is a clinical and codified policy of assassination writ large, with virtually no public scrutiny. It won't be front-page news when drones vaporize innocents, and it won't be front-page news when drones vaporize al-Qaeda operatives, because we've got no skin in the game. It's just robots hunting ghosts.
How long will Afghans agree to that? Are we even asking? Or will this silent non-war be negotiated with our man in Kabul, who was deemed corrupt and incompetent until he became convenient to this administration? And how long will Pakistan allow missiles to materialize from nowhere and leave behind craters and corpses? How about the next government, and what are we prepared to do if it says no? The White House has established a precedent that borders are just fine for the people at Rand McNally, but meaningless in the context of drone warfare. Consent of the Congress is a quaint relic; as Libya proved, the president doesn't need authorization so long as we get a nice snuff film at the end.
Afghanistan is a war worth seeing through. Last week, I spoke with Michael Yon, a writer who's spent four years, cumulatively, in Iraq and Afghanistan -- three of those in combat. According to Yon, as withdrawal moves from concept to reality, "Many troops see their actions will be for naught. They've done their parts and have succeeded when properly resourced, but they see the presidential decisions for what they are. The unit that I last embedded with, 4-4 Cav, was clearly making progress and they know it, but they also see the light at the end of the tunnel is turned off, and that's due to politics. We waited a long time to get serious here, and never got totally serious."
At any rate, says Yon, "The war is largely forgotten. Soldiers who have been going back on leave are shocked when many Americans don't realize that there is a no-kidding war going on here. I've done my best to highlight some of them." He adds, "The trajectory of the war favors the enemies. If the president precipitously reduces our footprint, the war will be lost. The good news (for somebody) is that most Americans don't seem to realize that we are still in a war, so they won't realize that we lost."
But at least we fought a war that could be forgotten. As America turns to drone technology, more than ever we will be fighting wars we never knew about in the first place.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
In 12 of 16 past cases in which a rising power has confronted a ruling power, the result has been bloodshed.
When Barack Obama meets this week with Xi Jinping during the Chinese president’s first state visit to America, one item probably won’t be on their agenda: the possibility that the United States and China could find themselves at war in the next decade. In policy circles, this appears as unlikely as it would be unwise.
And yet 100 years on, World War I offers a sobering reminder of man’s capacity for folly. When we say that war is “inconceivable,” is this a statement about what is possible in the world—or only about what our limited minds can conceive? In 1914, few could imagine slaughter on a scale that demanded a new category: world war. When war ended four years later, Europe lay in ruins: the kaiser gone, the Austro-Hungarian Empire dissolved, the Russian tsar overthrown by the Bolsheviks, France bled for a generation, and England shorn of its youth and treasure. A millennium in which Europe had been the political center of the world came to a crashing halt.
Life in Ohio's proud but economically abandoned small towns
Just over a decade ago, Matt Eich started photographing rural Ohio. Eich found himself drawn to the proud but economically abandoned small towns of Appalachia, largely inhabited by what is now known as the “Forgotten Class” of white, blue-collar workers. Thanks to grants from the Economic Hardship Reporting Project and Getty Images, Eich was able to capture the family life, drug abuse, poverty, and listlessness of these communities. “Long before Trump was a player on the political scene, long before he was a Republican, these people existed and these problems existed,” Eich said. His new book, Carry Me Ohio, published by Strum and Drang, is a collection of these images and the first of four books he plans to publish as part of The Invisible Yoke, a photographic meditation on the American condition. Even with a deep knowledge of the region, Eich was unprepared for the fury and energy that surrounded the election this year. “The anger is overpowering,” he said. “I knew what was going on, and I’m still surprised. I should have listened to the pictures.”
Universities themselves may be contributing to burnout.
With half of all doctoral students leaving graduate school without finishing, something significant and overwhelming must be happening for at least some of them during the process of obtaining that degree. Mental illness is often offered as the standard rationale to explain why some graduate students burn out. Some research has suggested a link between intelligence and conditions such as bipolar disorder, leading some observers to believe many graduate students struggle with mental-health problems that predispose them to burning out.
But such research is debatable, and surely not every student who drops out has a history of mental illness. So, what compels students to abandon their path to a Ph.D.? Could there be other underlying factors, perhaps environmental, that can cause an otherwise-mentally-healthy graduate student to become anxious, depressed, suicidal, or, in rare cases, violent?
President-elect Donald Trump has committed a sharp breach of protocol—one that underscores just how weird some important protocols are.
Updated on December 2 at 7:49 p.m.
It’s hardly remembered now, having been overshadowed a few months later on September 11, but the George W. Bush administration’s first foreign-policy crisis came in the South China Sea. On April 1, 2001, a U.S. Navy surveillance plane collided with a Chinese jet near Hainan Island. The pilot of the Chinese jet was killed, and the American plane was forced to land and its crew was held hostage for 11 days, until a diplomatic agreement was worked out. Sino-American relations remained tense for some time.
Unlike Bush, Donald Trump didn’t need to wait to be inaugurated to set off a crisis in the relationship. He managed that on Friday, with a phone call to the president of Taiwan, Tsai Ing-wen. It’s a sharp breach with protocol, but it’s also just the sort that underscores how weird and incomprehensible some important protocols are.
Comedy-drama series like Fleabag and Transparent show how vulnerability is as important as unlikeability and strength when it comes to portraying fictional women.
In the first episode of the HBO series Enlightened, the show’s heroine, Amy Jellicoe, learns that she’s been fired. She does not take the news well. Within minutes, she goes from pitiable victim, sobbing abjectly in a bathroom stall, to mascara-streaked fury. “Go back to your sad, fucking, little desk,” she sneers at her assistant before tracking her ex-lover and presumed betrayer to the office lobby. “I will destroy you—I will bury you—I will kill you, motherfucker!” she screams at him through the elevator doors that she somehow, in a feat of desperation, manages to pry open.
Though the scene aired five years ago, it’s still a pretty radical few minutes of television, and not just because of the ferocity of Laura Dern’s performance. What feels most striking is the series’ willingness to dramatize an extended scene of female distress for something other than a moralizing end. In this sense, Enlightened anticipates the Amazon series Fleabag, which evinces a similar empathy toward a female character in the grip of powerfully negative emotions: anger, sadness, grief, self-doubt, shame. It’s probably no accident the two shows have almost identical promotional stills—close-ups of their protagonists’ makeup-smudged faces, staring directly into the camera. Like a number of other female-centric, female-created tragicomedies to have emerged on TV in recent years—Transparent, Girls, Catastrophe, Insecure—the series also share a commitment to more compassionate portrayals of dysfunctional heroines, suspending judgment even (or especially) when they’re at their worst.
Switching pastas and breads is a small decision that could save lives.
Multigrain is a genius approach to selling both white bread and righteousness. The term crept under the umbrella of health quietly. It wasn’t clear why, exactly. (The grain part? Or the multi?) At least it wasn’t white bread, right?
As many eaters of bread came to understand that white bread is the nutritional equivalent of Pixy Stix—the nutritious, fibrous shell of the wheat having been removed, leaving us with only the inner starch, which our bodies almost instantly turn into sugar—white bread needed some rebranding.
Multigrain is now often used to imply wholesomeness, a virtue to which it often has no claim. Containing the flour of multiple grains does not mean containing whole grains. When millers leave the grain intact before milling, this is whole grain flour. It contains fiber, appeasing the pancreas and microbes that demand it for optimal performance. So, the term to look for is 100 percent whole wheat. (Or wholegrain, though the grain is usually wheat.)
It’s not because they’re inherently harsher leaders than men, but because they often respond to sexism by trying to distance themselves from other women.
There are two dominant cultural ideas about the role women play in helping other women advance at work, and they are seemingly at odds: the Righteous Woman and the Queen Bee.
The Righteous Woman is an ideal, a belief that women have a distinct moral obligation to have one another’s backs. This kind of sentiment is best typified by Madeleine Albright’s now famous quote, “There is a special place in hell for women who don’t help each other!” The basic idea is that since all women experience sexism, they should be more attuned to the gendered barriers that other women face. In turn, this heightened awareness should lead women to foster alliances and actively support one another. If women don’t help each other, this is an even worse form of betrayal than those committed by men. And hence, the special place in hell reserved for those women.
How much do you really need to say to put a sentence together?
Just as fish presumably don’t know they’re wet, many English speakers don’t know that the way their language works is just one of endless ways it could have come out. It’s easy to think that what one’s native language puts words to, and how, reflects the fundamentals of reality.
But languages are strikingly different in the level of detail they require a speaker to provide in order to put a sentence together. In English, for example, here’s a simple sentence that comes to my mind for rather specific reasons related to having small children: “The father said ‘Come here!’” This statement specifies that there is a father, that he conducted the action of speaking in the past, and that he indicated the child should approach him at the location “here.” What else would a language need to do?
A century ago, millions of Americans banded together in defense of white, Christian America and traditional morality—and most of their compatriots turned a blind eye to the Ku Klux Klan.
On August 8, 1925, more than 50,000 members of the Ku Klux Klan paraded through Washington, D.C. Some walked in lines as wide as 20 abreast, while others created formations of the letter K or a Christian cross. A few rode on horseback. Many held American flags. Men and women alike, the marchers carried banners emblazoned with the names of their home states or local chapters, and their procession lasted for more than three hours down a Pennsylvania Avenue lined with spectators. National leaders of the organization were resplendent in colorful satin robes and the rank and file wore white, their regalia adorned with a circular red patch containing a cross with a drop of blood at its center.
Nearly all of the marchers wore pointed hoods, but their faces were clearly visible. In part, that was because officials would sanction the parade only if participants agreed to walk unmasked. But a mask was not really necessary, as most members of the Klan saw little reason to hide their faces. After all, there were millions of them in the United States.