Addressing the war's failings means talking about policy, but before we do that, a reminder of why it matters.
An Afghan policeman stands guard after a street battle in Kabul last April. (Reuters).
For all that's been written about Afghanistan and the U.S.-led war, there's one big question that still looms: how did we get where we are today? As part of an effort to think this through, I have a new paper at the American Security Project. Its premise is to, as I write in the introduction, "establish a framework for understanding why the Afghanistan war is in
the state it is in, and how policymakers can avoid making similar
missteps in the future." But there's one important thing to stress beyond what we got wrong or how to fix it: why it's so important to address at all. And the answer is: lives.
The overwhelming majority of articles about how "costly" the war in Afghanistan has been focus on money, which is fine as far as it goes, because we have spent a great deal of money in Afghanistan and received very little for it.
But there is another cost that matters even more: lives. The go-to source for understanding how many have died in Afghanistan is iCasualties.org, where the count of coalition soldiers killed stands at just over 3,000 right now. But iCasualties counts only soldiers -- thousands of others have died in service to the war in Afghanistan.
When we include contractor deaths -- 2,800, according to a July 12 report in Bloomberg Government by Barry McGarry -- the number of coalition dead soars to almost 6,000.
Notably, no one compiles a comprehensive dataset of how many Afghan soldiers and policemen have been killed during the last 10 years. Wikipedia comes close, though its count is only current as of last summer. Even by that outdated figure, more than 10,000 Afghan soldiers and policemen have been killed since 2003.
By most rough estimates, about 30,000 coalition soldiers and civilian contractors have been wounded over the same period. An unknown number of Afghan soldiers and policemen have been wounded as well, though we can safely assume it is in the thousands (the Special Inspector General for Afghanistan Reconstruction estimates about 3,000 were wounded between 2007 and 2011).
As for civilians, The Guardian recently estimated just over 8,000 Afghan civilians were killed in combat between 2006 and 2011. The UN estimates more than 3,000 died in 2011 alone. There are no reliable counts before then, and afterward, the U.S.-led coalition force and the UN present widely different estimates. There are no overall estimates of civilian wounded.
This framing of cost is critical to understanding why we need to explore what else has gone wrong. There is an argument to be made that 16,000 or more dead -- roughly 6,000 coalition soldiers and contractors, plus more than 10,000 Afghan soldiers and policemen -- is an acceptable loss over ten years of war, and that almost 10,000 dead civilians is also relatively low by historic standards. But such an argument would miss the point: while the number of dead matters (and is high no matter how you examine it), the fact that the dead keep coming, month after month, year after year, matters on its own.
This doesn't immediately help us understand what's gone wrong, but it does help us frame the discussion and get a sense of the scale of the problem. That's not a policy guideline of the sort you'll find in my report, if you care to read it, but it is the reason I wrote it in the first place.
If the party cares about winning, it needs to learn how to appeal to the white working class.
The strategy was simple. A demographic wave—long-building, still-building—would carry the party to victory, and liberalism to generational advantage. The wave was inevitable, unstoppable. It would not crest for many years, and in the meantime, there would be losses—losses in the midterms and in special elections; in statehouses and in districts and counties and municipalities outside major cities. Losses in places and elections where the white vote was especially strong.
But the presidency could offset these losses. Every four years the wave would swell, receding again thereafter but coming back in the next presidential cycle, higher, higher. The strategy was simple. The presidency was everything.
Trinity Lutheran v. Comer finds that governments can’t discriminate against churches that would otherwise qualify for funding just because of their religious nature.
The Supreme Court ruled on Monday that the state of Missouri cannot deny public funds to a church simply because it is a religious organization.
Seven justices affirmed the judgment in Trinity Lutheran v. Comer, albeit with some disagreement about the reasoning behind it. The major church-state case could potentially expand the legal understanding of the free-exercise clause of the First Amendment of the U.S. Constitution. It is also the first time the Supreme Court has ruled that governments must provide money directly to a house of worship, which could have implications for future policy fights—including funding for private religious schools.
Trinity Lutheran is a big case that hinges on mundane facts. In 2012, when Trinity Lutheran Church in Missouri applied for a state grant to resurface its playground, it was ranked as a strong potential candidate for the program. Ultimately, though, Missouri denied the funding under a state constitutional provision that prohibits public money from going to religious organizations and houses of worship. “There is no question that Trinity Lutheran was denied a grant simply because of what it is,” wrote Chief Justice John Roberts in his decision for the majority. “A church.”
The Supreme Court announced Monday it will review the president’s controversial executive order next term. But in the meantime, the administration can enforce some of its provisions.
The U.S. Supreme Court agreed on Monday to review a series of lower-court rulings blocking the Trump administration’s controversial travel ban, setting up a major showdown over presidential power and religious discrimination.
In an unsigned order issued on the Court’s last day before its summer recess, the justices scheduled oral arguments in the case for when they return in October. They also partially lifted the lower courts’ injunctions against Section 2(c) of President Trump’s executive order, which temporarily suspended visa applications from six Muslim-majority countries, as well as Section 6, which froze the U.S. Refugee Admissions Program and halted refugee entry into the United States.
The South Coast, a 30-mile drive from Palo Alto, is facing an affordable-housing shortage that is jeopardizing its agricultural heritage.
On the drive up the coast from the southernmost part of Northern California’s San Mateo County, Highway 1’s two lanes are surrounded by wind-whipped seas on one side and redwood forests on the other. The landscape is dotted with wild yellow mustard in the spring and pumpkins in the fall. A popular place for day-trippers to picnic, go wine-tasting, and shop at roadside farm stands, the region—affectionately nicknamed “the Slowcoast” for its unhurried pace—is a balm to the busyness nearby in Silicon Valley, to the east, and San Francisco, to the north.
Home to fewer than 3,000 people, the South Coast is the least densely populated part of the Bay Area. While it feels like a region unto itself, it is part of San Mateo County, which is where—just over the Santa Cruz Mountains—several big tech companies, such as Facebook and Oracle, are based. South of those firms’ campuses (in Santa Clara County) are the well-known tech hubs of Mountain View, Cupertino, and Palo Alto. San Mateo County is also the home of some of the wealthiest tech executives: The city of Atherton, about a 30-mile drive from the South Coast, was, according to Forbes, the country’s most expensive zip code in 2015 and the third-most expensive in 2016. The countywide median price for a single-family home reached $1.2 million last year.
Let’s first acknowledge that Gchat was never officially called Gchat. When the service launched in February 2006, Google named it Google Talk, refusing to refer to it by its colloquial name. For anyone mourning its demise, which the company announced in a March blog post, that name sounds awkward, like it’s describing something else. To me, and to many other users, it’s Gchat, and always will be.
The brilliance of Gchat was that it allowed you to instant message any Gmail user within a web browser, instead of using a separate application. This attribute was a lifeline for those of us who, a decade ago, were online all day at our entry-level jobs in open offices, every move tracked on computers that required admin access to download new software, with supervisors who could appear behind you at any time. You could open a separate browser window or a single tab, keeping Gchat running in the background as you ostensibly worked on projects aside from the dramas of your personal life.
The GOP planned a dynastic restoration in 2016. Instead, it triggered an internal class war. Can the party reconcile the demands of its donors with the interests of its rank and file?
The angriest and most pessimistic people in America aren’t the hipster protesters who flitted in and out of Occupy Wall Street. They aren’t the hashtavists of #BlackLivesMatter. They aren’t the remnants of the American labor movement or the savvy young dreamers who confront politicians with their American accents and un-American legal status.
The angriest and most pessimistic people in America are the people we used to call Middle Americans. Middle-class and middle-aged; not rich and not poor; people who are irked when asked to press 1 for English, and who wonder how white male became an accusation rather than a description.
You can measure their pessimism in polls that ask about their expectations for their lives—and for those of their children. On both counts, whites without a college degree express the bleakest view. You can see the effects of their despair in the new statistics describing horrifying rates of suicide and substance-abuse fatality among this same group, in middle age.
The president may be overstating the gang’s impact.
As President Trump sat for Time’s Person of the Year interview last year, he excused himself and returned with a copy of Newsday. He wanted to show editor Michael Scherer a headline. “‘EXTREMELY VIOLENT’ GANG FACTION,” it read, and the article told of murders in Suffolk County, New York, all linked to MS-13. One murder was that of 16-year-old Kayla Cuevas, who’d argued with MS-13 members at her high school. The gang, many of them also teenagers, found Cuevas and a friend walking along the street and beat them with baseball bats and hacked at them with machetes. “They come from Central America,” Trump said to Scherer. “They’re tougher than any people you’ve ever met. They’re killing and raping everybody out there. They’re illegal.”
The quality and variety of food in the U.S. has never been better. The business seems to be struggling. What’s really going on?
For restaurants in America, it is the best of times, and it is the worst of times.
Last century’s dystopians imagined that mediocre fast-food chains would take over every square inch of the country. But in cities across the U.S., residents are claiming that the local restaurant scene is in a golden age of variety and quality. I’ve heard it in Portland, Oregon, named the best food city in America by the Washington Post; in Washington, D.C., named the best food city in America by Bon Appetit; in New Orleans, where the number of restaurants grew 70 percent after Hurricane Katrina; in San Francisco, which boasts the most restaurants per capita in the country; and in Chicago, which has added several three-Michelin-star restaurants this decade. I live in New York, which will always lead the country in sheer abundance of dining options, but after years of visiting my sister in Los Angeles, I’m thoroughly convinced that America’s culinary capital has switched coasts.
Anthony Fauci, head of the National Institute of Allergy and Infectious Diseases, on persuading anti-vaxers, predicting the next outbreak, and working with Trump.
If you run into a left-leaning “consultant” these days, there’s a fairly good chance they used to work for the Obama administration. Scores of federal officials and bureaucrats have resigned or been fired since President Trump’s inauguration, some after realizing their goals were not in line with the new president’s.
Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, wasn’t one of them. In fact, he seemed surprised at the suggestion that he might do something other than what he’s been doing since he began leading the institute in 1984—trying to protect people from diseases like Ebola, Zika, and HIV.
This is despite the fact that some of Trump’s policy proposals seem to directly contradict his efforts. Trump has proposed cutting funding by 17 percent for a program that provides HIV drugs to people in poor countries. Not long after, six members of the Presidential Advisory Council on HIV/AIDS resigned, citing “a president who simply does not care.”
Recep Tayyip Erdogan is thinking about his legacy—and his own mortality. He desires power, but not necessarily for its own sake.
Politicians—especially ideological ones—eventually have to deal with the “then what?” question. With Turkish President Recep Tayyip Erdogan’s narrow victory in a tense April referendum granting him sweeping new powers (amid opposition allegations of voter fraud), he could very well dominate the country’s politics through 2029. He would have more than a decade to reshape the country, altering what it means to be Turkish.
In the first decade of its rule, beginning in 2002, Erdogan’s Islamist-rooted Justice and Development Party (AKP) presided over a rapidly growing economy, pushed through liberal reforms, and sidelined a military that had undermined Turkish democracy in a series of coups over the course of six decades. Could that, though, really have been all the AKP and its fiery, erratic leader hoped to accomplish?