Weapons systems that fire autonomously are the apotheosis of elites insulating themselves from accountability.
The aversion I have to autonomous weapons dates back to a bygone afternoon when I sat playing GoldenEye, a James Bond-themed video game that I was progressing through with all the alacrity of its lead character until I reached the jungle level. Expert as I was at evading enemy soldiers, I found myself gunned down in a spray of bullets from a machine gun that turned out to be motion-activated. "Oh bollocks," I cursed, determined to stay in character. "That's hardly sporting."
I suppose some of you will think I'm a nutty conspiracy theorist when I inform you that many inside the Defense Department are eager to deploy machines on the battlefield that autonomously kill rather than requiring human intervention to "pull the trigger." It sounds like dystopian fiction. But it has its champions, and most informed observers think it's inevitable that they'll win victories in coming years. There aren't even many objections to autonomous military hardware that doesn't kill: Think of drones that take off, surveil, and land entirely on autopilot. Is the day coming when drones like that are armed, programmed to seek out certain sorts of images, and automated to fire in certain circumstances? The short answer is that, given present trends, it's a realistic possibility.
Noel Sharkey is shaken and stirred -- and he isn't alone. As he recently noted in The Guardian, Human Rights Watch is so concerned that it is calling on international actors to immediately "prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument; and adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons." Would that President Obama listened.
The same Guardian column flags a quotation from a recent Pentagon directive on autonomous weapons that is worth bearing in mind. The lethal machines of the future could conceivably fail due to "a number of causes," the directive states, "including but not limited to human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield."
The arguments against adopting this technology nearly make themselves. So I'll focus on a tangential but important observation.
In recent decades, America's elite -- its elected officials, bureaucrats, and CEOs, for starters -- have succeeded spectacularly at insulating themselves from responsibility for their failures. As the global economy melted down, Wall Street got bailouts. CEOs who preside over shrinking companies still depart with "golden parachute" severance packages. The foreign-policy establishment remains virtually unchanged despite the catastrophic errors so many of its members made during the Iraq War. The conservative movement is failing to advance its ideas or its political prospects, yet its institutional leaders and infotainment personalities are as profitable as ever. It is a self-evidently pernicious trend: Once someone achieves insider status, their future success is less and less dependent on how competently and responsibly they perform.
It's no wonder that some military leaders are so eager for the advent of autonomous weapons. At present, if WikiLeaks gets ahold of a video that shows innocents being fired upon, the incident in question can be traced back to an individual triggerman, an officer who gave him orders, and perhaps particular people who provided them with faulty intelligence. But an innocent who dies at the handlessness of an automated killing machine? How easy to phrase the obligatory apology in the passive voice! How implausible that any individual would be held culpable for the failure!
And would there ever be accusations that a supposed mistake was actually an intentional killing being conveniently blamed on autonomy gone wrong? I hardly think such suggestions would come from within the establishment. Levying that sort of accusation against a sitting president or the "brave men and women of the United States military" is precisely the sort of thing that bipartisan Washington, D.C., culture declares beyond the pale of reasonable discourse. And when autonomous Chinese drones "accidentally" fire upon some Tibetans? Even a genuine accident in America's past would make it that much harder to deny the Chinese the same benefit of the doubt that we ourselves had requested.
Most areas of American culture would benefit if the relevant set of elites were more accountable, but nowhere is individual accountability more important than when death is being meted out. The secrecy that surrounds the national-security establishment and the outsourcing of drone strikes to the CIA already introduce a problematic lack of accountability into the War on Terrorism. Autonomous killing machines would exacerbate the problem more than anything else I can imagine.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.