Weapons systems that fire autonomously are the apotheosis of elites insulating themselves from accountability.
The aversion I have to autonomous weapons dates back to a bygone afternoon when I sat playing GoldenEye 007, a James Bond-themed video game that I was progressing through with all the alacrity of its lead character until I reached the jungle level. Expert as I was at evading enemy soldiers, I found myself gunned down in a spray of machine-gun bullets, which turned out to be motion activated. "Oh bollocks," I cursed, determined to stay in character. "That's hardly sporting."
I suppose some of you will think I'm a nutty conspiracy theorist when I inform you that many inside the Defense Department are eager to deploy machines on the battlefield that autonomously kill rather than requiring human intervention to "pull the trigger." It sounds like dystopian fiction. But it has its champions, and most informed observers think it's inevitable that they'll win victories in coming years. There aren't even many objections to autonomous military hardware that doesn't kill: Think of drones that take off, surveil, and land entirely on autopilot. Is the day coming when drones like that are armed, programmed to seek out certain sorts of images, and automated to fire in certain circumstances? The short answer is that, given present trends, it's a realistic possibility.
Noel Sharkey is shaken and stirred -- and he isn't alone. As he recently noted in The Guardian, Human Rights Watch is so concerned that it is calling on international actors to immediately "prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument; and adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons." Would that President Obama were listening.
The same Guardian column flags a quotation from a recent Pentagon directive on autonomous weapons that is worth bearing in mind. The lethal machines of the future could conceivably fail due to "a number of causes," the directive states, "including but not limited to human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield."
The arguments against adopting this technology nearly make themselves. So I'll focus on a tangential but important observation.
In recent decades, America's elite -- its elected officials, bureaucrats, and CEOs, for starters -- have succeeded spectacularly at insulating themselves from responsibility for their failures. As the global economy melted down, Wall Street got bailouts. CEOs who preside over shrinking companies still depart with "golden parachute" severance packages. The foreign-policy establishment remains virtually unchanged despite the catastrophic errors so many of its members made during the Iraq War. The conservative movement is failing to advance its ideas or its political prospects, yet its institutional leaders and infotainment personalities are as profitable as ever. It is a self-evidently pernicious trend: Once someone achieves insider status, their future success is less and less dependent on how competently and responsibly they perform.
It's no wonder that some military leaders are so eager for the advent of autonomous weapons. At present, if WikiLeaks gets ahold of a video that shows innocents being fired upon, the incident in question can be traced back to an individual triggerman, an officer who gave him orders, and perhaps particular people who provided them with faulty intelligence. But an innocent who dies at the handlessness of an automated killing machine? How easy to phrase the obligatory apology in the passive voice! How implausible that any individual would be held culpable for the failure!
And would there ever be accusations that a supposed mistake was actually an intentional killing being conveniently blamed on autonomy gone wrong? I hardly think such suggestions would come from within the establishment. Levying that sort of accusation against a sitting president or the "brave men and women of the United States military" is precisely the sort of thing that bipartisan Washington, D.C., culture declares beyond the pale of reasonable discourse. And when autonomous Chinese drones "accidentally" fire upon some Tibetans? Even a genuine accident in America's past would make it that much harder to refrain from giving the Chinese the benefit of the doubt that we ourselves had requested.
Most areas of American culture would benefit if the relevant set of elites were more accountable, but nowhere is individual accountability more important than when death is being meted out. The secrecy that surrounds the national-security establishment and the outsourcing of drone strikes to the CIA already introduce a problematic lack of accountability to the War on Terrorism. Autonomous killing machines would exacerbate the problem more than anything else I can imagine.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.