Weapons systems that fire autonomously are the apotheosis of elites insulating themselves from accountability.
The aversion I have to autonomous weapons dates back to a bygone afternoon when I sat playing GoldenEye, a James Bond-themed video game that I was progressing through with all the alacrity of its lead character until I reached the jungle level. Expert as I was at evading enemy soldiers, I found myself gunned down in a spray of bullets from a machine gun that turned out to be motion activated. "Oh bollocks," I cursed, determined to stay in character. "That's hardly sporting."
I suppose some of you will think I'm a nutty conspiracy theorist when I inform you that many inside the Defense Department are eager to deploy machines on the battlefield that autonomously kill rather than requiring human intervention to "pull the trigger." It sounds like dystopian fiction. But it has its champions, and most informed observers think it's inevitable that they'll win victories in coming years. There aren't even many objections to autonomous military hardware that doesn't kill: Think of drones that take off, surveil, and land entirely on autopilot. Is the day coming when drones like that are armed, programmed to seek out certain sorts of images, and automated to fire in certain circumstances? The short answer is that, given present trends, it's a realistic possibility.
The roboticist Noel Sharkey is shaken and stirred -- and he isn't alone. As he recently noted in The Guardian, Human Rights Watch is so concerned that they're calling on international actors to immediately "prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument; and adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons." Would that President Obama were listening.
The same Guardian column flags a quotation from a recent Pentagon directive on autonomous weapons that is worth bearing in mind. The lethal machines of the future could conceivably fail due to "a number of causes," the directive states, "including but not limited to human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield."
The arguments against adopting this technology nearly make themselves. So I'll focus on a tangential but important observation.
In recent decades, America's elite -- its elected officials, bureaucrats, and CEOs, for starters -- have succeeded spectacularly at insulating themselves from responsibility for their failures. As the global economy melted down, Wall Street got bailouts. CEOs who preside over shrinking companies still depart with "golden parachute" severance packages. The foreign-policy establishment remains virtually unchanged despite the catastrophic errors so many of its members made during the Iraq War. The conservative movement is failing to advance its ideas or its political prospects, yet its institutional leaders and infotainment personalities are as profitable as ever. It is a self-evidently pernicious trend: Once someone achieves insider status, their future success is less and less dependent on how competently and responsibly they perform.
It's no wonder that some military leaders are so eager for the advent of autonomous weapons. At present, if WikiLeaks gets ahold of a video that shows innocents being fired upon, the incident in question can be traced back to an individual triggerman, an officer who gave him orders, and perhaps particular people who provided them with faulty intelligence. But an innocent who dies at the handlessness of an automated killing machine? How easy to phrase the obligatory apology in the passive voice! How implausible that any individual would be held culpable for the failure!
And would there ever be accusations that a supposed mistake was actually an intentional killing being conveniently blamed on autonomy gone wrong? I hardly think such suggestions would come from within the establishment. Levying that sort of accusation against a sitting president or the "brave men and women of the United States military" is precisely the sort of thing that bipartisan Washington, D.C., culture declares beyond the pale of reasonable discourse. And when autonomous Chinese drones "accidentally" fire upon some Tibetans? Even a genuine accident in America's past would make it that much harder to refrain from giving the Chinese the benefit of the doubt that we ourselves had requested.
Most areas of American culture would benefit if the relevant set of elites were more accountable, but nowhere is individual accountability more important than when death is being meted out. The secrecy surrounding the national-security establishment and the outsourcing of drone strikes to the CIA already introduce a problematic lack of accountability to the War on Terrorism. Autonomous killing machines would exacerbate the problem more than anything else I can imagine.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.