The U.N. Will Not Stand for Killer Robots

At a meeting of the Human Rights Council in Geneva on Thursday, a top U.N. official on executions gave the world his best Sarah Connor impression, calling for a moratorium on Lethal Autonomous Robotics, a warning he hopes will head off a future of armed assassins on the battlefield before it passes the point of no return.

This article is from the archive of our partner The Wire.
President Obama may have finally clarified the U.S. position on armed assassins in the sky, but the next wave of drone controversy may now center on whether robots on the field of battle are smart enough to gun down human beings. At a meeting of the United Nations Human Rights Council in Geneva on Thursday, a top U.N. official on executions gave the world his best Sarah Connor impression, calling for a moratorium on Lethal Autonomous Robotics (LARs), a warning he hopes will stop a future of killer robots that may be past the point of no return if leading military technologists have anything to say about it. "War without reflection is mechanical slaughter.... A decision to allow machines to be deployed to kill human beings worldwide — whatever weapons they use — deserves a collective pause," said Christof Heyns, the U.N.'s special rapporteur on extrajudicial, summary or arbitrary executions. That is one fancy title, but his message is simple, familiar, and likely in vain: many advocates would still rather trust a human to pull a trigger than leave it to SkyNet, or, well, a machine set to autopilot by the U.S., Israeli, British, or Korean military.

But, yes, the United Nations listened to debate about killer robots. Thursday's session came just three weeks after the United States Congress conducted a hearing about other Earths because, well, the line between reality and science fiction is closing fast enough for the world to truly weigh in. Currently, there are no fully autonomous and armed robots in action — early attempts have gone awry, and while the Pentagon has not been shy about wanting to develop stand-alone shooters, it has insisted, by official policy, that a human being will always be "in the loop." Human Rights Watch and Harvard Law School, gleaning information from the U.S. Air Force, have reported that "by 2030 machine capabilities will have increased to the point that humans have become the weakest component in a wide array of systems and processes." So, by the time Suri Cruise is 24, humans really start to be the weakest links on the battlefield. In the meantime, a few superpowers and would-be superpowers are building up their LAR arsenals. Here are some of the standouts currently in question:

The U.S. Phalanx can detect, track, and fire upon anti-ship and anti-air threats.

Israel's Harpy, as the AP reports, is a "fire-and-forget autonomous weapon system designed to detect, attack and destroy radar emitters."

Britain's Taranis is a semi-autonomous stealth drone that can "think for itself," according to Sky News.

Korea's Techwin surveillance system and robots can detect targets through infrared. They're "operated by humans but have an 'automatic mode,'" the AP reports.

The robotic capabilities of China and Russia aren't as well known, as Nick Cumming-Bruce reports today at The New York Times.

So, we're still a little ways off from terminators, but the argument from the human rights community is more than relevant today: If we're killing people with drones flown across the skies of Yemen and Saudi Arabia and Pakistan by pilots thousands of miles away, what happens to killing — and the human decision of war, and war crimes — when we have a "set it and forget it" robot roaming the Earth? The moratorium proposed on Thursday by Heyns, the U.N. official, would put a halt "on the production, assembly, transfer, acquisition, deployment and use of LARs, until a framework on the future of LARs has been established." He also asked the U.N. council to set up a high-level panel on LARs that would convene within a year "to assess whether existing international laws are adequate for controlling their use," Cumming-Bruce reported.

Robots, of course, have their fans. Advocates for autonomous military helpers and/or overlords insist that our humanoid friends "process information faster than humans, and they are not subject to fear, panic, a desire for revenge or other emotions that can cloud human judgment," according to the Times. Whether humans get their revenge on robots before they can kill us, well, judgment day hasn't come yet.
