Weapons systems that fire autonomously are the apotheosis of elites insulating themselves from accountability.
The aversion I have to autonomous weapons dates back to a bygone afternoon when I sat playing GoldenEye, a James Bond-themed video game that I was progressing through with all the alacrity of its lead character until I reached the jungle level. Expert as I was at evading enemy soldiers, I found myself gunned down in a spray of bullets from a machine gun that turned out to be motion-activated. "Oh bollocks," I cursed, determined to stay in character. "That's hardly sporting."
I suppose some of you will think I'm a nutty conspiracy theorist when I inform you that many inside the Defense Department are eager to deploy machines on the battlefield that autonomously kill rather than requiring human intervention to "pull the trigger." It sounds like dystopian fiction. But it has its champions, and most informed observers think it's inevitable that they'll win victories in coming years. There aren't even many objections to autonomous military hardware that doesn't kill: Think of drones that take off, surveil, and land entirely on autopilot. Is the day coming when drones like that are armed, programmed to seek out certain sorts of images, and automated to fire in certain circumstances? The short answer is that, given present trends, it's a realistic possibility.
Noel Sharkey is shaken and stirred -- and he isn't alone. As he recently noted in The Guardian, Human Rights Watch is so concerned that they're calling on international actors to immediately "prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument; and adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons." Would that President Obama were listening.
The same Guardian column flags a quotation from a recent Pentagon directive on autonomous weapons that is worth bearing in mind. The lethal machines of the future could conceivably fail due to "a number of causes," the directive states, "including but not limited to human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield."
The arguments against adopting this technology nearly make themselves. So I'll focus on a tangential but important observation.
In recent decades, America's elite -- its elected officials, bureaucrats, and CEOs, for starters -- have succeeded spectacularly at insulating themselves from responsibility for their failures. As the global economy melted down, Wall Street got bailouts. CEOs who preside over shrinking companies still depart with "golden parachute" severance packages. The foreign-policy establishment remains virtually unchanged despite the catastrophic errors so many of its members made during the Iraq War. The conservative movement is failing to advance its ideas or its political prospects, yet its institutional leaders and infotainment personalities are as profitable as ever. It is a self-evidently pernicious trend: Once someone achieves insider status, their future success is less and less dependent on how competently and responsibly they perform.
It's no wonder that some military leaders are so eager for the advent of autonomous weapons. At present, if WikiLeaks gets ahold of a video that shows innocents being fired upon, the incident in question can be traced back to an individual triggerman, an officer who gave him orders, and perhaps particular people who provided them with faulty intelligence. But an innocent who dies at the hands of an automated killing machine? How easy to phrase the obligatory apology in the passive voice! How implausible that any individual would be held culpable for the failure!
And would there ever be accusations that a supposed mistake was actually an intentional killing being conveniently blamed on autonomy gone wrong? I hardly think such suggestions would come from within the establishment. Levying that sort of accusation against a sitting president or the "brave men and women of the United States military" is precisely the sort of thing that bipartisan Washington, D.C., culture declares beyond the pale of reasonable discourse. And when autonomous Chinese drones "accidentally" fire upon some Tibetans? Even a genuine accident in America's past would make it that much harder to deny the Chinese the same benefit of the doubt that we had claimed for ourselves.
Most areas of American culture would benefit if the relevant set of elites were more accountable, but nowhere is individual accountability more important than when death is being meted out. The secrecy that surrounds the national-security establishment and the outsourcing of drone strikes to the CIA already introduce a problematic lack of accountability to the War on Terrorism. Autonomous killing machines would exacerbate the problem more than anything else I can imagine.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.
As Trump considers military options, he’s drawing unenforceable red lines.
Speaking before the UN General Assembly today, President Donald Trump announced that, unless North Korea gives up its nuclear weapons and ballistic missile programs, “the United States will have no choice but to totally destroy” the country. He sounded almost excited as he threatened, “Rocket Man is on a suicide mission for himself and for his regime.”
North Korea is a serious problem, and not one of Trump’s making—the last four American presidents failed to impede North Korea’s progress towards a nuclear weapon. President George H.W. Bush took unilateral action, removing U.S. nuclear weapons and reducing America’s troop levels in the region, hoping to incentivize good behavior; Presidents Bill Clinton and George W. Bush tried to negotiate restrictions; President Barack Obama mostly averted his eyes. North Korea defied them all.
More comfortable online than out partying, post-Millennials are safer, physically, than adolescents have ever been. But they’re on the brink of a mental-health crisis.
One day last summer, around noon, I called Athena, a 13-year-old who lives in Houston, Texas. She answered her phone—she’s had an iPhone since she was 11—sounding as if she’d just woken up. We chatted about her favorite songs and TV shows, and I asked her what she likes to do with her friends. “We go to the mall,” she said. “Do your parents drop you off?,” I asked, recalling my own middle-school days, in the 1980s, when I’d enjoy a few parent-free hours shopping with my friends. “No—I go with my family,” she replied. “We’ll go with my mom and brothers and walk a little behind them. I just have to tell my mom where we’re going. I have to check in every hour or every 30 minutes.”
Those mall trips are infrequent—about once a month. More often, Athena and her friends spend time together on their phones, unchaperoned. Unlike the teens of my generation, who might have spent an evening tying up the family landline with gossip, they talk on Snapchat, the smartphone app that allows users to send pictures and videos that quickly disappear. They make sure to keep up their Snapstreaks, which show how many days in a row they have Snapchatted with each other. Sometimes they save screenshots of particularly ridiculous pictures of friends. “It’s good blackmail,” Athena said. (Because she’s a minor, I’m not using her real name.) She told me she’d spent most of the summer hanging out alone in her room with her phone. That’s just the way her generation is, she said. “We didn’t have a choice to know any life without iPads or iPhones. I think we like our phones more than we like actual people.”
The foundation of Donald Trump’s presidency is the negation of Barack Obama’s legacy.
It is insufficient to state the obvious of Donald Trump: that he is a white man who would not be president were it not for this fact. With one immediate exception, Trump’s predecessors made their way to high office through the passive power of whiteness—that bloody heirloom which cannot ensure mastery of all events but can conjure a tailwind for most of them. Land theft and human plunder cleared the grounds for Trump’s forefathers and barred others from it. Once upon the field, these men became soldiers, statesmen, and scholars; held court in Paris; presided at Princeton; advanced into the Wilderness and then into the White House. Their individual triumphs made this exclusive party seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.
Its faith-based 12-step program dominates treatment in the United States. But researchers have debunked central tenets of AA doctrine and found dozens of other treatments more effective.
J.G. is a lawyer in his early 30s. He’s a fast talker and has the lean, sinewy build of a distance runner. His choice of profession seems preordained, as he speaks in fully formed paragraphs, his thoughts organized by topic sentences. He’s also a worrier—a big one—who for years used alcohol to soothe his anxiety.
J.G. started drinking at 15, when he and a friend experimented in his parents’ liquor cabinet. He favored gin and whiskey but drank whatever he thought his parents would miss the least. He discovered beer, too, and loved the earthy, bitter taste on his tongue when he took his first cold sip.
His drinking increased through college and into law school. He could, and occasionally did, pull back, going cold turkey for weeks at a time. But nothing quieted his anxious mind like booze, and when he didn’t drink, he didn’t sleep. After four or six weeks dry, he’d be back at the liquor store.
Old French Canadian genealogy records reveal how a harmful mutation can hide from natural selection in a mother's DNA.
The first King’s Daughters—or filles du roi—arrived in New France in 1663, and 800 more would follow over the next decade. Given their numbers, they were not literally the king’s daughters, of course.
They were poor and usually of common birth, but their passage and dowry were indeed paid by King Louis XIV for the purpose of empire building: These women were to marry male colonists and have many children, thus strengthening France’s hold on North America.
And so they did. The filles du roi became the founding mothers of French Canadians, for whom these women are a source of historical pride. A grand old restaurant in Montreal was named after the filles du roi. So is a roller-derby team. French Canadians can usually trace their ancestry back to one or more of these women. “French Canadian genealogy is so well documented, it’s just a piece of cake to trace any line you have,” says Susan Colby, a retired archaeologist who comes from a French Canadian family and has done some of that tracing herself.
What was it like inside the brain of an ancient prophet?
James Kugel has spent his entire scholarly career studying the Bible, but some very basic questions about it still obsess him. What was it about the minds of ancient Israelites that allowed them to hear and see God directly—or at least, to believe that they did? Were the biblical prophets literally hearing voices and seeing visions, understanding themselves to be transmitting God’s own exact words? If so, why did such direct encounters with God become rarer over time?
In his new and final book, The Great Shift, Kugel investigates these questions through the lens of neuroscientific findings. (The approach is reminiscent of other recent books, like Kabbalah: A Neurocognitive Approach to Mystical Experiences, co-written by a neurologist and a mysticism scholar.) First, Kugel uses biblical research to show that ancient people had a “sense of self” that was fundamentally different from the one modern Westerners have—and that this enabled them to experience and interpret prophecy differently than we do. Then he uses scientific research to show that we shouldn’t assume their view was wrong. If anything, our modern Western notion of the bounded, individual self is the anomaly; most human beings throughout history conceived of the self as a porous entity open to intrusions. In fact, much of the rest of the world today still does.
The gynecological device may have an ethically fraught history, but it's hard to improve on the design.
Few women enjoy pelvic exams: the crinkly paper dress, the awkward questions, the stirrups, the vague fear that comes with doctors’ visits of any kind (what if they find something abnormal, something bad, something cancerous?). But perhaps no piece of the pelvic exam is as reviled as the vaginal speculum—the cold, clicking, duck-billed apparatus that lifts and separates the vaginal walls so a near-stranger can peer inside.
The speculum’s history is, like many medical histories, full of dubious ethics. Versions of the speculum have been found in medical texts dating back to the Greek physician Galen in 130 A.D. and have shown up in archaeological digs as far back as 79 A.D. amidst the dust of Pompeii. (The artifact from Pompeii is a bit of a nightmare: two blades that open and close via a corkscrew-like mechanism.)
Donald Trump used his first address at the United Nations to redefine the idea of sovereignty.
Donald Trump’s first speech to the United Nations can best be understood as a response to his predecessor’s final one. On September 20, 2016, Barack Obama told the UN General Assembly that “at this moment we all face a choice. We can choose to press forward with a better model of cooperation and integration. Or we can retreat into a world sharply divided, and ultimately in conflict, along age-old lines of nation and tribe and race and religion.”
Three hundred and sixty-four days later, Trump delivered America’s answer: Option number two. His speech on Tuesday turned Obama’s on its head. Obama focused on overcoming the various challenges—poverty, economic dislocation, bigotry, extremism—that impede global “integration,” a term he used nine times. Trump didn’t use the term once. Obama used the word “international” 14 times, always positively (“international norms,” “international cooperation,” “international rules,” “international community”). Trump used it three times, in each case negatively (“unaccountable international tribunals,” “international criminal networks,” “the assassination of the dictator's brother using banned nerve agents in an international airport”). Obama warned of a world “sharply divided… along age-old lines of nation and tribe and race and religion.” Trump replied by praising “sovereignty” or invoking “sovereign” no fewer than 19 times. And while he didn’t explicitly defend divisions of “tribe and race and religion,” he talked about the importance of nations “preserving the cultures,” which is a more polite way of saying the same thing.
Doctors at the University of Mississippi dissected two chicken nuggets, looked at them under a microscope, and were "astounded."
The chicken nugget can conjure purity. No buns, pickles, or bones. Not many carbs, apart from the breading. This is simplicity delivered economically, flightless birds, protein for the protein-hungry America of today—or, to followers of Michael Pollan, the corn-fed-meat-wrapped-in-corn-preserved-breading-dipped-in-corn-sweetened-goo kind of purity.
Richard D. deShazo, MD, is a distinguished professor of medicine and pediatrics at University of Mississippi Medical Center. He does not see purity. At least, not anymore.
“I was floored. I was astounded,” deShazo said of the moment he looked at a chicken nugget under a microscope.