Weapons systems that fire autonomously are the apotheosis of elites insulating themselves from accountability.
The aversion I have to autonomous weapons dates back to a bygone afternoon when I sat playing GoldenEye, a James Bond-themed video game that I was progressing through with all the alacrity of its lead character until I reached the jungle level. Expert as I was at evading enemy soldiers, I found myself gunned down in a spray of bullets from a machine gun that turned out to be motion-activated. "Oh bollocks," I cursed, determined to stay in character. "That's hardly sporting."
I suppose some of you will think I'm a nutty conspiracy theorist when I inform you that many inside the Defense Department are eager to deploy machines on the battlefield that autonomously kill rather than requiring human intervention to "pull the trigger." It sounds like dystopian fiction. But it has its champions, and most informed observers think it's inevitable that they'll win victories in coming years. There aren't even many objections to autonomous military hardware that doesn't kill: Think of drones that take off, surveil, and land entirely on autopilot. Is the day coming when drones like that are armed, programmed to seek out certain sorts of images, and automated to fire in certain circumstances? The short answer is that, given present trends, it's a realistic possibility.
Noel Sharkey is shaken and stirred -- and he isn't alone. As he recently noted in The Guardian, Human Rights Watch is so concerned that they're calling on international actors to immediately "prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument; and adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons." Would that President Obama would listen.
The same Guardian column flags a quotation from a recent Pentagon directive on autonomous weapons that is worth bearing in mind. The lethal machines of the future could conceivably fail due to "a number of causes," the directive states, "including but not limited to human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield."
The arguments against adopting this technology nearly make themselves. So I'll focus on a tangential but important observation.
In recent decades, America's elite -- its elected officials, bureaucrats, and CEOs, for starters -- have succeeded spectacularly at insulating themselves from responsibility for their failures. As the global economy melted down, Wall Street got bailouts. CEOs who preside over shrinking companies still depart with "golden parachute" severance packages. The foreign-policy establishment remains virtually unchanged despite the catastrophic errors so many of its members made during the Iraq War. The conservative movement is failing to advance its ideas or its political prospects, yet its institutional leaders and infotainment personalities are as profitable as ever. It is a self-evidently pernicious trend: Once someone achieves insider status, their future success is less and less dependent on how competently and responsibly they perform.
It's no wonder that some military leaders are so eager for the advent of autonomous weapons. At present, if WikiLeaks gets ahold of a video that shows innocents being fired upon, the incident in question can be traced back to an individual triggerman, an officer who gave him orders, and perhaps particular people who provided them with faulty intelligence. But an innocent who dies at the handlessness of an automated killing machine? How easy to phrase the obligatory apology in the passive voice! How implausible that any individual would be held culpable for the failure!
And would there ever be accusations that a supposed mistake was actually an intentional killing being conveniently blamed on autonomy gone wrong? I hardly think such suggestions would come from within the establishment. Levying that sort of accusation against a sitting president or the "brave men and women of the United States military" is precisely the sort of thing that bipartisan Washington, D.C., culture declares beyond the pale of reasonable discourse. And when autonomous Chinese drones "accidentally" fire upon some Tibetans? Even a genuine accident in America's past would make it that much harder to deny the Chinese the benefit of the doubt that we ourselves had requested.
Most areas of American culture would benefit if the relevant set of elites were more accountable, but nowhere is individual accountability more important than when death is being meted out. The secrecy that surrounds the national-security establishment and the outsourcing of drone strikes to the CIA already introduce a problematic lack of accountability to the War on Terrorism. Autonomous killing machines would exacerbate the problem more than anything else I can imagine.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.
Science says lasting relationships come down to—you guessed it—kindness and generosity.
Every day in June, the most popular wedding month of the year, about 13,000 American couples will say “I do,” committing to a lifelong relationship that will be full of friendship, joy, and love that will carry them forward to their final days on this earth.
Except, of course, it doesn’t work out that way for most people. The majority of marriages fail, either ending in divorce and separation or devolving into bitterness and dysfunction. Of all the people who get married, only three in ten remain in healthy, happy marriages, as psychologist Ty Tashiro points out in his book The Science of Happily Ever After, which was published earlier this year.
Social scientists first started studying marriages by observing them in action in the 1970s in response to a crisis: Married couples were divorcing at unprecedented rates. Worried about the impact these divorces would have on the children of the broken marriages, psychologists decided to cast their scientific net on couples, bringing them into the lab to observe them and determine what the ingredients of a healthy, lasting relationship were. Was each unhappy family unhappy in its own way, as Tolstoy claimed, or did the miserable marriages all share something toxic in common?
The current system for gaining entry to elite colleges discourages unique passions and deems many talented students ineligible.
March madness is almost here. No, I’m not referring to the college-basketball playoffs; I’m alluding to the anxious waiting of young people and their families for word about their fate from the highly selective colleges of America. And I’m talking as well about those who are about to venture forth on the ritualistic campus tours to determine where they will apply next fall. What few of these families realize is how broken the admission system is at these selective colleges.
At these institutions of higher learning, the goal is to “shape a class,” which involves trying to admit qualified and diverse students who will learn from each other as well as from their experiences in the classroom. These are the students who have the greatest potential to use their education in productive ways and to contribute to their own well-being and to the needs of the larger society. Diversity here is not defined solely in terms of race, ethnicity, or gender, although those weigh on decisions, but also encompasses the range of interests and talents that students can develop and share with others during their college years. These are high-minded goals.
Trump’s prescient opposition to the invasion is an important part of his claim to sound judgment. And he is making it up. I would know.
I respect and admire Donald Trump (yes, I wrote those words to begin a sentence) for flat-out arguing to GOP crowds that the Iraq war was a catastrophic mistake.
It was additionally amazing and heartening to see him, in last night’s WWE-style brawl-debate, finally call B.S. on a persistent and astonishing claim by the otherwise-generally-reality-based Jeb Bush. When pressed about the Bush-Cheney record in office, Jeb’s final line of defense throughout the campaign has been: whatever else you can say about my brother, he kept us safe!
Yes, perfectly safe! Except for, ummm, that one time. Trump finally had the lack of politesse to say so directly to Jeb Bush, only to receive boos from the crowd.
On Saturday, the GOP dispensed with concern about keeping up appearances—and put long-simmering anger on display.
Perhaps the most haunting memory of the night will be the audience. Previous presidential debates have banned cheering and booing. Saturday night’s Republican debate in Greenville was marked by both. Permitted or not, the rowdy crowd ventilated its feelings without concern for how it looked or sounded to the viewers at home.
This unconcern for appearances was a Republican theme of the weekend. Hours before the debate opened, news broke that Supreme Court Justice Antonin Scalia had died. Candidates Ted Cruz and Marco Rubio promptly issued statements opining that the appointment of any replacement should be left to the next president. It’s not unheard of for candidates to express emotive positions adopted for political advantage. But that same evening, Senate Majority Leader Mitch McConnell joined in, with a statement ruling out any Senate action on any Supreme Court nominee, no matter who it might be.
The staunchly Catholic U.S. Supreme Court justice was known for his acidly conservative opinions, but ultimately, he prioritized the Constitution over the Church.
“How can the Court possibly assert that ‘the First Amendment mandates governmental neutrality between … religion and nonreligion’?” the U.S. Supreme Court Justice Antonin Scalia wrote in 2005, arguing that two Kentucky counties should be able to display the Ten Commandments in their courthouses. “Who says so? Surely not the words of the Constitution.”
This moment, with Scalia’s trademark snark, nicely sums up the paradox of how his religious views influenced his Supreme Court career. The justice, who died Saturday, consistently argued that the United States is fundamentally religious, meaning that the government shouldn’t have to avoid religious displays—nativity scenes on public property, prayers at townhall meetings, and the like. His Roman Catholic faith often seemed to lurk in the background of his opinions, especially in cases involving abortion and homosexuality. But above all, he was committed to a literal, originalist interpretation of the Constitution, along with strict attention to the texts of federal and state laws. His views didn’t always align with those of the Church, and he didn’t always side with people making religious-freedom claims.
How you arrange the plot points of your life into a narrative can shape who you are—and is a fundamental part of being human.
In Paul Murray’s novel Skippy Dies, there’s a point where the main character, Howard, has an existential crisis. “‘It’s just not how I expected my life would be,’” he says.
“‘What did you expect?’” a friend responds.
“Howard ponders this. ‘I suppose—this sounds stupid, but I suppose I thought there’d be more of a narrative arc.’”
But it's not stupid at all. Though perhaps the facts of someone’s life, presented end to end, wouldn't much resemble a narrative to the outside observer, the way people choose to tell the stories of their lives, to others and—crucially—to themselves, almost always does have a narrative arc. In telling the story of how you became who you are, and of who you're on your way to becoming, the story itself becomes a part of who you are.
A profanity-filled new self-help book argues that life is kind of terrible, so you should value your actions over your emotions.
Put down the talking stick. Stop fruitlessly seeking "closure" with your peevish co-worker. And please, don't bother telling your spouse how annoying you find their tongue-clicking habit—sometimes honesty is less like a breath of fresh air and more like a fart. That’s the argument of Michael Bennett and Sarah Bennett, the father-daughter duo behind the new self-help book F*ck Feelings.
The elder Bennett is a psychiatrist and American Psychiatric Association distinguished fellow. His daughter is a comedy writer. Together, they provide a tough-love, irreverent take on “life's impossible problems.” The crux of their approach is that life is hard and negative emotions are part of it. The key is to see your “bullshit wishes” for just what they are (bullshit), and instead to pursue real, achievable goals.
African Americans are converging around an abundance of issues, wanting to be heard and employing new strategies to achieve it.
At 8 years old, I nervously stood in a third-grade classroom listening to the two black women standing over me. One was Lillie Costin, not only the first black teacher I ever had, but the first black teacher I’d ever seen. The other was my mother, who told Costin, “Teddy is smart and well-behaved, but don’t hesitate to pop him if he acts up.” Costin—God bless ’er—told my mother she would keep an eye on me. And then, as I sheepishly took my seat among the gaggle of my new giggling classmates, the two ladies exchanged The Look.
In the simplest terms, The Look is unspoken dialogue that confirms both sides are, as black parishioners often say, “on one accord.” In my case, it was a mother’s plea and a sister’s promise to pay special attention to this child and not allow him to get lost in the system. It wasn’t an agreement for favoritism; it was a pact to stay particularly attuned to my development and ensure I was not shut out from any opportunity. They both knew that no one understands the plight of a black student better than a black teacher.
Fredrickson, a leading researcher of positive emotions at the University of North Carolina at Chapel Hill, presents scientific evidence to argue that love is not what we think it is. It is not a long-lasting, continually present emotion that sustains a marriage; it is not the yearning and passion that characterizes young love; and it is not the blood-tie of kinship.
Rather, it is what she calls a "micro-moment of positivity resonance." She means that love is a connection, characterized by a flood of positive emotions, which you share with another person—any other person—whom you happen to connect with in the course of your day. You can experience these micro-moments with your romantic partner, child, or close friend. But you can also fall in love, however momentarily, with less likely candidates, like a stranger on the street, a colleague at work, or an attendant at a grocery store. Louis Armstrong put it best in "What a Wonderful World" when he sang, "I see friends shaking hands, sayin', 'How do you do?' / They're really sayin', 'I love you.'"
The Republican frontrunner repudiated a long litany of party orthodoxies in a contentious debate—but will that hurt his candidacy, or help it?
Donald Trump blamed the Bush administration for failing to heed CIA warnings before 9/11; denounced the Iraq War for destabilizing the Middle East; defended the use of eminent domain; promised to save Social Security without trimming benefits; and credited Planned Parenthood for “wonderful things having to do with women's health.”
He’s fresh off a crushing victory in New Hampshire, and the prohibitive favorite in the polls in South Carolina. Will his flouting of Republican orthodoxy sink his chances—or is it his very willingness to embrace these heterodox stances that has fueled his rise?
Even his rivals no longer seem certain of the answer. Jeb Bush, at one point, called Trump “a man who insults his way to the nomination.” He sounded like a man ruing a race that has run away from him.