Weapons systems that fire autonomously are the apotheosis of elites insulating themselves from accountability.
The aversion I have to autonomous weapons dates back to a bygone afternoon when I sat playing GoldenEye, a James Bond-themed video game that I was progressing through with all the alacrity of its lead character until I reached the jungle level. Expert as I was at evading enemy soldiers, I found myself gunned down in a spray of bullets from a machine gun that, it turned out, was motion activated. "Oh bollocks," I cursed, determined to stay in character. "That's hardly sporting."
I suppose some of you will think I'm a nutty conspiracy theorist when I inform you that many inside the Defense Department are eager to deploy machines on the battlefield that autonomously kill rather than requiring human intervention to "pull the trigger." It sounds like dystopian fiction. But it has its champions, and most informed observers think it's inevitable that they'll win victories in coming years. There aren't even many objections to autonomous military hardware that doesn't kill: Think of drones that take off, surveil, and land entirely on autopilot. Is the day coming when drones like that are armed, programmed to seek out certain sorts of images, and automated to fire in certain circumstances? The short answer is that, given present trends, it's a realistic possibility.
Noel Sharkey is shaken and stirred -- and he isn't alone. As he recently noted in The Guardian, Human Rights Watch is so concerned that it is calling on international actors to immediately "prohibit the development, production and use of fully autonomous weapons through an international legally binding instrument; and adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons." Would that President Obama were listening.
The same Guardian column flags a quotation from a recent Pentagon directive on autonomous weapons that is worth bearing in mind. The lethal machines of the future could conceivably fail due to "a number of causes," the directive states, "including but not limited to human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, or unanticipated situations on the battlefield."
The arguments against adopting this technology nearly make themselves. So I'll focus on a tangential but important observation.
In recent decades, America's elites -- its elected officials, bureaucrats, and CEOs, for starters -- have succeeded spectacularly at insulating themselves from responsibility for their failures. As the global economy melted down, Wall Street got bailouts. CEOs who preside over shrinking companies still depart with "golden parachute" severance packages. The foreign-policy establishment remains virtually unchanged despite the catastrophic errors so many of its members made during the Iraq War. The conservative movement is failing to advance its ideas or its political prospects, yet its institutional leaders and infotainment personalities are as profitable as ever. It is a self-evidently pernicious trend: Once someone achieves insider status, their future success is less and less dependent on how competently and responsibly they perform.
It's no wonder that some military leaders are so eager for the advent of autonomous weapons. At present, if WikiLeaks gets ahold of a video that shows innocents being fired upon, the incident in question can be traced back to an individual triggerman, an officer who gave him orders, and perhaps particular people who provided them with faulty intelligence. But an innocent who dies at the hands of an automated killing machine? How easy to phrase the obligatory apology in the passive voice! How implausible that any individual would be held culpable for the failure!
And would there ever be accusations that a supposed mistake was actually an intentional killing being conveniently blamed on autonomy gone wrong? I hardly think such suggestions would come from within the establishment. Levying that sort of accusation against a sitting president or the "brave men and women of the United States military" is precisely the sort of thing that bipartisan Washington, D.C., culture declares beyond the pale of reasonable discourse. And when autonomous Chinese drones "accidentally" fire upon some Tibetans? Even a genuine accident in America's past would make it that much harder to refrain from giving the Chinese the benefit of the doubt that we ourselves had requested.
Most areas of American culture would benefit if the relevant elites were more accountable, but nowhere is individual accountability more important than when death is being meted out. The secrecy that surrounds the national-security establishment and the outsourcing of drone strikes to the CIA already introduce a problematic lack of accountability into the War on Terrorism. Autonomous killing machines would exacerbate the problem more than anything else I can imagine.
Conor Friedersdorf is a staff writer at The Atlantic, where he focuses on politics and national affairs. He lives in Venice, California, and is the founding editor of The Best of Journalism, a newsletter devoted to exceptional nonfiction.