In February 2008, a pair of suicide bombers struck the Israeli town of Dimona. One of the attackers detonated his explosive vest, killing an Israeli and injuring nine others. The accomplice was shot before he could trigger his device. A bomb disposal robot then defused the bomb, and ran over the terrorist's body to make sure he wasn't carrying any more explosives.
The encounter symbolized the emergence of two opponents: robots and suicide terrorists. States and non-state actors have moved in opposite directions in the delivery of firepower. Advanced countries like the United States and Israel have developed unmanned weapons. By contrast, terrorist adversaries have adopted the ultimate manned weapon. On one side, you have a robot operated by a technician thousands of miles away. On the other side, you have an individual who is physically present when the weapon explodes. War is a contest between the impersonal and the personal.
In the opening act of the 1991 Gulf War, U.S. pilots flew F-117A Nighthawks into Baghdad, hitting targets with laser-guided bombs. Today, two decades later, unmanned drone aircraft lead the fight against the Taliban and Al Qaeda. Directed by joystick-wielding pilots sitting in trailers in the United States, the Predator and Reaper drones are able to stay in the air for at least 14 hours, watching and killing. The supposedly dovish President Obama has massively stepped up the drone war in the border regions of Afghanistan and Pakistan.
As Peter Singer wrote in his fascinating book Wired for War, we are in the midst of a new chapter in warfare, with robots moving to center stage. The Predator and Reaper now have a brother on the ground. The SWORDS, or Special Weapons Observation Reconnaissance Detection System, is a robot chassis that can mount an M-16 rifle or a grenade launcher.
But just as national militaries have evolved from manned to unmanned operations, non-state adversaries have gone the opposite route, with humans delivering the payload. In 1993, Ramzi Yousef followed the traditional terrorist playbook: planting a bomb inside the World Trade Center in New York City and then fleeing as quickly as possible. Eight years later, Yousef's uncle, Khalid Sheikh Mohammed, masterminded a different strategy, with terrorists personally guiding aircraft into the Twin Towers.
To be sure, suicide bombings are only a small fraction of overall terrorist attacks. But they are on the rise. The current era of suicide terrorism began in Lebanon in the early 1980s, and quickly spread to civil wars in Sri Lanka, Turkey, and Chechnya. After 9/11, there was a dramatic uptick in suicide bombings in countries as diverse as Saudi Arabia, Morocco, Indonesia, Pakistan, India, Britain, and especially Iraq--where there were at least 783 attacks from May 2003 to July 2010. In the early years of the Afghan War, there were only a handful of suicide bombings, but in 2009 there were over 180 incidents.
The United States hopes to thrive in this brave new world of robots and suicide terrorists. Americans have long used machines to save soldiers' lives. And robots relish jobs that are dull or dangerous. Drones can patrol the battlefield around the clock. The SWORDS robot can hit its target with incredible accuracy. One day, a swarm of miniature insect robots armed with cameras may buzz around cityscapes, removing the fog of war from urban fighting.
But robots can lack a human's capacity to adapt to sudden changes on the battlefield. This, of course, is the suicide terrorist's ace card. He can switch targets at the last second to maximize destruction, or fine-tune the kill. The Tamil Tigers of Sri Lanka used suicide bombers to get close to, and assassinate, political officials, including former Indian Prime Minister Rajiv Gandhi in 1991.
For optimists, the era of robots and suicide terrorists could allow the United States to land a one-two psychological punch. American automata send a powerful message: step on me and face a relentless wave of robot warriors. Shocked and awestruck, enemies will be left feeling helpless. Meanwhile, the brutality of suicide bombings marginalizes Al Qaeda's cause and helps us win the battle for hearts and minds. Our iron fist, combined with the enemy's fanaticism, leaves only one winner.
Pessimists, however, worry about the optics. The reliance on robots can make the United States appear both overbearing and vulnerable--just the combination to inspire resistance. Goliath bullies David with advanced technology. But Goliath's strength belies a fatal weakness--his craven fear of death.
Rami Khouri, a scholar and editor based in Beirut, described how the Lebanese viewed the Israeli drones in the 2006 war in Lebanon: "the enemy is using machines to fight from afar. Your defiance in the face of it shows your heroism, your humanity...The average person sees it as just another sign of coldhearted, cruel Israelis and Americans, who are also cowards because they send out machines to fight us." America's population is as frightened as the Cowardly Lion from The Wizard of Oz. And its robots are as heartless as the Tin Man. America will not face death, whereas its enemies embrace it. In anti-American circles, the suicide terrorist may look like a brave rebel resisting the evil Galactic Empire.
The rise of robots and suicide terrorists could also make wars more likely. Suicide attacks such as 9/11 are so horrific they provide a powerful casus belli, rallying Americans to fight. And if presidents can respond by unleashing robots rather than citizens, with less fear of flag-laden coffins coming back, they may be even more tempted to grasp the SWORDS.
The billionaire’s bid for the nomination was opposed by many insiders—but his success reveals the ascendance of other elements of the party coalition.
In The Party Decides, an influential book about how presidential nominees are selected, political scientists John Zaller, Hans Noel, David Karol, and Marty Cohen argue that despite reforms designed to wrest control of the process from insiders at smoke-filled nominating conventions, political parties still exert tremendous influence on who makes it to general elections. They do so partly through “invisible primaries,” the authors posited—think of how the Republican establishment coalesced around George W. Bush in 2000, long before any ballots were cast, presenting him as a fait accompli to voters who’d scarcely started to think about the election; or how insider Democrats elevated Hillary Clinton this election cycle.
The comedian's n-bomb at the White House Correspondents’ Dinner highlights a generational shift in black culture.
Georgia McDowell was born the daughter of farmers and teachers in North Carolina in 1902. She was my great-grandmother, and she taught me to read, despite the dementia that clouded her mind and the dyslexia that interrupted mine. I loved Miss Georgia, though she kept as many hard lines in her home as she had in her classrooms. One of the hardest lines was common to many black households: The word “nigger” and all of its derivatives were strict taboos in person, on television, and on radio from any source, black or otherwise, so long as she lived and breathed. She’d kept the taboo through decades of teaching black students and raising black children. For most of my childhood, the taboo was absolute.
The Rhode Island Marine Archaeology Project said it knows the location of British explorer Captain James Cook’s famous vessel.
By the time it was lost, HMS Endeavour had been conscripted into the American Revolutionary War and renamed. Years before, the British explorer Captain James Cook had sailed the 100-foot oak ship on his first voyage to Australia.
After Cook’s first voyage, the Endeavour was largely forgotten. During that voyage, Cook ran the ship aground on a coral reef, so upon its return to England, workers at a dockyard refitted it as a naval transport. Cook got a new ship. Meanwhile the Endeavour delivered provisions to the Falkland Islands in 1772, was sold in 1775, then renamed the Lord Sandwich, and unceremoniously, and purposefully, sunk in 1778 in the shallow waters of a Rhode Island harbor to block advancing French ships that’d come to help the Americans. And there the Lord Sandwich still lies.
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
The Republican front-runner’s repetition of a blatantly ridiculous story about Ted Cruz’s father shows his symbiotic relationship with the press.
Brace yourselves for shock, but Donald Trump said something ridiculous and baseless Tuesday morning. The subject was Rafael Cruz, Cuban-born father of his last remaining primary rival, Senator Ted Cruz.
“His father was with Lee Harvey Oswald prior to Oswald's being—you know, shot. I mean, the whole thing is ridiculous,” Trump said during a phone interview with Fox News. “What is this, right prior to his being shot, and nobody even brings it up. I mean, they don't even talk about that. That was reported, and nobody talks about it.”
Let’s clear a few things up: It has been reported, which is why Trump knows about it, but it was reported in the National Enquirer. Also there is no evidence for it; it’s bogus. Yes, the National Enquirer has been right about some things in the past, most notably John Edwards’s affair; no, that does not prove that it is right about this.
For some, abandoning expensive urban centers would be a huge financial relief.
Neal Gabler has been a formative writer for me: His Winchell: Gossip, Power, and the Culture of Celebrity was one of the books that led me to think about leaving scholarship behind and write nonfiction instead, and Walt Disney: The Triumph of the American Imagination was the first book I reviewed as a freelance writer. To me, he exemplifies the best mix of intensive archival research and narrative kick.
So reading his recent essay, "The Secret Shame of Middle-Class Americans," was a gut punch: First, I learned about a role model of mine whose talent, in my opinion, should preclude him from financial woes. And, then, I was socked by narcissistic outrage: I, too, struggle with money! I, too, am a failing middle-class American! I, too, am a writer of nonfiction who should be better compensated!
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
When Apple announced in 2013 that its next iPhone would include a fingerprint reader, it touted the feature as a leap forward in security. Many people don’t set up a passcode on their phones, Apple SVP Phil Schiller said at the keynote event where the Touch ID sensor was unveiled, but making security easier and faster might convince more users to protect their phones. (Of course, Apple wasn’t the first to stuff a fingerprint reader into a flagship smartphone, but the iPhone’s Touch ID took the feature mainstream.)
The system itself proved quite secure—scanned fingerprints are stored, encrypted, and processed locally rather than being sent to Apple for verification—but the widespread use of fingerprint data to unlock iPhones worried some experts. One of the biggest questions that hung over the transition was legal rather than technical: How might a fingerprint-secured iPhone be treated in a court of law?
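The local-only data flow Apple described can be illustrated with a toy sketch. This is not how Touch ID actually works (real fingerprint matching is a fuzzy template comparison inside a dedicated secure enclave); the sketch simply shows the principle that a credential is enrolled, stored, and checked on the device, with only a salted hash retained and nothing ever transmitted to a server. The function names and the use of a byte string standing in for biometric data are illustrative assumptions.

```python
import hashlib
import hmac
import os

def enroll(secret: bytes) -> tuple[bytes, bytes]:
    """Enroll a credential locally: keep only a salted hash, discard the raw secret."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return salt, digest

def verify(secret: bytes, salt: bytes, digest: bytes) -> bool:
    """Check a candidate credential against the locally stored record.

    Nothing is sent anywhere: the comparison happens entirely on-device,
    using a constant-time check to avoid timing leaks.
    """
    candidate = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = enroll(b"stand-in-for-biometric-template")
print(verify(b"stand-in-for-biometric-template", salt, digest))  # matching credential
print(verify(b"some-other-input", salt, digest))                 # non-matching credential
```

The design choice mirrored in the sketch is the one security researchers praised: because only a salted, one-way digest is stored and verification never leaves the device, there is no central database of raw credentials to steal.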
The new recipe for one of McDonald’s iconic foodstuffs contains no artificial preservatives, the latest symbolic change for an item that has always reflected conflicting priorities about food.
Last week, McDonald’s announced that it is testing out a new recipe for its iconic (and often maligned) Chicken McNugget. The revamped version of the Happy Meal staple will soon be free of artificial preservatives, instead containing lemon-juice solids and rice starch, in what the company calls “a simpler recipe that parents can feel good about while keeping the same great taste they know and love.”
This seems like an important milestone in the evolution of a product that, since its debut in 1983, has reflected the push and pull between American consumers’ interest in taste, value, and nutrition.
A widely accepted account of what brought about the McNugget points to the 1977 release of the first-ever dietary guidelines by the Senate Select Committee on Nutrition and Human Needs. One result was a drop in the consumption of red meat, which left the hamburger-heavy Golden Arches in search of a new item for the changing American palate.