In his just-released book The Last Train From Hiroshima, Charles Pellegrino quotes one of the survivors of the Hiroshima and Nagasaki atomic bomb blasts as saying that those who survived were, in general, those who looked after their own safety, instead of reaching out to help others. "Those of us who stayed where we were ... who took refuge in the hills behind the hospital when the fires began to spread and close in, happened to escape alive. In short, those who survived the bomb were ... in a greater or lesser degree selfish, self-centered--guided by instinct and not by civilization. And we know it, we who have survived."
But is survival really selfish and uncivilized? Or is it smart? And is going in to rescue others always heroic? Or is it sometimes just stupid? It's a complex question, because there are so many factors involved, and every survival situation is different.
Self-preservation is supposedly an instinct. So one would think that in life-and-death situations, we'd all be very focused on whatever was necessary to survive. But that's not always true. In July 2007, I was having a drink with a friend in Grand Central Station when an underground steam pipe exploded just outside. From where we sat, we heard a dull "boom!" and then suddenly, people were running, streaming out of the tunnels and out the doors.
My friend and I walked quickly and calmly outside, but to get any farther, we had to push our way through a crowd of people who were staring, transfixed, at the column of smoke rising from the front of the station. Some people were crying, others were screaming, others were on their cell phones ... but the crowd, for the most part, was not doing the one thing that would increase everyone's chances of survival if, in fact, a terrorist bomb with god knows what inside it had just gone off--namely, moving away from the area.
We may have an instinct for survival, but it clearly doesn't always kick in the way it should. A guy who provides survival training for pilots told me once that the number one determining factor for survival is simply whether people hold it together in a crisis or fall apart. And, he said, it's impossible to predict ahead of time who's going to hold it together, and who's going to fall apart.
So what is the responsibility of those who hold it together? I remember reading the account of one woman who was in an airliner that crashed on landing. People were frozen or screaming, but nobody was moving toward the emergency exits, even as smoke began to fill the cabin. After realizing that the people around her were too paralyzed to react, she took direct action, crawling over several rows of people to get to the exit. She got out of the plane and survived. Very few others in the plane, which was soon consumed by smoke and fire, did. And afterward, I remember she said she battled a lot of guilt for saving herself instead of trying to save the others.
Could she really have saved the others? Probably not, and certainly not from the back of the plane. Just like the Hiroshima survivors, if she'd tried, she probably would have perished with them. So why do survivors berate themselves for not adding to the loss by attempting the impossible? Perhaps it's because we get very mixed messages about survival ethics.
On the one hand, we're told to put our own oxygen masks on first, and not to jump in the water with a drowning victim. But then the people who ignore those edicts and survive to tell the tale are lauded as heroes. And people who do the "smart" thing are sometimes criticized quite heavily after the fact.
In a famous mountain-climbing accident chronicled in the book and documentary Touching the Void, climber Simon Yates was attempting to rope his already-injured friend Joe Simpson down a mountain in bad weather when the belay went awry. Simpson was left hanging off a cliff, unable to climb up, and Yates, unable to lift him and losing his own grip on the mountain, ended up cutting the rope to save himself. Miraculously, Simpson survived the 100-foot fall and eventually made his way down the mountain. But Yates was criticized by some for his survival decision, even though the alternative would almost certainly have led to both of their deaths.
In Yates' case, he had time to think hard about the odds, and the possibilities he was facing, and to realize that he couldn't save anyone but himself. But what about people who have to make more instantaneous decisions? If, in fact, survivors are driven by "instinct not civilization," as the Hiroshima survivor put it, how do you explain all those who choose otherwise? Who would dive into icy waters or onto subway tracks or disobey orders to make repeat trips onto a minefield to bring wounded to safety? Are they more civilized than the rest of us? More brave? More noble?
It sounds nice, but oddly enough, most of the people who perform such impulsive rescues say that they didn't really think before acting. Which means they weren't "choosing" civilization over instinct. If survival is an instinct, it seems to me that there must be something equally instinctive that drives us, sometimes, to run into danger instead of away from it.
Perhaps it comes down to the ancient "fight or flight" impulse. Animals confronted with danger will choose to attack it, or run from it, and it's hard to say which one they'll choose, or when. Or maybe humans are such social herd animals, dependent on the herd for survival, that we feel a pull toward others even as we feel a contrary pull toward our own preservation, and the two impulses battle it out within us ... leading to the mixed messages we send each other on which impulse to follow.
Some people hold it together in a crisis and some people fall apart. Some people might run away from danger one day, and toward it the next. We pick up a thousand cues in an instant of crisis and respond in ways that sometimes surprise even us.
But while we laud those who sacrifice themselves in an attempt to save another, there is a fine line between brave and foolish. There can also be a fine line between smart and selfish. And as a friend who's served in the military for 27 years says, the truth is, sometimes there's no line at all between the two.
Also notable about this brazen show of might is that the missiles traveled through two countries, Iran and Iraq, before hitting their 11 targets in Syria. This means that both countries either gave their permission or simply didn’t confront Putin about the use of their airspace on his birthday.
Why Americans increasingly want inexperienced presidential candidates
The presidency, it’s often said, is a job for which everyone arrives unprepared. But just how unprepared is unprepared enough?
Political handicappers weigh presidential candidates’ partisanship, ideology, money, endorsements, consultants, and, of course, experience. Yet they too rarely consider an element of growing importance to voters: freshness. Increasingly, American voters view being qualified for the presidency as a disqualification.
In 2003, I announced the 14-Year Rule in National Journal. The rule was actually discovered by a presidential speechwriter named John McConnell, but because his job required him to keep his name out of print, I graciously stepped up to take credit. It is well known that to be elected president, you pretty much have to have been a governor or a U.S. senator. What McConnell had figured out was this: No one gets elected president who needs longer than 14 years to get from his or her first gubernatorial or Senate victory to either the presidency or the vice presidency.* Surprised, I scoured the history books and found that the rule works astonishingly well going back to the early 20th century, when the modern era of presidential electioneering began.
It leaves people bed-bound and drives some to suicide, but there's little research money devoted to the disease. Now, change is coming, thanks to the patients themselves.
This past July, Brian Vastag, a former science reporter, placed an op-ed with his former employer, the Washington Post. It was an open letter to Francis Collins, the director of the National Institutes of Health and a man Vastag had once used as a source on his beat.
“I’ve been felled by the most forlorn of orphan illnesses,” Vastag wrote. “At 43, my productive life may well be over.”
There was no cure for his disease, known by some as chronic fatigue syndrome, Vastag wrote, and little NIH funding available to search for one. Would Collins step up and change that?
“As the leader of our nation’s medical research enterprise, you have a decision to make,” he wrote. “Do you want the NIH to be part of these solutions, or will the nation’s medical research agency continue to be part of the problem?”
A new report details a black market in nuclear materials.
On Wednesday, the Associated Press published a horrifying report about criminal networks in the former Soviet Union trying to sell “radioactive material to Middle Eastern extremists.” At the center of these cases (the AP has learned of four in the past five years) was a “thriving black market in nuclear materials” in a “tiny and impoverished Eastern European country”: Moldova.
It’s a new iteration of an old problem with a familiar geography. The breakup of the Soviet Union left a superpower’s worth of nuclear weapons scattered across several countries without a superpower’s capacity to keep track of them. When Harvard’s Graham Allison flagged this problem in 1996, he wrote that the collapse of Russia’s “command-and-control society” left nothing secure.
What will happen to digital collections of books, movies, and music when the tech giants fall?
When you purchase a movie from Amazon Instant Video, you’re not buying it, exactly. It’s more like renting indefinitely.
This distinction matters if your notion of “buying” is that you pay for something once and then you get to keep that thing for as long as you want. Increasingly, in the world of digital goods, a purchasing transaction isn’t that simple.
There are two key differences between buying media in a physical format and buying it in a digital one. First, there’s the technical aspect: Maintaining long-term access to a file requires a hard copy of it—that means, for example, downloading a film, not just streaming it from a third party’s server. The second distinction is a bit more complicated, and it has to do with how the law has shaped digital rights in the past 15 years. It helps to think about the experience of a person giving up CDs and using iTunes for music purchases instead.
Somewhere in Europe, a man who goes by the name “Mikro” spends his days and nights targeting Islamic State supporters on Twitter.
In August 2014, a Twitter account affiliated with Anonymous, the hacker-crusader collective, declared “full-scale cyber war” against ISIS: “Welcome to Operation Ice #ISIS, where #Anonymous will do it’s [sic] part in combating #ISIS’s influence in social media and shut them down.”
In July, I traveled to a gloomy European capital city to meet one of the “cyber warriors” behind this operation. Online, he goes by the pseudonym Mikro. He is vigilant, bordering on paranoid, about hiding his actual identity, on account of all the death threats he has received. But a few months after I initiated a relationship with him on Twitter, Mikro allowed me to visit him in the apartment he shares with his girlfriend and two Rottweilers. He works alone from his chaotic living room, using an old, battered computer—not the state-of-the-art setup I had envisaged. On an average day, he told me, he spends up to 16 hours fixed to his sofa. He starts around noon, just after he wakes up, and works late into the night and early morning.
American politicians are now eager to disown a failed criminal-justice system that’s left the U.S. with the largest incarcerated population in the world. But they’ve failed to reckon with history. Fifty years after Daniel Patrick Moynihan’s report “The Negro Family” tragically helped create this system, it’s time to reclaim his original intent.
By his own lights, Daniel Patrick Moynihan, ambassador, senator, sociologist, and itinerant American intellectual, was the product of a broken home and a pathological family. He was born in 1927 in Tulsa, Oklahoma, but raised mostly in New York City. When Moynihan was 10 years old, his father, John, left the family, plunging it into poverty. Moynihan’s mother, Margaret, remarried, had another child, divorced, moved to Indiana to stay with relatives, then returned to New York, where she worked as a nurse. Moynihan’s childhood—a tangle of poverty, remarriage, relocation, and single motherhood—contrasted starkly with the idyllic American family life he would later extol.
Forget the Common Core: Finland’s youngsters are in charge of determining what happens in the classroom.
“The changes to kindergarten make me sick,” a veteran teacher in Arkansas recently admitted to me. “Think about what you did in first grade—that’s what my 5-year-old babies are expected to do.”
The difference between first grade and kindergarten may not seem like much, but what I remember about my first-grade experience in the mid-’90s doesn’t match the kindergarten she described in her email: three and a half hours of daily literacy instruction, an hour and a half of daily math instruction, 20 minutes of daily “physical activity time” (officially banned from being called “recess”), and two 56-question standardized tests in literacy and math—in the fourth week of school.
That Arkansas teacher—who teaches 20 students without an aide—has fought to integrate 30 minutes of “station time” into the literacy block, which includes “blocks, science, magnetic letters, play dough with letter stamps to practice words, books, and storytelling.” But the most controversial area of her classroom isn’t the blocks or the stamps; rather, it’s the “house station with dolls and toy food”—items her district tried to remove last year. The implication was clear: There’s no time for play in kindergarten anymore.
The presumptive successor to John Boehner abruptly ended his bid after determining he could not get the support he needed from conservatives.
Behind Kevin McCarthy’s stunning decision Thursday to end his bid for speaker lay a simple calculation: Even if he could scrape together the 218 votes he needed to win the formal House election later this month, he would begin his term a crippled leader unable to unite a party that he said was “deeply divided.”
The majority leader and presumed successor to John Boehner had been widely expected to win the House GOP’s secret-ballot nomination on Thursday. All he needed was a simple majority of the 247-member caucus, and he easily had the votes over long-shot challengers Jason Chaffetz of Utah and Daniel Webster of Florida, the latter of whom had won the endorsement of the renegade House Freedom Caucus. But even if he’d won on Thursday, McCarthy knew he would still be short of the threshold he needed on the floor, since Democrats would vote as a bloc against him.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.