In his just-released book The Last Train From Hiroshima, Charles Pellegrino quotes one of the survivors of the Hiroshima and Nagasaki atomic bomb blasts as saying that those who survived were, in general, those who looked after their own safety, instead of reaching out to help others. "Those of us who stayed where we were ... who took refuge in the hills behind the hospital when the fires began to spread and close in, happened to escape alive. In short, those who survived the bomb were ... in a greater or lesser degree selfish, self-centered--guided by instinct and not by civilization. And we know it, we who have survived."
But is survival really selfish and uncivilized? Or is it smart? And is going in to rescue others always heroic? Or is it sometimes just stupid? It's a complex question, because there are so many factors involved, and every survival situation is different.
Self-preservation is supposedly an instinct. So one would think that in life-and-death situations, we'd all be very focused on whatever was necessary to survive. But that's not always true. In July 2007, I was having a drink with a friend in Grand Central Station when an underground steam pipe exploded just outside. From where we sat, we heard a dull "boom!" and then suddenly, people were running, streaming out of the tunnels and out the doors.
My friend and I walked quickly and calmly outside, but to get any farther, we had to push our way through a crowd of people who were staring, transfixed, at the column of smoke rising from the front of the station. Some people were crying, others were screaming, others were on their cell phones...but the crowd, for the most part, was not doing the one thing that would increase everyone's chances of survival, if in fact a terrorist bomb with god knows what inside it had just gone off--namely, moving away from the area.
We may have an instinct for survival, but it clearly doesn't always kick in the way it should. A guy who provides survival training for pilots told me once that the number one determining factor for survival is simply whether people hold it together in a crisis or fall apart. And, he said, it's impossible to predict ahead of time who's going to hold it together, and who's going to fall apart.
So what is the responsibility of those who hold it together? I remember reading the account of one woman who was in an airliner that crashed on landing. People were frozen or screaming, but nobody was moving toward the emergency exits, even as smoke began to fill the cabin. After realizing that the people around her were too paralyzed to react, she took direct action, crawling over several rows of people to get to the exit. She got out of the plane and survived. Very few others in the plane, which was soon consumed by smoke and fire, did. And afterward, I remember she said she battled a lot of guilt for saving herself instead of trying to save the others.
Could she really have saved the others? Probably not, and certainly not from the back of the plane. Just like the Hiroshima survivors, if she'd tried, she probably would have perished with them. So why do survivors berate themselves for not adding to the loss by attempting the impossible? Perhaps it's because we get very mixed messages about survival ethics.
On the one hand, we're told to put our own oxygen masks on first, and not to jump in the water with a drowning victim. But then the people who ignore those edicts and survive to tell the tale are lauded as heroes. And people who do the "smart" thing are sometimes criticized quite heavily after the fact.
In a famous mountain-climbing accident chronicled in the book and documentary Touching the Void, climber Simon Yates was attempting to rope his already-injured friend Joe Simpson down a mountain in bad weather when the belay went awry. Simpson ended up hanging off a cliff, unable to climb up, and Yates, unable to lift him and losing his own grip on the mountain, ended up cutting the rope to save himself. Miraculously, Simpson survived the 100-foot fall and eventually made his way down the mountain. But Yates was criticized by some for his survival decision, even though the alternative would almost certainly have led to both of their deaths.
In Yates' case, he had time to think hard about the odds, and the possibilities he was facing, and to realize that he couldn't save anyone but himself. But what about people who have to make more instantaneous decisions? If, in fact, survivors are driven by "instinct not civilization," as the Hiroshima survivor put it, how do you explain all those who choose otherwise? Who would dive into icy waters or onto subway tracks or disobey orders to make repeat trips onto a minefield to bring wounded to safety? Are they more civilized than the rest of us? More brave? More noble?
It sounds nice, but oddly enough, most of the people who perform such impulsive rescues say that they didn't really think before acting. Which means they weren't "choosing" civilization over instinct. If survival is an instinct, it seems to me that there must be something equally instinctive that drives us, sometimes, to run into danger instead of away from it.
Perhaps it comes down to the ancient "fight or flight" impulse. Animals confronted with danger will choose to attack it, or run from it, and it's hard to say which one they'll choose, or when. Or maybe humans are such social herd animals, dependent on the herd for survival, that we feel a pull toward others even as we feel a contrary pull toward our own preservation, and the two impulses battle it out within us ... leading to the mixed messages we send each other on which impulse to follow.
Some people hold it together in a crisis and some people fall apart. Some people might run away from danger one day, and toward it the next. We pick up a thousand cues in an instant of crisis and respond in ways that even surprise ourselves, sometimes.
But while we laud those who sacrifice themselves in an attempt to save another, there is a fine line between brave and foolish. There can also be a fine line between smart and selfish. And as a friend who's served in the military for 27 years says, the truth is, sometimes there's no line at all between the two.
The long-running cartoon’s representation of Judaism was one of the first on television.
Growing up in south London, and then in the largely Catholic town of Manhasset on Long Island, I didn’t encounter many families who looked, sounded, or behaved like mine. In England, my experiences were limited to my mother’s family, who were all Orthodox Jews, strictly observing the Sabbath and keeping kosher, and to the families of my classmates, who were invariably gentiles. In Manhasset, I didn’t even have the Orthodox to relate to. So one of my main comforts in both places came from the Pickles family, who—with its big-haired, neurotic, doting mother and its old-world, Yiddish-mumbling grandparents—instantly made me feel at home. It also helped that I could spend time with the Pickles family whenever I wanted; after all, they were on TV.
It’s a paradox: Shouldn’t the most accomplished be well equipped to make choices that maximize life satisfaction?
There are three things, once one’s basic needs are satisfied, that academic literature points to as the ingredients for happiness: having meaningful social relationships, being good at whatever it is one spends one’s days doing, and having the freedom to make life decisions independently.
But research into happiness has also yielded something a little less obvious: Being better educated, richer, or more accomplished doesn’t do much to predict whether someone will be happy. In fact, it might mean someone is less likely to be satisfied with life.
That second finding is the puzzle that Raj Raghunathan, a professor of marketing at The University of Texas at Austin’s McCombs School of Business, tries to make sense of in his recent book, If You’re So Smart, Why Aren’t You Happy? Raghunathan’s writing does fall under the category of self-help (with all of the pep talks and progress worksheets that that entails), but his commitment to scientific research serves as ballast for the genre’s more glib tendencies.
A professor of cognitive science argues that the world is nothing like the one we experience through our senses.
As we go about our daily lives, we tend to assume that our perceptions—sights, sounds, textures, tastes—are an accurate portrayal of the real world. Sure, when we stop and think about it—or when we find ourselves fooled by a perceptual illusion—we realize with a jolt that what we perceive is never the world directly, but rather our brain’s best guess at what that world is like, a kind of internal simulation of an external reality. Still, we bank on the fact that our simulation is a reasonably decent one. If it weren’t, wouldn’t evolution have weeded us out by now? The true reality might be forever beyond our reach, but surely our senses give us at least an inkling of what it’s really like.
The president’s unique approach to the White House Correspondents’ Dinner will surely be missed.
No U.S. President has been a better comedian than Barack Obama. It’s really that simple.
Now that doesn’t mean that some modern-day presidents couldn’t tell a joke. John F. Kennedy, Ronald Reagan, and Bill Clinton excelled at it. But Obama has transformed the way presidents use comedy—not just engaging in self-deprecation or playfully teasing his rivals, but turning his barbed wit on his opponents.
He puts that approach on display every year at the White House Correspondents’ Dinner. This annual tradition, which began in 1921 when 50 journalists (all men) gathered in Washington, D.C., has become a showcase for each president’s comedy chops. Some presidents have been bad, some have been good. Obama has been the best. He’s truly the killer comedian in chief.
The U.S. president talks through his hardest decisions about America’s role in the world.
Friday, August 30, 2013, the day the feckless Barack Obama brought to a premature end America’s reign as the world’s sole indispensable superpower—or, alternatively, the day the sagacious Barack Obama peered into the Middle Eastern abyss and stepped back from the consuming void—began with a thundering speech given on Obama’s behalf by his secretary of state, John Kerry, in Washington, D.C. The subject of Kerry’s uncharacteristically Churchillian remarks, delivered in the Treaty Room at the State Department, was the gassing of civilians by the president of Syria, Bashar al-Assad.
“A typical person is more than five times as likely to die in an extinction event as in a car crash,” says a new report.
Nuclear war. Climate change. Pandemics that kill tens of millions.
These are the most viable threats to globally organized civilization. They’re the stuff of nightmares and blockbusters—but unlike sea monsters or zombie viruses, they’re real, part of the calculus that political leaders consider every day. And according to a new report from the U.K.-based Global Challenges Foundation, they’re much more likely than we might think.
In its annual report on “global catastrophic risk,” the nonprofit debuted a startling statistic: Across the span of their lives, the average American is more than five times likelier to die during a human-extinction event than in a car crash.
Partly that’s because the average person will probably not die in an automobile accident. Every year, one in 9,395 people dies in a crash; that translates to about a 0.01 percent chance per year. But that chance compounds over the course of a lifetime. Over a lifetime, about one in 120 Americans dies in a crash.
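The compounding arithmetic can be checked directly. A minimal sketch, assuming a lifespan of roughly 78.7 years (a figure close to U.S. life expectancy; the report's exact time horizon is an assumption here, not stated in the text):

```python
# Lifetime risk from a small annual risk, compounded over a lifespan.
# The 78.7-year lifespan is an assumption for illustration.
annual_risk = 1 / 9395      # chance of dying in a crash in any one year
lifespan_years = 78.7       # assumed average lifespan

# Probability of dying in a crash at some point in life:
# 1 minus the probability of surviving every single year.
lifetime_risk = 1 - (1 - annual_risk) ** lifespan_years

print(f"Lifetime risk: 1 in {1 / lifetime_risk:.0f}")
```

The result lands at roughly 1 in 120, matching the figure in the text; because the annual risk is tiny, the compounded result is also close to the simple approximation of annual risk times years.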
Nearly half of Americans would have trouble finding $400 to pay for an emergency. I’m one of them.
Since 2013, the Federal Reserve Board has conducted a survey to “monitor the financial and economic status of American consumers.” Most of the data in the latest survey, frankly, are less than earth-shattering: 49 percent of part-time workers would prefer to work more hours at their current wage; 29 percent of Americans expect to earn a higher income in the coming year; 43 percent of homeowners who have owned their home for at least a year believe its value has increased. But the answer to one question was astonishing. The Fed asked respondents how they would pay for a $400 emergency. The answer: 47 percent of respondents said that either they would cover the expense by borrowing or selling something, or they would not be able to come up with the $400 at all. Four hundred dollars! Who knew?
One Nashville pastor has a plan to help those without permanent shelter: building 60-square-foot houses with no bathroom, kitchen, or electricity.
NASHVILLE—Around the time that Vanderbilt University released the results of a large-scale study outlining the most effective solutions to homelessness, Pastor Jeff Obafemi Carr was moving into a 60-square-foot house with no bathroom, kitchen, or even a sink. Carr’s idea was to temporarily leave behind his wife and five kids to live in the tiny house, which looks like a tool shed, to raise $50,000 to build more such homes for the homeless.
After two months living in the home, Carr had raised $66,967—enough to build six. The buildings are now set up, on wheels, in the backyard of the Green Street Church on Nashville’s east side, part of a sanctuary that also houses homeless people living in tents who moved from an encampment in one of Nashville’s parks that recently closed.
Two scholars discuss the ups and downs of life as a right-leaning professor.
“I don’t think I can say it too strongly, but literally it just changed my life,” said a scholar, about reading the work of Ayn Rand. “It was like this awakening for me.”
Different versions of this comment appear throughout Jon A. Shields and Joshua M. Dunn Sr.’s book on conservative professors, Passing on the Right, usually about people like Milton Friedman and John Stuart Mill and Friedrich Hayek. The scholars they interviewed speak in a dreamy way about these nerdy celebrities, perhaps imagining an alternate academic universe—one where social scientists can be freely conservative.
The assumption that most college campuses lean left is so widespread in American culture that it has almost become a caricature: intellectuals in thick-rimmed glasses preaching Marxism on idyllic grassy quads; students protesting minor infractions against political correctness; raging professors trying to prove that God is, in fact, dead. Studies about professors’ political beliefs and voting behavior suggest this assumption is at least somewhat correct. But Shields and Dunn set out to investigate a more nuanced question: For the minority of professors who are cultural and political conservatives, what’s life actually like?
...isn't something that can be done on campus. It's an internship.
When I was 17, if you asked me how I planned on getting a job in the future, I think I would have said: Get into the right college. When I was 18, if you asked me the same question, I would have said: Get into the right classes. When I was 19: Get good grades.
But when employers recently named the most important elements in hiring a recent graduate, college reputation, GPA, and courses came in at the bottom of the list. At the top, according to the Chronicle of Higher Education, were experiences outside of academics: internships, jobs, volunteering, and extracurriculars.
What Employers Want
"When employers do hire from college, the evidence suggests that academic skills are not their primary concern," says Peter Cappelli, a Wharton professor and the author of a new paper on job skills. "Work experience is the crucial attribute that employers want even for students who have yet to work full-time."