One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. Not together, but on competing expeditions, each racing to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and airborne photographs or other corroborating evidence might be required, as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle gets extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests consistent with Ruess' surviving family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that shape how we view information about not just heroes and adventurers, but also issues and events that affect local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding, itself, runs counter to the idea that people continue to hold positions against all evidence simply because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Photo Credit: Flickr User Lanz, photolib.noaa.gov, Wikimedia Commons
Ben Stiller’s follow-up to his own comedy classic is a downright bummer, no matter how many celebrity cameos it tries to cram in.
You don’t need to go to the theater to get the full experience of Zoolander 2. Simply get your hands on a copy of the original, watch it, and then yell a bunch of unfunny topical lines every time somebody tells a joke. That’s how it feels to watch Ben Stiller’s sequel to his 2001 spoof of the fashion industry: Zoolander 2 takes pains to reference every successful gag you remember from the original, and then embellish each one in painful—often offensive, almost always outdated—fashion. It’s a film that has no real reason to exist, and it spends its entire running time reaffirming that fact.
The original Zoolander, to be fair, had no business being as funny as it was—it made fun of an industry that already seems to exist in a constant state of self-parody, and much of its humor relied on simple malapropisms and sight gags. But it was hilarious anyway as a candid snapshot of the fizzling-out of ’90s culture. Like almost any zeitgeist comedy, it belonged to a particular moment—and boy, should it have stayed there. With Zoolander 2, Stiller (who directed, co-wrote, and stars) tries to recapture the magic of 2001 by referencing its past glories with increasing desperation, perhaps to avoid the fact that he has nothing new to say about the fashion industry or celebrity culture 15 years later.
Today’s empires are born on the web, and exert tremendous power in the material world.
Mark Zuckerberg hasn’t had the best week.
First, Facebook’s Free Basics platform was effectively banned in India. Then, a high-profile member of Facebook’s board of directors, the venture capitalist Marc Andreessen, sounded off about the decision to his nearly half a million Twitter followers with a stunning comment.
“Anti-colonialism has been economically catastrophic for the Indian people for decades,” Andreessen wrote. “Why stop now?”
After that, the Internet went nuts.
Andreessen deleted his tweet, apologized, and underscored that he is “100 percent opposed to colonialism” and “100 percent in favor of independence and freedom.” Zuckerberg, Facebook’s CEO, followed up with his own Facebook post to say Andreessen’s comment was “deeply upsetting” to him, and not representative of the way he thinks “at all.”
Most people know how to help someone with a cut or a scrape. But what about a panic attack?
Here’s a thought experiment: You’re walking down the street with a friend when your companion falls and gashes her leg on the concrete. It’s bleeding; she’s in pain. It’s clear she’s going to need stitches. What do you do?
This one isn’t exactly a head-scratcher. You'd probably attempt to offer some sort of first-aid assistance until the bleeding stopped, or until she could get to medical help. Maybe you happen to have a Band-Aid on you, or a tissue to help her clean the wound, or a water bottle she can use to rinse it off. Maybe you pick her up and help her hobble towards transportation, or take her where she needs to go.
Here’s a harder one: What if, instead of an injured leg, that same friend has a panic attack?
Einstein’s gravitational waves rest on a genuinely radical idea.
After decades of anticipation, we have directly detected gravitational waves—ripples in spacetime traveling at the speed of light through the universe. Scientists at LIGO (the Laser Interferometer Gravitational-Wave Observatory) have announced that they have measured waves coming from the inspiral of two massive black holes, providing a spectacular confirmation of Albert Einstein’s general theory of relativity, whose hundredth anniversary was celebrated just last year.
Finding gravitational waves indicates that Einstein was (once again) right, and opens a new window onto energetic events occurring around the universe. But there’s a deeper lesson, as well: a reminder of the central importance of locality, an idea that underlies much of modern physics.
The bureau successfully played the long game in both cases.
The story of law enforcement in the Oregon standoff is one of patience.
On the most obvious level, that was reflected in the 41 days that armed militia members occupied the Malheur National Wildlife Refuge near Burns. It took 25 days before the FBI and state police moved to arrest several leaders of the occupation and to barricade the refuge. It took another 15 days before the last of the occupiers walked out, Thursday morning Oregon time.
Each of those cases involved patience as well: Officers massed on Highway 395 didn’t shoot LaVoy Finicum when he tried to ram past a barricade, nearly striking an FBI agent, though when he reached for a gun in his pocket they finally fired. Meanwhile, despite increasingly hysterical behavior from David Fry, the final occupier, officers waited him out until he emerged peacefully.
The revolution that ended the reign of beards occurred on September 30, 331 B.C., as Alexander the Great prepared for a decisive showdown with the Persian emperor for control of Asia. On that day, he ordered his men to shave. Yet from time immemorial in Greek culture, a smooth chin on a grown man had been taken as a sign of effeminacy or degeneracy. What can explain this unprecedented command? When the commander Parmenio asked the reason, according to the ancient historian Plutarch, Alexander replied, “Don’t you know that in battles there is nothing handier to grasp than a beard?” But there is ample cause to doubt Plutarch’s explanation. Stories of beard-pulling in battles were myth rather than history. Plutarch and later historians misunderstood the order because they neglected the most relevant fact, namely that Alexander had dared to do what no self-respecting Greek leader had ever done before: shave his face, likening himself to the demigod Heracles, rendered in painting and sculpture in the immortal splendor of youthful, beardless nudity. Alexander wished above all, as he told his generals before the battle, that each man would see himself as a crucial part of the mission. They would certainly see this more clearly if each of them looked more like their heroic commander.
The country’s growth is slowing. The wrong response might make the problem worse.
An anxious superpower is confounded by a troubled economy. For a generation, its growth has been envied; now that growth is decelerating sharply. For decades, it has shaped and guided its economy via tight control of its banks; now that lever is malfunctioning. For years, it has carefully managed its exchange rate and limited the flow of capital across its borders; now the dam is cracking. To anyone who keeps up with the news, the superpower would seem easy to identify: China. But for those with a long memory, it could just as well be the United States of the Nixon era.
Like China today, the United States of the 1970s experienced an abrupt economic slowdown. Its economy had expanded by 4.4 percent a year, on average, during the go-go ’50s and ’60s, but growth slowed by about one-quarter during the following decade, to 3.2 percent a year. Even though growth of more than 3 percent may sound robust by today’s standards, at the time it felt ghastly. Time magazine lamented in 1974 that “middle-class people are being pushed into such demeaning economies as buying clothes at rummage sales”; a year or so later, its cover asked, “Can Capitalism Survive?” In September 1975, after President Gerald Ford survived two attempts on his life in quick succession, an adviser named Alan Greenspan responded with a memo about the “nihilism, radicalism, and violence” that seemed to grip some Americans. When New York City flirted with bankruptcy, its plight was taken as a symbol of broader moral and cultural decay.
Jim Gilmore joins Chris Christie and Carly Fiorina, and leaves the race after a poor showing in New Hampshire.
Jim Gilmore’s candidacy this year was improbable—but even more improbable was the minor cult of personality that developed around it.
The former Virginia governor never had a chance. Not, like, in the sense of Lindsey Graham, a candidate with national standing but no path to the presidency. More in the George Pataki sense: a guy who had no real business in the race, but was running anyway. Except that Gilmore made Pataki look like a juggernaut. Also, Pataki saw the writing on the wall and had the sense to drop out in late December. Gilmore soldiered on, and ended up as the last of the true long shots to leave.
The result was that Gilmore turned into a sort of folk hero. Not for voters, mind you—he managed only 12 votes in Iowa and 125 in New Hampshire, and his campaign was funded largely by loans from himself. Because of his low support in the polls, Gilmore only made the cut for the very first kids’-table debate in August, and then again for the undercard in late January. Other than that, he was shut out completely.
By mining electronic medical records, scientists show the lasting legacy of prehistoric sex on modern humans’ health.
Modern humans originated in Africa, and started spreading around the world about 60,000 years ago. As they entered Asia and Europe, they encountered other groups of ancient humans that had already settled in these regions, such as Neanderthals. And sometimes, when these groups met, they had sex.
We know about these prehistoric liaisons because they left permanent marks on our genome. Even though Neanderthals are now extinct, every living person outside of Africa can trace between 1 and 5 percent of their DNA back to them. (I am 2.6 percent Neanderthal, if you were wondering, which pales in comparison to my colleague James Fallows at 5 percent.)
This lasting legacy was revealed in 2010 when the complete Neanderthal genome was published. Since then, researchers have been trying to figure out what, if anything, the Neanderthal sequences are doing in our own genome. Are they just passive hitchhikers, or did they bestow important adaptations on early humans? And are they affecting the health of modern ones?
When four American women were murdered during El Salvador’s dirty war, a young U.S. official and his unlikely partner risked their lives to solve the case.
On December 1, 1980, two American Catholic churchwomen—an Ursuline nun and a lay missionary—sat down to dinner with Robert White, the U.S. ambassador to El Salvador. They worked in rural areas ministering to El Salvador’s desperately impoverished peasants, and White admired their commitment and courage. The talk turned to the government’s brutal tactics for fighting the country’s left-wing guerrillas, in a dirty war waged by death squads that dumped bodies in the streets and an army that massacred civilians. The women were alarmed by the incoming Reagan administration’s plans for a closer relationship with the military-led government. Because of a curfew, the women spent the night at the ambassador’s residence. The next day, after breakfast with the ambassador’s wife, they drove to San Salvador’s international airport to pick up two colleagues who were flying back from a conference in Nicaragua. Within hours, all four women would be dead.