One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. Not together, but on competing expeditions, each racing to be the first to the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and aerial photographs or other corroborating evidence might be required as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole until, one day, he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man came within perhaps 100 miles of the Pole.
Yet a full century later, with far more advanced data analysis and evidence in hand, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings: by studying wind patterns in the snow, or observing shadows, or even by compass, though a compass needle becomes extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers with die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of contrary evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk mythology sprang up around this young man who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA test results consistent with Ruess' surviving family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that shape how we view information about not just heroes and adventurers, but also issues and events that bear on local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people hold positions against all evidence simply because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Some researchers believe that the microbiome may play a role in regulating how people think and feel.
By now, the idea that gut bacteria affect a person’s health is not revolutionary. Many people know that these microbes influence digestion, allergies, and metabolism. The trend has become almost commonplace: New books appear regularly detailing precisely which diet will lead to optimum bacterial health.
But these microbes’ reach may extend much further, into the human brain. A growing group of researchers around the world is investigating how the microbiome, as this bacterial ecosystem is known, regulates how people think and feel. Scientists have found evidence that this assemblage—about a thousand different species of bacteria, trillions of cells that together weigh between one and three pounds—could play a crucial role in autism, anxiety, depression, and other disorders.
In the 1970s, a new wave of post-Watergate liberals stopped fighting monopoly power. The result is an increasingly dangerous political system.
It was January 1975, and the Watergate Babies had arrived in Washington looking for blood. The Watergate Babies—as the recently elected Democratic congressmen were known—were young, idealistic liberals who had been swept into office on a promise to clean up government, end the war in Vietnam, and rid the nation’s capital of the kind of corruption and dirty politics the Nixon White House had wrought. Richard Nixon himself had resigned just a few months earlier in August. But the Watergate Babies didn’t just campaign against Nixon; they took on the Democratic establishment, too. Newly elected Representative George Miller of California, then just 29 years old, announced, “We came here to take the Bastille.”
Tom Hanks’s Doug has a lot in common with “Black Jeopardy” contestants—except, of course, for politics.
SNL’s ongoing “Black Jeopardy” series has been, in part, about divisions. In each edition, black American contestants answer Kenan Thompson’s clues with in-jokes, slang, and their shared opinions while an outsider—say, Elizabeth Banks as the living incarnation of Becky, Louis C.K. as a BYU African American Studies professor, or Drake as a black Canadian—just shows their cluelessness.
When Tom Hanks showed up in a “Make America Great Again” hat and bald-eagle shirt to play the contestant “Doug” this weekend, it seemed like the set-up for the ugliest culture clash yet. The 2016 election has been a reminder of the country’s profound racial fault lines, and SNL hasn’t exactly been forgiving toward the Republican nominee on that front: Its version of Trump hasn’t been able to tell black people apart, and it aired a mock ad painting his supporters as white supremacists—which, inarguably, some of them really are.
Just why was Tom Hanks dancing in a black-and-orange suit on Saturday Night Live so funny?
This weekend’s episode of Saturday Night Live offered a mini masterpiece: a gloriously silly Halloween-themed piece revolving around a “Haunted Elevator” ride and its unusual star attraction. Beck Bennett and Kate McKinnon played a couple looking for spooky thrills who instead found something far more bewildering: a pumpkin-suited man who would randomly appear alongside two cheerful skeletons and perform a dance routine. “Who are you?” asked a frustrated Bennett after the man (played by Tom Hanks) appeared for the second time. “I’m David Pumpkins!” came the reply.
McKinnon followed up: “Yeah, and David Pumpkins is … ?”
Why cultures that value interdependence, like Japan, win at being deep
Think of the last piece of big news you got. How did you feel about it? Happy? Sad? Angry? Worried? Excited? Grateful? A little bit of all of the above? Experiencing multiple emotions at once may make it seem like you don’t actually know just how you feel about something—that you’re ambivalent, or indecisive, or wishy-washy. Psychologists would say it just means you’re emotionally complex. And according to a new study published in the Journal of Personality and Social Psychology, emotional complexity varies a lot between countries.
There are two definitions of emotional complexity that researchers tend to use. One is called “emotional dialecticism,” which just means feeling positive and negative emotions at the same time. The other is “emotional differentiation,” which is when someone is able to separate out and describe the discrete emotions they’re feeling.
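To make those two definitions concrete, here is a minimal sketch in Python, using invented diary ratings and deliberately simplified formulas (researchers typically use more careful measures, such as multilevel models and intraclass correlations): dialecticism is approximated as the within-person correlation between positive and negative feelings over time, and differentiation as how distinctly a person rates discrete same-valence emotions.

```python
# Illustrative sketch only: toy data and simplified formulas,
# not the exact measures used in the study described above.
import numpy as np

# Hypothetical diary data: one person's 1-7 ratings across 8 occasions.
positive = np.array([6, 5, 6, 4, 5, 6, 5, 4])  # e.g., "happy"
negative = np.array([4, 4, 5, 3, 4, 5, 4, 3])  # e.g., "sad"

# Emotional dialecticism: do good and bad feelings co-occur?
# A correlation near or above zero suggests frequently mixed emotions.
dialecticism = np.corrcoef(positive, negative)[0, 1]

# Emotional differentiation: are discrete same-valence emotions kept
# distinct? Rows = occasions; columns = negative emotions (e.g., sad,
# angry, anxious). Highly correlated columns mean the person treats
# them interchangeably -- low differentiation.
neg_emotions = np.array([
    [4, 2, 3],
    [5, 2, 4],
    [3, 1, 2],
    [4, 3, 3],
    [5, 2, 5],
])
corrs = np.corrcoef(neg_emotions, rowvar=False)
mean_r = corrs[np.triu_indices_from(corrs, k=1)].mean()
differentiation = 1 - mean_r  # higher = more finely differentiated

print(f"dialecticism (pos/neg correlation): {dialecticism:.2f}")
print(f"differentiation (1 - mean inter-emotion r): {differentiation:.2f}")
```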
Tristan Harris believes Silicon Valley is addicting us to our phones. He’s determined to make it stop.
On a recent evening in San Francisco, Tristan Harris, a former product philosopher at Google, took a name tag from a man in pajamas called “Honey Bear” and wrote down his pseudonym for the night: “Presence.”
Harris had just arrived at Unplug SF, a “digital detox experiment” held in honor of the National Day of Unplugging, and the organizers had banned real names. Also outlawed: clocks, “w-talk” (work talk), and “WMDs” (the planners’ loaded shorthand for wireless mobile devices). Harris, a slight 32-year-old with copper hair and a tidy beard, surrendered his iPhone, a device he considers so addictive that he’s called it “a slot machine in my pocket.” He keeps the background set to an image of Scrabble tiles spelling out the words face down, a reminder of the device’s optimal position.
Trump supporters are convinced Democrats are using “oversampling” to stuff the polls in Hillary Clinton’s favor. But they’re just wrong about statistics.
Late last night, pro-Trump Twitter lit up with excited chatter. Donald Trump is falling fast in the polls, sliding through a month-long decline most statisticians would say is a result of him being, you know, unpopular. But one blogger had another theory: Polling organizations are deliberately interviewing more Democrats to skew the surveys toward Hillary Clinton.
This afternoon, Trump threw his support behind the idea. “When the polls are even, when they leave them alone and do them properly, I’m leading,” he said at a rally in Florida. “But you see these polls where they’re polling Democrats. How’s Trump doing? Oh, he’s down. They’re polling Democrats. The system is corrupt and it’s rigged and it’s broken.”
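It's worth spelling out what "oversampling" actually means in legitimate survey research: pollsters deliberately interview extra respondents from a small subgroup so they can report a readable estimate for that subgroup, then weight those interviews back down so the topline matches the population. Here is a minimal sketch, in Python with invented numbers, of why that down-weighting leaves the overall number unbiased:

```python
# Illustrative sketch with invented numbers: oversampling a subgroup
# does not skew a poll's topline once standard weighting is applied.

# Suppose the population is 30% Group A and 70% Group B, and support
# for a candidate is 80% in A and 40% in B.
pop_share = {"A": 0.30, "B": 0.70}
support = {"A": 0.80, "B": 0.40}

# True population-level support: 0.3*0.8 + 0.7*0.4 = 0.52.
true_support = sum(pop_share[g] * support[g] for g in pop_share)

# The pollster oversamples Group A: 500 of 1,000 interviews (50%),
# far above its 30% population share.
sample_n = {"A": 500, "B": 500}
total_n = sum(sample_n.values())

# The unweighted average is indeed skewed toward Group A's view...
unweighted = sum(sample_n[g] * support[g] for g in sample_n) / total_n

# ...but each respondent is weighted by population share / sample share,
# so Group A's extra interviews count proportionally less.
weights = {g: pop_share[g] / (sample_n[g] / total_n) for g in sample_n}
weighted = (
    sum(sample_n[g] * weights[g] * support[g] for g in sample_n)
    / sum(sample_n[g] * weights[g] for g in sample_n)
)

print(f"true support:        {true_support:.0%}")  # 52%
print(f"unweighted estimate: {unweighted:.0%}")    # 60% (skewed)
print(f"weighted estimate:   {weighted:.0%}")      # 52% (matches truth)
```

The sketch simplifies (every respondent in a group answers at that group's rate), but the point survives real data: oversampling changes the precision of subgroup estimates, not the topline, so long as the weights reflect the population.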
What use is there today for one of the oldest virtues?
As many Americans go about their days, I imagine they have two little angels perched on their shoulders, whispering conflicting messages about happiness and material wealth. One angel is embodied by James Altucher, a minimalist self-help guru recently profiled by The New York Times. Altucher claims to have only 15 possessions, after having unburdened himself a few months ago of 40 garbage bags’ worth of stuff and never looking back. As I read about Altucher, I rolled the numbers 15 and 40 over in my mind, thinking about the belongings in my bedroom and the garbage bags under my kitchen sink.
The other angel is Tyler Brûlé, the editor in chief of the fantastically high-end lifestyle magazine Monocle and a columnist for the Financial Times. He is the sort of writer who tosses off such lines as “I zipped along the autostrada through the Val d’Aosta with the ever-trusty Mario (my Italian driver for the past 20 years) at the wheel” with little regard for how privileged and pretentious he sounds (especially in his superfluous parentheticals). Still, there is something, I’m a little ashamed to say, that I envy about Brûlé’s effortless cosmopolitanism—which, it’s hard to miss, is only made possible by unusual wealth.
Biology textbooks tell us that lichens are alliances between two organisms—a fungus and an alga. They are wrong.
In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.
At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.
Washington's zeal for humanitarian action ebbs and flows. And many are dying as a result.
To revisit the U.N.’s anointing of Aleppo as a World Heritage Site is a haunting exercise. The U.N. celebrated the city’s “13th-century citadel, 12th-century Great Mosque and various 17th-century madrasas, palaces, caravanserais and hammams,” all of which constituted “the city’s cohesive, unique urban fabric.” This Aleppo, after five years of brutal war, is a place now dead and buried.
The war has turned ordinary Syrians into flotsam and jetsam, lost amid forces beyond their control, including the brutal dictator Bashar al-Assad, extremist groups such as the Islamic State, and regional actors like Iran, Hezbollah, Turkey, Saudi Arabia, and Russia. But if the civilians were hoping for Western action to stop the bleeding, they have fallen prey to another set of dynamics they can’t govern or even necessarily understand. Historically, Washington’s zeal for intervention in humanitarian crises follows a cycle. And the Syrians, unfortunately, are dying during the wrong phase.