One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. Not together, but on competing expeditions, each vying to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track showing that the Pole had been reached, and aerial photographs or other corroborating evidence might be required as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings--by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle gets extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests showing the remains to be compatible with Ruess' surviving family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that shape how we view information about not just heroes and adventurers, but also issues and events that affect local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue in the world, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to explain away the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people continue to hold positions against all evidence simply because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and more entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
When healthcare is at its best, hospitals are four-star hotels, and nurses, personal butlers at the ready—at least, that’s how many hospitals seem to interpret a government mandate.
When Department of Health and Human Services administrators decided to base 30 percent of hospitals’ Medicare reimbursement on patient satisfaction survey scores, they likely figured that transparency and accountability would improve healthcare. The Centers for Medicare and Medicaid Services (CMS) officials wrote, rather reasonably, “Delivery of high-quality, patient-centered care requires us to carefully consider the patient’s experience in the hospital inpatient setting.” They probably had no idea that their methods could end up indirectly harming patients.
Where did the Islamic State come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
J.J. Abrams, the director tasked with bringing Star Wars back to the top of the crowded franchise heap, has always been happy to borrow. When he set out to make a new Star Trek and drag that moribund cinematic franchise back into blockbuster territory, he cheerfully swapped in some very familiar visual language to help it over the hill. Early on in the film, James Kirk (Chris Pine), nursing a desire to transcend his farmboy life, rides a motorcycle to see the U.S.S. Enterprise being built at a shipyard, and gazes up at it longingly. Star Wars fans would connect the scene to one at the beginning of the original 1977 film, in which Luke Skywalker wistfully watches the twin suns of his home planet set; Star Trek's producers even called the scene "our Tatooine moment." Abrams has never exactly been a visionary artist, but he's a master of elevating the familiar—a fact made clear in the previews of his new Star Wars film, The Force Awakens.
One of the hazards of being paid to think out loud is that most ideas are wrong, and some of those wrong ideas are bound to be yours.
Several years ago, I wrote a column with Jordan Weissmann, now the senior business and economics correspondent for Slate, about how young people, gutted by the Great Recession, might turn against the culture of suburban homes and cars, the two big-ticket items that have powered the country through previous recessions. For many years, my chief frustration with the article was that the only words that commenters seemed to read were also the only three words we didn't write: "The Cheapest Generation," which was the headline. But this week, I have another frustration with the article, which is that, inconveniently, reality is messing with our prediction.
In the shower I share with my three roommates in my apartment in Mexico City, there are all the things you’d expect to see: a few bottles of Body Shop-brand shampoos and conditioners, and a bar of soap—the organic-looking brown kind with tiny splinters of unrefined material protruding from the surface. But there are also two bottles of Lactacyd, a brand of feminine wash.
“You should use it,” my roommate tells me. She’s an astute, outspoken woman in her early 30s who works as a journalist for one of Mexico’s most well-known liberal magazines. “It’s meant to get rid of or prevent infections,” she says.
For more than half a century, douches, or feminine washes, have been a staple in pharmacies throughout the world. Yet here in the Distrito Federal, douching is a trend that seems to have gained serious momentum in the last two years, according to Karla Font, a Distrito Federal-based gynecologist with many patients who actively douche. A worker at Farmacia Paris in the Historic District told me that the store sells at least 30 bottles every day.
“Oh my God, can you grab him?” I shouted at the woman at the door, as my 3-month-old puppy darted out into the cold and I tried to stop my 6-year-old twins from following suit. She obliged, and I was able to get a proper look at her. It was in the 30s outside, unseasonably cold for Florida, and the young woman holding my squirming puppy was wearing nothing but a light spring sweater, shivering and looking miserable. I invited her in.
Over a cup of coffee, she introduced herself as Tysharia Young and tried to do what she’d come to do: sell me overpriced magazine subscriptions. It was not the first time someone had knocked on my door for this purpose, and I was sure it wouldn’t be the last. Gainesville has had such issues with magazine sellers that our local police department recently issued a public warning.
The video, 25 years later, is almost as recognizable as the song itself, even though it conjures up images of soft-focus karaoke backing tracks and a million drunken vocal renditions of heartbreak. The camera scans over a road flanked on either side by tall trees, while a figure clad in black walks across the screen. Then there’s a misty shot of a bridge, a couple of pigeons flap their wings, and Sinéad O’Connor’s face comes into focus: shorn, oval-eyed, seemingly disembodied, and completely indelible.
“It’s been seven hours,” she sings, “and fifteen days/ Since you took your love away.” Beneath her vocals, there’s just the sound of a single synthesized string note, before the drum track kicks in on the seventh line, just as O’Connor’s voice becomes an unmistakably Gaelic wail: “I can eat my dinner in a fancy restaurant/ But nothing/ I said nothing can take away these blues.”
And America? The land that gave the world the iPhone, the Declaration of Independence, and the Kinsey Report prefers emoji that depict technology, royalty, and… eggplants.
These preferences were revealed in a new report from SwiftKey, a software company that makes keyboards for iOS and Android phones. The report describes global trends in emoji usage and breaks them out by country and by language. Like nations themselves, it seems, emoji usage is also shaped by culture, climate, and geography.
What else did the report find? According to SwiftKey:
The most-used category of emoji is “happy faces.” Happy faces, sad faces, and hearts together make up more than 70 percent of global emoji usage.
Pope Francis is widely believed to be a cool Pope—a huggable, Upworthyish, meme-ready, self-deprecating leader for a new generation of worshippers. “He has described himself as a sinner,” writes Archbishop Desmond Tutu in Pope Francis’ entry on Time’s list of the 100 most influential people in the world, “and his nonjudgmental views on … issues such as sexual orientation and divorce have brought hope to millions of Roman Catholics around the world.”
But there’s one issue that can make even Cool Pope Francis himself sound a little, well, judgy. “A society with a greedy generation, that doesn’t want to surround itself with children, that considers them above all worrisome, a weight, a risk, is a depressed society,” the pontiff told an audience in St. Peter’s Square earlier this year. “The choice not to have children is selfish. Life rejuvenates and acquires energy when it multiplies: It is enriched, not impoverished.”
CHELSEA, Mass.—The woman Barry Berman saw sitting in the dining room of the nursing home was not his mother.
Or, at least, she was his mother, but she looked nothing like herself. His mother was vivacious, or she had been until she was felled by a massive stroke and then pneumonia, after which he’d moved her into a nursing home to recuperate. He knew he could trust the nursing home, since he ran it, and he knew it was lauded for the efficiency with which it served residents. But when he went to look for his mother a day or two after he moved her in, he barely recognized her.
“I’ll never forget the feeling as long as I live,” he told me. “I said, ‘Oh my God, there’s my mother, this old woman, in a wheelchair, lifeless. Look what my own nursing home did to my own mother in a matter of days.’”