One hundred years ago this month, two intrepid explorers returned from the Arctic and declared that they had reached the North Pole. They had not traveled together, but on competing expeditions, each seeking to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and airborne photographs or other corroborating evidence might be required, as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle gets extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk myth movement sprang up around this young man who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests showing the remains were consistent with Ruess family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information about not just heroes and adventurers, but also issues and events that affect local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding, itself, runs counter to the idea that people continue to hold positions against all evidence because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
In a rare move, rank-and-file GOP lawmakers have joined with Democrats to force a vote on legislation reviving the Export-Import Bank.
It has taken nearly five years and the resignation of a speaker, but moderate Republicans in the House have taken their most aggressive step to undermine the influence of hard-right conservatives in the party.
A group of more than 50 GOP lawmakers joined nearly the entire Democratic caucus to force a vote on legislation reauthorizing the Export-Import Bank, the 80-year-old federal lending agency that shut down when Republican leaders refused to renew its charter. The bipartisan coalition on Friday introduced the bill through a discharge petition, a rarely used procedural mechanism that allows lawmakers to bypass both committees and the leadership to call up legislation signed by a majority of the House. It’s a maneuver that was last executed 13 years ago and only five times in the last eight decades, lawmakers said.
Even in big cities like Tokyo, small children take the subway and run errands by themselves. The reason has a lot to do with group dynamics.
It’s a common sight on Japanese mass transit: Children troop through train cars, singly or in small groups, looking for seats.
They wear knee socks, polished patent-leather shoes, and plaid jumpers, with wide-brimmed hats fastened under the chin and train passes pinned to their backpacks. The kids are as young as 6 or 7, on their way to and from school, and there is nary a guardian in sight.
A popular television show called Hajimete no Otsukai, or My First Errand, features children as young as two or three being sent out to do a task for their family. As they tentatively make their way to the greengrocer or bakery, their progress is secretly filmed by a camera crew. The show has been running for more than 25 years.
“Wanting and not wanting the same thing at the same time is a baseline condition of human consciousness.”
Gary Noesner is a former FBI hostage negotiator. For part of the 51-day standoff outside the Branch Davidian religious compound in Waco, Texas, in 1993, he was the strategic coordinator for negotiations with the compound’s leader, David Koresh. This siege ended in infamous tragedy: The FBI launched a tear-gas attack on the compound, which burned to the ground, killing 76 people inside. But before Noesner was rotated out of his position as the siege’s head negotiator, he and his team secured the release of 35 people.
Jamie Holmes, a Future Tense Fellow at New America, spoke to Noesner for his new book Nonsense: The Power of Not Knowing. “My experience suggests,” Noesner told Holmes, “that in the overwhelming majority of these cases, people are confused and ambivalent. Part of them wants to die, part of them wants to live. Part of them wants to surrender, part of them doesn’t want to surrender.” And good negotiators, Noesner says, are “people who can dwell fairly effectively in the areas of gray, in the uncertainties and ambiguities of life.”
Some of Charles Schulz’s fans blame the cartoon dog for ruining Peanuts. Here’s why they’re wrong.
It really was a dark and stormy night. On February 12, 2000, Charles Schulz—who had single-handedly drawn some 18,000 Peanuts comic strips, who refused to use assistants to ink or letter his comics, who vowed that after he quit, no new Peanuts strips would be made—died, taking to the grave, it seemed, any further adventures of the gang.
Hours later, his last Sunday strip came out with a farewell: “Charlie Brown, Snoopy, Linus, Lucy … How can I ever forget them.” By then, Peanuts was carried by more than 2,600 newspapers in 75 countries and read by some 300 million people. It had been going for five decades. Robert Thompson, a scholar of popular culture, called it “arguably the longest story told by a single artist in human history.”
No defensible moral framework regards foreigners as less deserving of rights than people born in the right place at the right time.
To paraphrase Rousseau, man is born free, yet everywhere he is caged. Barbed wire, concrete walls, and gun-toting guards confine people to the nation-state of their birth. But why? The argument for open borders is both economic and moral. All people should be free to move about the earth, uncaged by the arbitrary lines known as borders.
Not every place in the world is equally well-suited to mass economic activity. Nature’s bounty is divided unevenly. Variations in wealth and income created by these differences are magnified by governments that suppress entrepreneurship and promote religious intolerance, gender discrimination, or other bigotry. Closed borders compound these injustices, cementing inequality into place and sentencing their victims to a life of penury.
Ben Carson is wrong to say armed Jews could have stopped Hitler. But so are those who compare Europe’s refugee crisis to the same period.
How about a pact: If the political right in the United States ceases invoking the Holocaust to justify gun laws that enable the killing of innocents, as Republican presidential candidate Ben Carson did on Thursday, the left quits invoking the Holocaust as justification for migration policies that could make the Europe of the future even less hospitable to its remaining Jews than the Europe of today.
The claim that the Jews of Europe could have stopped the Nazi Holocaust if only they’d possessed more rifles and pistols is a claim based on almost perfect ignorance of the events of 1933 to 1945. The mass murder of European Jews could proceed only after the Nazis had defeated or seized territory from three of the mightiest aggregations of armed force on earth: the armies of France, Poland, and the Soviet Union. The opponents of the Nazis not only possessed rifles and pistols, but also tanks, aircraft, artillery, modern fortifications, and massed infantry. And yes, Jews bore those weapons too: nearly 200,000 in the Polish armed forces, for example.
A six-month investigation found a decade of sexual harassment complaints against famous astronomer Geoff Marcy to be credible.
Geoff Marcy is a superstar astronomer, by any measure. He is a major figure in the exoplanet revolution, which has transformed our view of the universe so profoundly that some have compared it to the revolution kicked off by Copernicus. Many of the first thousand planets observed circling other stars were detected by teams Marcy led. When history books about early 21st-century science are written, Marcy's name will be in them. Indeed, many wondered whether his name might be called earlier this week, when the Nobel prizes were announced.
Instead, Marcy found his way into the news for a different reason. Yesterday, BuzzFeed published details from an investigation conducted by the University of California, Berkeley into repeated complaints that Marcy sexually harassed students:
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
American politicians are now eager to disown a failed criminal-justice system that’s left the U.S. with the largest incarcerated population in the world. But they've failed to reckon with history. Fifty years after Daniel Patrick Moynihan’s report “The Negro Family” tragically helped create this system, it's time to reclaim his original intent.
By his own lights, Daniel Patrick Moynihan, ambassador, senator, sociologist, and itinerant American intellectual, was the product of a broken home and a pathological family. He was born in 1927 in Tulsa, Oklahoma, but raised mostly in New York City. When Moynihan was 10 years old, his father, John, left the family, plunging it into poverty. Moynihan’s mother, Margaret, remarried, had another child, divorced, moved to Indiana to stay with relatives, then returned to New York, where she worked as a nurse. Moynihan’s childhood—a tangle of poverty, remarriage, relocation, and single motherhood—contrasted starkly with the idyllic American family life he would later extol.