One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. Not together, but on competing expeditions, each racing to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track showing that the Pole had been reached, and aerial photographs or other corroborating evidence might be required as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century--and much more advanced data analysis and evidence--later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings: by studying wind patterns in the snow, by observing shadows, or even by compass, though a compass needle becomes extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests showing the remains were consistent with Ruess' family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that shape local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people hold positions contrary to all the evidence simply because they are misinformed or lack access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and more entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Orr: “It’s a pleasure to meet you, Your Grace. My name is Tyrion Lannister.”
At last! I know I speak for quite a few book readers when I say that pretty much the only thing that kept me going through the eleventy thousand discursive, digressive pages of George R. R. Martin’s fifth tome, A Dance With Dragons, was the promise of Tyrion finally meeting up with Daenerys Targaryen. And, of course, after eleventy thousand pages, it never happened. So on behalf of myself and everyone else who sacrificed sleep, work, family, and friends waiting for this moment, let me say thank you, David Benioff and D. B. Weiss. Bonus points for what seemed to be a cameo by Strong Belwas (a book character who was written out of the show) as the nameless fighter who freed Tyrion from his chains.
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Some fans are complaining that Zack Snyder’s envisioning of the Man of Steel is too grim—but it’s less a departure than a return to the superhero’s roots.
Since the official teaser trailer for Batman v Superman: Dawn of Justice debuted online in April, fans and critics alike have been discussing the kind of Superman Zack Snyder is going to depict in his Man of Steel sequel. The controversy stems from Snyder's decision to cast Superman as a brooding, Dark Knight-like character who cares more about beating up bad guys than saving people. The choice has proved divisive among Superman fans: some love the new incarnation, praising him as an edgier, more realistic version of the character.
But Snyder’s is a different Superman from the one fans grew up with, and many have no problem expressing their outrage over it. Even Mark Waid, the author of Superman: Birthright (one of the comics the original film is based on), voiced his concern about Man of Steel’s turn toward bleakness when it came out in 2013.
The country’s political dysfunction has undermined all efforts to build an effective fighting force.
The Obama Administration has run out of patience with Iraq’s Army. On Sunday, Secretary of Defense Ashton Carter appeared on CNN’s “State of the Union” to discuss the recent fall of Ramadi, one of Iraq’s major cities, to ISIS. Despite possessing substantial advantages in both numbers and equipment, he said, the Iraqi military was unable to prevent ISIS forces from capturing the city.
“That says to me and, I think, to most of us, that we have an issue with the will of the Iraqis to fight ISIL and defend themselves.”
Carter’s frustrations are shared by his boss. When asked about the war against ISIS in a recent interview with The Atlantic’s Jeffrey Goldberg, President Obama said that “if the Iraqis are not willing to fight for the security of their country, then we cannot do it for them.”
Changing neighborhoods may be a class issue, but in America, that means it's also a racial one.
Ask city-dwellers to describe what, precisely, gentrification is, and you’ll get an array of answers. The term is a murky one, used to describe the many different ways through which money and development enter poorer or less developed neighborhoods, changing them both economically and demographically.
For some, gentrification and gentrifiers are inherently bad—pushing out residents who are often older, poorer, and darker than the neighborhood’s new occupants. For others, a new group of inhabitants brings the possibility of things residents have long hoped for: better grocery stores, new retail, renovations, and an overall revitalization that often eludes low-income neighborhoods.
Rebel groups that employ terror in civil wars seldom win or gain concessions—but they tend to prolong conflicts, a new paper finds.
Nearly 14 years into the war on terror, reminders of it are all around us, from Memorial Day tributes to the victims of the wars in Iraq and Afghanistan to the raging congressional debate over reauthorizing the Patriot Act.
Yet some of the most basic information about terrorism remains surprisingly elusive. For example: Does it work?
There have been some attempts at answering the question, but many of them are either largely anecdotal or geographically constrained. Other studies have focused on international terror. But as political scientist Page Fortna of Columbia University notes, the vast majority of terrorism isn’t transnational—it’s localized, used in the context of civil wars and fights for territorial control. Many of the intractable conflicts the U.S. is involved in today fit this definition: the fighting between ISIS, Jabhat al-Nusra, and other groups in Iraq and Syria; the Boko Haram insurgency in Nigeria; al-Shabab’s terrorism in Somalia and Kenya; Yemen’s civil war; the Israeli-Palestinian conflict. Is terrorism an effective tool when used in those conflicts?
In an interview, the U.S. president ties his legacy to a pact with Tehran, argues ISIS is not winning, warns Saudi Arabia not to pursue a nuclear-weapons program, and anguishes about Israel.
On Tuesday afternoon, as President Obama was bringing an occasionally contentious but often illuminating hour-long conversation about the Middle East to an end, I brought up a persistent worry. “A majority of American Jews want to support the Iran deal,” I said, “but a lot of people are anxiety-ridden about this, as am I.” Like many Jews—and also, by the way, many non-Jews—I believe that it is prudent to keep nuclear weapons out of the hands of anti-Semitic regimes. Obama, who earlier in the discussion had explicitly labeled the supreme leader of Iran, Ayatollah Ali Khamenei, an anti-Semite, responded with an argument I had not heard him make before.
“Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said, referring to the apparently almost-finished nuclear agreement between Iran and a group of world powers led by the United States. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
Steven Spielberg's D-Day epic is a brutal, unpatriotic portrait of war—except for the notoriously sappy prologue and epilogue. What was the film really trying to say?
When it was released 16 years ago, I didn't get it.
I knew Steven Spielberg's Saving Private Ryan was supposed to be a masterpiece. The best-known film critics in the country said so. Janet Maslin, for example, hailed it as "the finest war movie of our time." The film and its director both won Golden Globes, Spielberg received an Academy Award for directing, and more than 60 critics named Saving Private Ryan the best picture of the year.
The most serious students of the Second World War shared the enthusiasm for the film. Historian Stephen Ambrose, author of D-Day and Citizen Soldiers, thought it "the finest World War II movie ever made." The Secretary of the Army presented the filmmaker with the military's highest civilian decoration, the Distinguished Civilian Service Award. The New York Times even devoted a respectful editorial to "Spielberg's War."