One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. Not together, but on competing expeditions, each racing to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and airborne photographs or other corroborating evidence might be required, as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century--and much more advanced data analysis and evidence--later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings: by studying wind patterns in the snow, or observing shadows, or even by compass, although a compass needle becomes extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers whose die-hard believers have clung to a set vision of their heroes' lives despite the emergence of contrary evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests consistent with Ruess family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that shape local and national policy and action.
How is it that people can cling to an opinion or view of a person, an event, or an issue in the world, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC-Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to explain away the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to a recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people hold positions contrary to all the evidence simply because of misinformation or a lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the subjects' need to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Orr: “Sometimes a thing happens. Splits your life. There’s a before and after. I got like five of them at this point.”
This was Frank offering a pep talk to the son of his murdered former henchman Stan in tonight’s episode. (More on this in a moment.) But it’s also a line that captures this season of True Detective so perfectly that it almost seems like a form of subliminal self-critique.
Remember when Ray got shot in episode two and appeared to be dead, but came back with a renewed sense of purpose and stopped drinking? No? That’s okay. Neither does the show: It was essentially forgotten after the subsequent episode. Remember when half a dozen (or more) Vinci cops were killed in a bloody shootout, along with dozen(s?) of civilians? No? Fine: True Detective’s left that behind, too. Unless I missed it, there was not a single mention of this nationally historic bloodbath tonight.
How a radical epilepsy treatment in the early 20th century paved the way for modern-day understandings of perception, consciousness, and the self
In 1939, a group of 10 people between the ages of 10 and 43, all with epilepsy, traveled to the University of Rochester Medical Center, where they would become the first people to undergo a radical new surgery.
The patients were there because they all struggled with violent and uncontrollable seizures. The procedure they were about to have was untested on humans, but they were desperate—none of the standard drug therapies for seizures had worked.
Between February and May of 1939, their surgeon William Van Wagenen, Rochester’s chief of neurosurgery, opened up each patient’s skull and cut through the corpus callosum, the part of the brain that connects the left hemisphere to the right and is responsible for the transfer of information between them. It was a dramatic move: By slicing through the bundle of neurons connecting the two hemispheres, Van Wagenen was cutting the left half of the brain away from the right, halting all communication between the two.
Educators seldom have enough time to do their jobs. What’s that doing to the state of learning?
It’s common knowledge that teachers today are stressed, that they feel underappreciated, disrespected, and disillusioned. It’s no wonder they’re ditching the classroom at such high rates—to the point where states from Indiana to Arizona to Kansas are dealing with teacher shortages. Meanwhile, the number of American students who go into teaching is steadily dropping.
A recent survey conducted jointly by the American Federation of Teachers and the Badass Teachers Association asked educators about the quality of their work life, and it got some pretty harrowing feedback. Just 15 percent of the 30,000 respondents, for example, strongly agreed that they’re enthusiastic about the profession. Compare that to the roughly 90 percent who strongly agreed that they were enthusiastic when they started their careers, and it’s clear that something about schools has changed and is pushing teachers away. Roughly three in four respondents said they “often” feel stressed by their jobs.
Has the Obama administration’s pursuit of new beginnings blinded it to enduring enmities?
“The president said many times he’s willing to step out of the rut of history.” In this way Ben Rhodes of the White House, who over the years has broken new ground in the grandiosity of presidential apologetics, described the courage of Barack Obama in concluding the Joint Comprehensive Plan of Action with the Islamic Republic of Iran, otherwise known as the Iran deal. Once again Rhodes has, perhaps inadvertently, exposed the president’s premises more clearly than the president likes to do. The rut of history: It is a phrase worth pondering. It expresses a deep scorn for the past, a zeal for newness and rupture, an arrogance about old struggles and old accomplishments, a hastiness with inherited precedents and circumstances, a superstition about the magical powers of the present. It expresses also a generational view of history, which, like the view of history in terms of decades and centuries, is one of the shallowest views of all.
A controversial treatment shows promise, especially for victims of trauma.
It’s straight out of a cartoon about hypnosis: A black-cloaked charlatan swings a pendulum in front of a patient, who dutifully watches and ping-pongs his eyes in turn. (This might be chased with the intonation, “You are getting sleeeeeepy...”)
Unlike most stereotypical images of mind alteration—“Psychiatric help, 5 cents,” anyone?—this one is real. An obscure type of therapy known as EMDR, or Eye Movement Desensitization and Reprocessing, is gaining ground as a potential treatment for people who have experienced severe forms of trauma.
Here’s the idea: The person is told to focus on the troubling image or negative thought while simultaneously moving his or her eyes back and forth. To prompt this, the therapist might move his fingers from side to side, or use a tapping motion or the waving of a wand. The patient is then told to let his or her mind go blank and notice whatever sensations come to mind. These steps are repeated throughout the session.
Companies that overvalue alpha-male behavior need to change—both to retain female talent and for the bottom line.
When it comes to gender equality in the workplace, the research on its economic benefits is clear: Equality can boost profits and enhance reputation. And then there’s the fact that it’s simply more fair. But progress for women in the workplace has so far been inadequate: Women are woefully underrepresented in executive positions, the pay gap persists, and the motherhood penalty is very real.
Barbara Annis is the founder of the Gender Intelligence Group, a consultancy that works with executives at major firms (including Deloitte, American Express, BMO Financial Group, and eBay) to create strategies to transform their work cultures into ones that are friendly to both men and women.
I recently spoke with Annis about her work and the challenges to achieving gender parity. The following transcript of our conversation has been edited for clarity.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Exceptional nonfiction stories from 2014 that are still worth encountering today
Each year, I keep a running list of exceptional nonfiction that I encounter as I publish The Best of Journalism, an email newsletter that I send out once or twice a week. This is my annual attempt to bring some of those stories to a wider audience. I could not read or note every worthy article published last calendar year, and I haven't included any paywalled articles or anything published at The Atlantic. But everything that follows is worthy of wider attention and engagement.
Millions of workers now go it alone—who will provide them with basic labor protections?
When Sara Horowitz founded the Freelancers Union in 1995, there was already evidence that the structure of people's work lives was changing.
Publishing and media jobs had started to move to more project-based work. Horowitz, a union organizer and labor lawyer by training, assumed that other industries would follow. As an expert in labor unions, she thought “it was really important to start thinking about how people [can] come together” to change laws and public policy, so that freelancers can obtain job-related “benefits—and community.”
Today, the Brooklyn-based Freelancers Union boasts nearly 300,000 members, having quadrupled in number in just seven years. Freelancers in the union include technology consultants, copywriters, web designers, visual artists, business-development consultants, journalists, and professional coaches. They live all over the country, with concentrations in New York, New Jersey, and California.
What Westerners migrating to ISIS have in common with Westerners who sympathized with communism
In Political Pilgrims, the sociologist Paul Hollander exposes and excoriates the mentality of a certain kind of Western intellectual, who, such is the depth of his estrangement or alienation from his own society, is predisposed to extend sympathy to virtually any opposing political system.
The book is about the travels of 20th-century Western intellectuals to the Soviet Union, China, and Cuba, and how these political travelers were able to find in such repressive countries a model of “the good society” in which they could invest their brightest hopes. Hollander documents in relentless and mortifying detail how this utopian impulse, driven by a deep discontent with their own societies, led them to deny or excuse the myriad moral defects of the places they visited.