One hundred years ago this month, two intrepid explorers returned from the Arctic and declared that they had reached the North Pole. Not together, but at the head of competing expeditions, each racing to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and airborne photographs or other corroborating evidence might be required, as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator with him nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle gets extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests showing the remains were compatible with other family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that shape local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people continue to hold positions against all evidence simply because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly-held opinions ... the truth might set us free.
Photo Credit: Flickr User Lanz, photolib.noaa.gov, Wikimedia Commons
A new report details a black market in nuclear materials.
On Wednesday, the Associated Press published a horrifying report about criminal networks in the former Soviet Union trying to sell “radioactive material to Middle Eastern extremists.” At the center of these cases, of which the AP learned of four in the past five years, was a “thriving black market in nuclear materials” in a “tiny and impoverished Eastern European country”: Moldova.
It’s a new iteration of an old problem with a familiar geography. The breakup of the Soviet Union left a superpower’s worth of nuclear weapons scattered across several countries without a superpower’s capacity to keep track of them. When Harvard’s Graham Allison flagged this problem in 1996, he wrote that the collapse of Russia’s “command-and-control society” left nothing secure. To wit:
It leaves people bed-bound and drives some to suicide, but there's little research money devoted to the disease. Now, change is coming, thanks to the patients themselves.
This past July, Brian Vastag, a former science reporter, placed an op-ed with his former employer, the Washington Post. It was an open letter to the National Institutes of Health director Francis Collins, a man Vastag had formerly used as a source on his beat.
“I’ve been felled by the most forlorn of orphan illnesses,” Vastag wrote. “At 43, my productive life may well be over.”
There was no cure for his disease, known by some as chronic fatigue syndrome, Vastag wrote, and little NIH funding available to search for one. Would Collins step up and change that?
“As the leader of our nation’s medical research enterprise, you have a decision to make,” he wrote. “Do you want the NIH to be part of these solutions, or will the nation’s medical research agency continue to be part of the problem?”
Also notable about this brazen show of might is that the missiles traveled through two countries, Iran and Iraq, before hitting their 11 targets in Syria. This means that both countries either gave their permission or simply didn’t confront Putin about the use of their airspace on his birthday.
Why Americans tend more and more to want inexperienced presidential candidates
The presidency, it’s often said, is a job for which everyone arrives unprepared. But just how unprepared is unprepared enough?
Political handicappers weigh presidential candidates’ partisanship, ideology, money, endorsements, consultants, and, of course, experience. Yet they too rarely consider an element of growing importance to voters: freshness. Increasingly, American voters view being qualified for the presidency as a disqualification.
In 2003, I announced in National Journal the 14-Year Rule. The rule was actually discovered by a presidential speechwriter named John McConnell, but because his job required him to keep his name out of print, I graciously stepped up to take credit. It is well known that to be elected president, you pretty much have to have been a governor or a U.S. senator. What McConnell had figured out was this: No one gets elected president who needs longer than 14 years to get from his or her first gubernatorial or Senate victory to either the presidency or the vice presidency.* Surprised, I scoured the history books and found that the rule works astonishingly well going back to the early 20th century, when the modern era of presidential electioneering began.
Somewhere in Europe, a man who goes by the name “Mikro” spends his days and nights targeting Islamic State supporters on Twitter.
In August 2014, a Twitter account affiliated with Anonymous, the hacker-crusader collective, declared “full-scale cyber war” against ISIS: “Welcome to Operation Ice #ISIS, where #Anonymous will do it’s [sic] part in combating #ISIS’s influence in social media and shut them down.”
In July, I traveled to a gloomy European capital city to meet one of the “cyber warriors” behind this operation. Online, he goes by the pseudonym Mikro. He is vigilant, bordering on paranoid, about hiding his actual identity, on account of all the death threats he has received. But a few months after I initiated a relationship with him on Twitter, Mikro allowed me to visit him in the apartment he shares with his girlfriend and two Rottweilers. He works alone from his chaotic living room, using an old, battered computer—not the state-of-the-art setup I had envisaged. On an average day, he told me, he spends up to 16 hours fixed to his sofa. He starts around noon, just after he wakes up, and works late into the night and early morning.
Forget the Common Core, Finland’s youngsters are in charge of determining what happens in the classroom.
“The changes to kindergarten make me sick,” a veteran teacher in Arkansas recently admitted to me. “Think about what you did in first grade—that’s what my 5-year-old babies are expected to do.”
The difference between first grade and kindergarten may not seem like much, but what I remember about my first-grade experience in the mid-'90s doesn't match the kindergarten she described in her email: three and a half hours of daily literacy instruction, an hour and a half of daily math instruction, 20 minutes of daily “physical activity time” (officially banned from being called “recess”) and two 56-question standardized tests in literacy and math—in the fourth week of school.
That teacher—who has 20 students and no aide—has fought to integrate 30 minutes of “station time” into the literacy block, which includes “blocks, science, magnetic letters, play dough with letter stamps to practice words, books, and storytelling.” But the most controversial area of her classroom isn’t the blocks or the stamps: Rather, it’s the “house station with dolls and toy food”—items her district tried to remove last year. The implication was clear: There’s no time for play in kindergarten anymore.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
National Geographic Magazine has opened its annual photo contest, with the deadline for submissions coming up on November 16, 2015.
The Grand Prize Winner will receive $10,000 and a trip to National Geographic headquarters to participate in its annual photography seminar. The folks at National Geographic were once again kind enough to let me choose among the contest entries so far for display here. Captions were written by the individual photographers.
The Red Planet once had an ocean and a magnetic field. A new mission is setting out to discover what happened to them.
The question of whether there is life on Mars is woven into a much larger thatch of mysteries. Among them: What happened to the ancient ocean that once covered a quarter of the planet’s surface? And, relatedly, what made Mars’s magnetosphere fade away? Why did a planet that may have looked something like Earth turn into a dry red husk?
“We see magnetized rocks on the Mars surface,” said Bruce Banerdt, the principal investigator of the InSight mission to Mars, which is set to launch in March. “And so we know Mars had a magnetic field at one time, but it doesn't today. We would like to know the history—when that magnetic field started, when it may have shut down.”
There are a few leading theories about what decimated the planet’s magnetism. One of them is that huge asteroids bombarded Mars until its magnetic field turned off. That storm of asteroids may have included one enormous rock in particular, even bigger than the one believed to have wiped out Earth’s dinosaurs. Another theory explores the possibility that Mars’s ancient magnetic field only ever covered one of its hemispheres, an idea that would also explain how the planet’s magnetism weakened over time. “The presence of a magnetic field is key to understanding the history of Mars’s atmosphere, which of course is key to habitability on Mars’s surface,” Banerdt told me.
What will happen to digital collections of books, movies, and music when the tech giants fall?
When you purchase a movie from Amazon Instant Video, you’re not buying it, exactly. It’s more like renting indefinitely.
This distinction matters if your notion of “buying” is that you pay for something once and then you get to keep that thing for as long as you want. Increasingly, in the world of digital goods, a purchasing transaction isn’t that simple.
There are two key differences between buying media in a physical format versus a digital one. First, there’s the technical aspect: Maintaining long-term access to a file requires a hard copy of it—that means, for example, downloading a film, not just streaming from a third party’s server. The second distinction is a bit more complicated, and it has to do with how the law has shaped digital rights in the past 15 years. It helps to think about the experience of a person giving up CDs and using iTunes for music purchases instead.