One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. Not together, but on competing expeditions, each vying to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and airborne photographs or other corroborating evidence might be required, as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century later, with far more advanced data analysis and evidence in hand, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle becomes extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests showing the remains were compatible with other family members.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that affect local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue in the world, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding, itself, runs counter to the idea that people continue to hold positions against all evidence because of misinformation or a lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Marijuana users had smaller waists and scored higher across several measures of blood sugar regulation.
"Marijuana use is associated with an acute increase in caloric intake," goes the clinical jargon for popular lore. Still despite eating more while high (by some measures, over 600 extra calories per day), marijuana users' extra intake doesn't seem to be reflected in increased
BMI. Indeed, studies have identified a reduced prevalence of obesity in the pot smoking community.
Researchers at the University of Nebraska, the Harvard School of Public Health, and Beth Israel Deaconess Medical Center analyzed data from a nationally representative sample of over 4,600 adults. About 12 percent of the participants self-identified as current marijuana users, and another 42 percent reported having used the drug in the past. The participants were tested for various measures of blood sugar control: their fasting insulin and glucose levels; insulin resistance; cholesterol levels; and waist circumference.
Martin O'Malley jumped into the race for the Democratic nomination on Saturday, giving Hillary Clinton another challenger.
For months, it looked like Martin O’Malley might be the only person brave enough to challenge Hillary Clinton for the 2016 Democratic nomination. Between her dominance and the Clintons’ legendarily long memory for slights, she seemed to have convinced most potential rivals not to bother.
But the Democratic field that the former Maryland governor joined on Saturday doesn’t look quite like what was expected. Yes, Clinton still has a comfortable lead over all rivals. But the rest of the ballot is more crowded. Jim Webb seems set to run. Lincoln Chafee is slated to announce a run on June 3. Most of all, Senator Bernie Sanders has become an unexpected force in the race.
The Sanders ascendancy is a challenge for O’Malley, who seemed to be aiming for the territory to Clinton’s left; O’Malley has criticized the former secretary of state over the Trans-Pacific Partnership. Another challenge is the recent unrest in Baltimore. Critics charge that the data-based policing tactics O’Malley ushered in as mayor helped create the tension between police and citizens that boiled over after the death of Freddie Gray. By announcing his campaign in Baltimore, O’Malley signaled that he intends to take that criticism head-on. In statements since the rioting and protests, he has suggested that such tensions are in fact exactly why he feels compelled to run.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
We're all going to die and we all know it. This can be both a burden and a blessing.
In the heart of every parent lives the tightly coiled nightmare that his child will die. It might spring at logical times—when a toddler runs into the street, say—or it might sneak up in quieter moments. The fear is a helpful evolutionary motivation for parents to protect their children, but it's haunting nonetheless.
The ancient Stoic philosopher Epictetus advised parents to indulge that fear. “What harm is it, just when you are kissing your little child, to say: Tomorrow you will die?” he wrote in his Discourses.
Some might say Epictetus was an asshole. William Irvine thinks he was on to something.
“The Stoics had the insight that the prospect of death can actually make our lives much happier than they would otherwise be,” he says. “You’re supposed to allow yourself to have a flickering thought that someday you’re going to die, and someday the people you love are going to die. I’ve tried it, and it’s incredibly powerful. Well, I am a 21st-century practicing Stoic.”
The plight of non-tenured professors is widely known, but what about the impact they have on the students they’re hired to instruct?
Imagine meeting your English professor by the trunk of her car for office hours, where she doles out information like a taco vendor in a food truck. Or getting an e-mail error message when you write your former biology professor asking for a recommendation because she is no longer employed at the same college. Or attending an afternoon lecture in which your anthropology professor seems a little distracted because he doesn’t have enough money for bus fare. This is an increasingly widespread reality of college education.
Many students—and parents who foot the bills—may assume that all college professors are adequately compensated professionals with a distinct arrangement in which they have a job for life. In actuality, that arrangement applies only to tenured professors, who represent less than a quarter of all college faculty. Odds are that students will be taught by professors with less job security and lower pay than those tenured employees, which research shows results in diminished services for students.
Can a political system be democratically legitimate without being democratic?
The flaws in China’s political system are obvious. The government doesn’t even make a pretense of holding national elections and punishes those who openly call for multiparty rule. The press is heavily censored and the Internet is blocked. Top leaders are unconstrained by the rule of law. Even more worrisome, repression has been ramped up since Xi Jinping took power in 2012, suggesting that the regime is increasingly worried about its legitimacy.
Some China experts—most recently David Shambaugh of George Washington University—interpret these ominous signs as evidence that the Chinese political system is on the verge of collapse. But such an outcome is highly unlikely in the near future. The Communist Party is firmly in power, its top leader is popular, and no political alternative currently claims widespread support. And what would happen if the Party’s power did indeed crumble? The most likely result, in my view, would be rule by a populist strongman backed by elements of the country’s security and military forces. The new ruler might seek to buttress his legitimacy by launching military adventures abroad. President Xi would look tame by comparison.
People look to Amy Schumer and her fellow jokers not just to make fun of the world, but to make sense of it. And maybe even to help fix it.
This week, in a much-anticipated sketch on her Comedy Central show, Amy Schumer staged a trial of Bill Cosby in “the court of public opinion.” Schumer—her character, at any rate—played the role of the defense. “Let’s remind ourselves what’s at stake here,” she argued to the jury. “If convicted, the next time you put on a rerun of The Cosby Show you may wince a little. Might feel a little pang. And none of us deserve that. We don’t deserve to feel that pang.”
Her conclusion? “We deserve to dance like no one’s watching, and watch like no one’s raping.”
Ooof. This is the kind of thing that gets Inside Amy Schumer referred to as “the most feminist show on television,” and her act in general called, in a phrase that reveals as much about her craft as about Schumer herself, “comedy with a message.” But while Schumer’s work is operating at the vanguard of popular comedy, it’s also in line with the work being done by her fellow performers: jokes that tend to treat humor not just as an end in itself, but as a vehicle for making a point. Watch like no one’s raping.
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
Caves and tunnels have always been part of human life.
We've grown more adept at shaping these underground shelters and passages over the millennia, and today we dig for hundreds of reasons. We excavate to find both literal and cultural treasures, digging mines and unearthing archaeological discoveries. We use caverns for stable storage, for entertainment, and for an effective shelter from natural and man-made disasters. And as the planet's surface becomes ever more crowded, and national borders are closed, tunnels provide pathways for our vehicles and for smugglers of every kind. Collected below are more recent subterranean scenes from around the world.