One hundred years ago this month, two intrepid explorers returned from the Arctic reaches and declared that they had reached the North Pole. They had not traveled together; each led a competing expedition hoping to be the first to the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and aerial photographs or other corroborating evidence might be required as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, by observing shadows, or even by compass, despite the fact that a compass needle becomes extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests on the remains that were consistent with other members of the Ruess family.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that shape local and national policy and action.
How is it that people can cling to an opinion or view of a person, an event, or an issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people hold positions contrary to the evidence simply because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and more entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Long after research contradicts common medical practices, patients continue to demand them and physicians continue to deliver. The result is an epidemic of unnecessary and unhelpful treatments.
First, listen to the story with the happy ending: At 61, the executive was in excellent health. His blood pressure was a bit high, but everything else looked good, and he exercised regularly. Then he had a scare. He went for a brisk post-lunch walk on a cool winter day, and his chest began to hurt. Back inside his office, he sat down, and the pain disappeared as quickly as it had come.
That night, he thought more about it: middle-aged man, high blood pressure, stressful job, chest discomfort. The next day, he went to a local emergency department. Doctors determined that the man had not suffered a heart attack and that the electrical activity of his heart was completely normal. All signs suggested that the executive had stable angina—chest pain that occurs when the heart muscle is getting less blood-borne oxygen than it needs, often because an artery is partially blocked.
Plagues, revolutions, massive wars, collapsed states—these are what reliably reduce economic disparities.
Calls to make America great again hark back to a time when income inequality receded even as the economy boomed and the middle class expanded. Yet it is all too easy to forget just how deeply this newfound equality was rooted in the cataclysm of the world wars.
The pressures of total war became a uniquely powerful catalyst of equalizing reform, spurring unionization, extensions of voting rights, and the creation of the welfare state. During and after wartime, aggressive government intervention in the private sector and disruptions to capital holdings wiped out upper-class wealth and funneled resources to workers; even in countries that escaped physical devastation and crippling inflation, marginal tax rates surged upward. Concentrated for the most part between 1914 and 1945, this “Great Compression” of inequality (as economists call it) took several more decades to run its course across the developed world, lasting until the 1970s and 1980s, when it stalled and began to reverse.
Two historians weigh in on how to understand the new administration, press relations, and this moment in political time.
The election of Donald Trump, and the early days of his presidency, have driven many Americans to rummage through history in search of context and understanding. Trump himself has been compared to historical figures ranging from Ronald Reagan to Henry Ford, and from Andrew Jackson to Benito Mussolini. His steps have been condemned as unprecedented by his critics, and praised as historic by his supporters.
To place contemporary events in perspective, we turned to a pair of historians of the United States. Julian Zelizer is a professor of history and public affairs at Princeton University. He is the author, most recently, of The Fierce Urgency of Now: Lyndon Johnson, Congress, and the Battle for the Great Society. Morton Keller is a professor emeritus of history at Brandeis University. He has written or edited more than 15 books, including Obama’s Time: A History. They’ll be exchanging views periodically on how to understand Trump, his presidency, and this moment in political time. —Yoni Appelbaum
“The question confronting us as a nation is as consequential as any we have faced since the late 1940s,” a group of Republican and Democratic experts write.
Ben Rhodes, one of Barack Obama’s top advisers, once dismissed the American foreign-policy establishment—those ex-government officials and think-tank scholars and journalists in Washington, D.C. who advocate for a particular vision of assertive U.S. leadership in the world—as the “Blob.” Donald Trump had harsher words. As a presidential candidate, he vowed never to take advice on international affairs from “those who have perfect resumes but very little to brag about except responsibility for a long history of failed policies and continued losses at war.” Both men pointed to one of the Beltway establishment’s more glaring errors: support for the war in Iraq.
Now the Blob is fighting back. The “establishment” has been unfairly “kicked around,” said Robert Kagan, a senior fellow at the Brookings Institution and former official in the Reagan administration. As World War II gave way to the Cold War, President Harry Truman and his secretary of state, Dean Acheson, “invented a foreign policy and sold it successfully to the American people. That’s what containment was and that’s what the Truman Doctrine was. … That was the foreign-policy establishment.” During that period, the U.S. government also helped create a system for restoring order to a world riven by war and economic crisis. That system, which evolved over the course of the Cold War and post-Cold War period, includes an open international economy; U.S. military and diplomatic alliances in Asia, Europe, and the Middle East; and liberal rules and institutions (human rights, the United Nations, and so on).
A $100 million gangster epic starring Robert De Niro, Al Pacino, and Joe Pesci has become too risky a proposition for major studios.
Martin Scorsese's next project, The Irishman, is as close as you can get to a box-office guarantee for the famed director. It's a gangster film based on a best-selling book about a mob hitman who claimed to have played a part in the legendary disappearance of the union boss Jimmy Hoffa. Robert De Niro is attached to play the hitman, Al Pacino will star as Hoffa, and Scorsese favorites Joe Pesci and Harvey Keitel are also on board. After Scorsese branched into more esoteric territory this year with Silence, a meditative exploration of faith and Catholicism, The Irishman sounds like a highly bankable project—the kind studios love. And yet, the film is going to Netflix, which will bankroll its $100 million budget and distribute it around the world on the company's streaming service.
In late 2015, in the Chilean desert, astronomers pointed a telescope at a faint, nearby star known as a red dwarf. Amid the star's dim infrared glow, they spotted periodic dips, a telltale sign that something was passing in front of it, blocking its light every so often. Last summer, the astronomers concluded the mysterious dimming came from three Earth-sized planets—and that they were orbiting in the star's temperate zone, where temperatures are not too hot, and not too cold, but just right for liquid water, and maybe even life.
This was an important find. Scientists for years had focused on stars like our sun in their search for potentially habitable planets outside our solar system. Red dwarfs, smaller and cooler than the sun, were thought to create inhospitable conditions. They're also harder to see, detectable by infrared rather than visible light. But the astronomers aimed hundreds of hours' worth of observations at this dwarf, known as TRAPPIST-1, anyway, using ground-based telescopes around the world and NASA's Spitzer Space Telescope.
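For a rough sense of the scale involved (a back-of-the-envelope illustration; the numbers below are approximate and not drawn from the TRAPPIST-1 observations themselves), the transit technique works because the fractional dip in a star's light scales with the square of the planet-to-star size ratio:

\[ \delta \approx \left( \frac{R_{\text{planet}}}{R_{\text{star}}} \right)^{2} \]

An Earth-sized planet crossing a Sun-like star blocks only about 0.008 percent of its light, while the same planet crossing a red dwarf roughly a tenth the Sun's size blocks on the order of half a percent. That is part of why small, dim stars, for all the difficulty of observing them, can actually make small planets easier to pick out, provided telescopes can gather enough of their faint glow.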
High-school textbooks too often gloss over the American government’s oppression of racial minorities.
Earlier this month, McGraw Hill found itself at the center of some rather embarrassing press after a photo showing a page from one of its high-school world-geography textbooks was disseminated on social media. The page features a seemingly innocuous polychromatic map of the United States, broken up into thousands of counties, as part of a lesson on the country’s immigration patterns: Different colors correspond with various ancestral groups, and the color assigned to each county indicates its largest ethnic representation. The page is scarce on words aside from an introductory summary and three text bubbles explaining specific trends—for example, that Mexico accounts for the largest share of U.S. immigrants today.
Neither truck drivers nor bankers would put up with a system like the one that influences medical residents’ schedules.
The path to becoming a doctor is notoriously difficult. Following pre-med studies and four years of medical school, freshly minted M.D.s must spend anywhere from three to seven years (depending on their chosen specialty) training as “residents” at an established teaching hospital. Medical residencies are institutional apprenticeships—and are therefore structured to serve the dual, often dueling, aims of training the profession’s next generation and minding the hospital’s labor needs.
How to manage this tension between “education and service” is a perennial question of residency training, according to Janis Orlowski, the chief health-care officer of the Association of American Medical Colleges (AAMC). Orlowski says that the amount of menial labor residents are required to perform, known in the profession as “scut work,” has decreased “tremendously” since she was a resident in the 1980s. But she acknowledges that even “institutions that are committed to education … constantly struggle with this,” trying to stay on the right side of the boundary between training and taking advantage of residents.
You can tell a lot about a person from how they react to something.
That's why Facebook's various “Like” buttons are so powerful. Clicking a reaction icon isn't just a way to register an emotional response; it's also a way for Facebook to refine its sense of who you are. So when you “Love” a photo of a friend's baby, and click “Angry” on an article about the New England Patriots winning the Super Bowl, you're training Facebook to see you a certain way: You are a person who seems to love babies and hate Tom Brady.
The more you click, the more sophisticated Facebook’s idea of who you are becomes. (Remember: Although the reaction choices seem limited now—Like, Love, Haha, Wow, Sad, or Angry—up until around this time last year, there was only a “Like” button.)
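To make that mechanism concrete, here is a minimal, hypothetical sketch of how reaction clicks could be rolled up into a crude per-topic affinity score. Nothing below reflects Facebook's actual code, data model, or weighting; the reaction names are the only detail taken from the passage above, and the topics, weights, and function names are invented for illustration.

```python
# Hypothetical sketch only: rolling reaction clicks up into a crude
# per-topic affinity profile. Not Facebook's actual system or data model.
from collections import defaultdict

# Assumed weights: warmer reactions raise affinity, negative ones lower it.
REACTION_WEIGHTS = {
    "Like": 1.0,
    "Love": 2.0,
    "Haha": 1.0,
    "Wow": 1.0,
    "Sad": -1.0,
    "Angry": -2.0,
}

def record_reaction(profile, topic, reaction):
    """Adjust a user's affinity for a topic based on a single reaction click."""
    profile[topic] += REACTION_WEIGHTS.get(reaction, 0.0)

profile = defaultdict(float)
record_reaction(profile, "babies", "Love")                 # friend's baby photo
record_reaction(profile, "New England Patriots", "Angry")  # Super Bowl article

# After just two clicks, the profile already sketches a persona:
# loves babies, is angry about the Patriots.
print(dict(profile))  # {'babies': 2.0, 'New England Patriots': -2.0}
```

Under this toy model, every additional click nudges a score up or down, which is the sense in which each reaction "trains" the system to see you a certain way.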
Rod Dreher makes a powerful argument for communal religious life in his book, The Benedict Option. But he has not wrestled with how to live side by side with people unlike him.
Donald Trump was elected president with the help of 81 percent of white evangelical voters. Mike Pence, the champion of Indiana’s controversial 2015 religious-freedom law, is his deputy. Neil Gorsuch, a judge deeply sympathetic to religious litigants, will likely be appointed to the Supreme Court. And Republicans hold both chambers of Congress and statehouses across the country. Right now, conservative Christians enjoy more influence on American politics than they have in decades.
And yet, Rod Dreher is terrified.
“Don’t be fooled,” he tells fellow Christians in his new book, The Benedict Option. “The upset presidential victory of Donald Trump has at best given us a bit more time to prepare for the inevitable.”