One hundred years ago this month, two intrepid explorers returned from the Arctic and declared that they had reached the North Pole. Not together, but on competing expeditions, each racing to be the first to the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and aerial photographs or other corroborating evidence might be required as well. Without that technology, however, the claims were a little harder to confirm. It's not as if there were an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator with him nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got within perhaps 100 miles of the Pole.
Yet a century later, with far more advanced data analysis and evidence in hand, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, although a compass needle becomes extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers with die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of counter-evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk-myth movement sprang up around this young man, who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests linking the remains to other members of the Ruess family.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple of months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that shape local and national policy and action.
How is it that people can cling to an opinion or view of a person, an event, or an issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding itself runs counter to the idea that people continue to hold positions against all evidence simply because of misinformation or lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and more entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly held opinions ... the truth might set us free.
Photo Credit: Flickr User Lanz, photolib.noaa.gov, Wikimedia Commons