One hundred years ago this month, two intrepid explorers returned from the Arctic and declared that they had reached the North Pole. Not together, but on competing expeditions, each racing to be the first to reach the Pole. Robert E. Peary led one expedition, and Frederick A. Cook led the other. And each declared the other's claim to the Pole untrue.
Today, of course, that kind of controversy could be settled far more easily. At the very least, we would expect a GPS track record showing that the Pole had been reached, and airborne photographs or other corroborating evidence might be required, as well. Without that technology, however, the claims were a little harder to confirm. It's not like there was an exact marker at the spot, because nobody had been there before. And unlike the peak of Mt. Everest, the landscape at the precise location of the North Pole doesn't look distinctly different from the rest of the terrain--for hundreds of miles in any given direction.
So the controversy has raged for a full century. But here's the interesting part. As more data about the expeditions, and about the North Pole, have emerged, it seems more and more likely that neither man actually reached the Pole. As John Tierney wrote recently in the Science Times, Peary supposedly took no celestial navigation readings on his final push to the Pole, until one day he took a single reading, looked very disappointed, and then declared that the observation--which he showed to no one--confirmed that he'd arrived at the North Pole, exactly. Cook had neither a trained celestial navigator nor the skill to make the observations himself. Without that skill, how on earth (so to speak) could he have reached the Pole, or known precisely when he was there? The modern-day consensus, according to Tierney, is that Peary got closer than Cook, but that neither man got closer than perhaps 100 miles away.
Yet a full century and much more advanced data analysis and evidence later, Peary and Cook still have ardent supporters who adamantly believe that their hero told the truth. They suggest that it might have been possible for either explorer to have found the Pole without clear celestial sightings, by studying wind patterns in the snow, or observing shadows, or even by compass, even though a compass needle gets extremely erratic near the Earth's poles. Apparently, some of the Peary/Cook advocates are more comfortable with contorted logic than with simply acknowledging that, given more data, it appears their initial impression of things was ... ummm ... wrong.
Peary and Cook are not the only explorers to have die-hard believers who have clung to a set vision of their heroes' lives despite the emergence of countering evidence. David Roberts, an editor at National Geographic Adventure, encountered a startling backlash of anger and even threats after writing a feature article last spring (which he's expanded into a soon-to-be-released book) that solved the mystery of a young adventurer's disappearance--but not the way some of the adventurer's admirers wanted it solved.
In 1934, at the age of 20, Everett Ruess left civilization to go live in the wilderness ... and was never heard from again. A whole folk myth movement sprang up around this young man who seemed to have slipped so completely into the wild that he eluded discovery for the rest of his life. An annual art festival in Escalante, Utah, is even named in his honor. But Roberts, who researched the case for 10 years, finally discovered evidence that Ruess had been murdered by two members of the Ute tribe almost as soon as he'd begun his journey. There was a witness to the murder, an unearthed skeleton, and DNA tests showing the remains were consistent with Ruess's surviving relatives.
The mystery, it seemed, had been solved. But the hue and cry surrounding Roberts' piece was both angry and loud, catching both Roberts and the Ruess family by surprise. "We all want our heroes to succeed," Ruess' nephew Brian surmised, in an attempt to explain the uproar. (A couple months ago, I wrote a longer essay about the Ruess controversy.)
Perhaps. But I now think there's more to the equation: tendencies that affect how we view information not just about heroes and adventurers, but also about issues and events that shape local and national policy and action.
How is it that people can cling to an opinion or view of a person, event, or issue, despite being presented with clear or mounting data that contradicts that position? The easy answer, of course, is simply that people are irrational. But a closer look at some of the particular ways and reasons we're irrational offers some interesting food for thought.
In a recently published study, a group of researchers from Northwestern University, UNC Chapel Hill, SUNY Buffalo, and Millsaps College found that people often employ an approach the researchers called "motivated reasoning" when sorting through new information or arguments, especially on controversial issues. Motivated reasoning is, as UCLA public policy professor Mark Kleiman put it, the equivalent of policy-driven data, instead of data-driven policy.
In other words, if people start with a particular opinion or view on a subject, any counter-evidence can create "cognitive dissonance"--discomfort caused by the presence of two irreconcilable ideas in the mind at once. One way of resolving the dissonance would be to change or alter the originally held opinion. But the researchers found that many people instead choose to change the conflicting evidence--selectively seeking out information or arguments that support their position while arguing around or ignoring any opposing evidence, even if that means using questionable or contorted logic.
That's not a news flash to anyone who's paid attention to any recent national debate--although the researchers pointed out that this finding, itself, runs counter to the idea that people continue to hold positions against all evidence simply because of misinformation or a lack of access to the correct data. Even when presented with compelling, factual data from sources they trusted, many of the subjects still found ways to dismiss it. But the most interesting (or disturbing) aspect of the Northwestern study was the finding that providing additional counter-evidence, facts, or arguments actually intensified this reaction. Additional countering data, it seems, increases the cognitive dissonance, and therefore the need for subjects to alleviate that discomfort by retreating into more rigidly selective hearing and entrenched positions.
Needless to say, these findings do not bode well for anyone with hopes of changing anyone else's mind with facts or rational discussion, especially on "hot button" issues. But why do we cling so fiercely to positions when they don't even involve us directly? Why do we care who got to the North Pole first? Or whether a particular bill has provision X versus provision Y in it? Why don't we care more about simply finding out the truth--especially in cases where one "right" answer actually exists?
Part of the reason, according to Kleiman, is "the brute fact that people identify their opinions with themselves; to admit having been wrong is to have lost the argument, and (as Vince Lombardi said), every time you lose, you die a little." And, he adds, "there is no more destructive force in human affairs--not greed, not hatred--than the desire to have been right."
So, what do we do about that? If overcoming "the desire to have been right" is half as challenging as overcoming hate or greed, the outlook doesn't seem promising. But Kleiman, who specializes in crime control policy and alternative solutions to very sticky problems (his latest book is "When Brute Force Fails: How to Have Less Crime and Less Punishment"), thinks all is not lost. He points to the philosopher Karl Popper, who, he says, believed fiercely in the discipline and teaching of critical thinking, because "it allows us to offer up our opinions as a sacrifice, so that they die in our stead."
A liberal education, Kleiman says, "ought, above all, to be an education in non-attachment to one's current opinions. I would define a true intellectual as one who cares terribly about being right, and not at all about having been right." Easy to say, very hard to achieve. For all sorts of reasons. But it's worth thinking about. Even if it came at the cost of sacrificing or altering our most dearly-held opinions ... the truth might set us free.
The Democrat’s command and poise left her rival looking frustrated, peevish, and out of sorts.
Monday brought the first debate of the presidential season, but it often felt like two separate debates. One, from Hillary Clinton, was wonky, crisp, and polished; if not always inspiring, it was professional and careful. The other, from Donald Trump, was freewheeling, aggressive, and meandering, occasionally landing a hard blow but often substance-less and hard to follow. But the two debates intersected at times, sometimes raucously, as Trump repeatedly broke in to interrupt Clinton.
It was a commanding performance from the Democratic nominee. Clinton delivered a series of detailed answers on subjects ranging from race to the Middle East to tax policy. Meanwhile, she delivered a string of attacks on Trump, assailing him for stiffing contractors, refusing to release his tax returns, fomenting birtherism, and caricaturing black America. She stumbled only occasionally, but left few openings for Trump. She remained calm and often smiling as Trump repeatedly attacked her and interrupted her answers—doing it so often that moderator Lester Holt, often a spectral presence at the debate, finally cut in twice in short order to chide him. (Vox counted 40 instances; Clinton interrupted some herself, but far less often.) Clinton displayed a sort of swagger perhaps not seen since her hearing before Congress on Benghazi.
If undecided voters were looking for an excuse to come around to Clinton’s corner, they may have found it on Monday night.
Donald Trump sniffled and sucked down water. He bragged about not paying federal taxes—“That makes me smarter.” He bragged about bragging about profiting from the housing crisis—“That’s called business, by the way.” He lost his cool and maybe the race, taking bait coolly served by Hillary Clinton.
If her objective was to tweak Trump’s temper, avoid a major mistake, and calmly cloak herself in the presidency, Clinton checked all three boxes in the first 30 minutes of their first debate.
It may not matter: Trump is the candidate of change and disruption at a time when voters crave the freshly shaken. But the former secretary of state made the strongest case possible for the status quo, arguing that while voters want change in the worst way, Trump’s way would be the worst.
For decades, the candidate has willfully inflicted pain and humiliation.
Donald J. Trump has a cruel streak. He willfully causes pain and distress to others. And he repeats this public behavior so frequently that it’s fair to call it a character trait. Any single example would be off-putting but forgivable. Being shown many examples across many years should make any decent person recoil in disgust.
Judge for yourself if these examples qualify.
* * *
In national politics, harsh attacks are to be expected. I certainly don’t fault Trump for calling Hillary Clinton dishonest, or wrongheaded, or possessed of bad judgment, even if it’s a jarring departure from the glowing compliments that he used to pay her.
But even in a realm where the harshest critiques are part of the civic process, Trump crossed a line this week when he declared his intention to invite Gennifer Flowers to today's presidential debate. What kind of man invites a husband's former mistress to an event to taunt his wife? Trump managed to launch an attack that couldn't be less relevant to his opponent's qualifications or more personally cruel. His campaign and his running mate later said that it was all a big joke. No matter. Whether in earnest or in jest, Trump showed his tendency to humiliate others.
In a unique, home-spun experiment, researchers found that centripetal force could help people pass kidney stones—before they become a serious health-care cost.
East Lansing, Michigan, becomes a ghost town during spring break. Families head south, often to the theme parks in Orlando. A week later, the Midwesterners return sunburned and bereft of disposable income, and, urological surgeon David Wartinger noticed, some also come home with fewer kidney stones.
Wartinger is a professor emeritus at Michigan State, where he has dealt for decades with the scourge of kidney stones, which affect around one in 10 people at some point in life. Most are small, and they pass through us without issue. But many linger in our kidneys and grow, sending hundreds of thousands of people to emergency rooms and costing around $3.8 billion every year in treatment and extraction. The pain of passing a larger stone is often compared to childbirth.
Communal living is hardly a departure from tradition—it's a return to how humans have been making their homes for thousands of years.
For most of human history, people were hunter-gatherers. They lived in large camps, depending on one another for food, childcare, and everything else—all without walls, doors, or picket fences. In comparison, the number of people living in most households in today’s developed countries is quite small. According to the Census Bureau, fewer than three people lived in the average American household in 2010. The members of most American households can be counted on one hand, or even, increasingly, one finger: Single-person households only made up about 13 percent of all American households in 1960. Now, that figure is about 28 percent.
Belonging to a relatively small household has become the norm even though it can make daily life more difficult in many ways. Privacy may be nice, but cooking and doing chores become much less time-consuming when shared with an additional person, or even several people. Water, electric, and internet bills also become more bearable when divided among multiple residents. There are social downsides to living alone, too. Many elderly people, young professionals, stay-at-home parents, and single people routinely spend long stretches of time at home alone, no matter how lonely they may feel; more distressingly, many single parents face the catch-22 of working and paying for childcare. Living in smaller numbers can be a drain on money, time, and feelings of community, and the rise of the two-parent dual-earning household only compounds the problems of being time-poor.
Who will win the debates? Trump’s approach was an important part of his strength in the primaries. But will it work when he faces Clinton onstage?
The most famous story about modern presidential campaigning now has a quaint old-world tone. It’s about the showdown between Richard Nixon and John F. Kennedy in the first debate of their 1960 campaign, which was also the very first nationally televised general-election debate in the United States.
The story is that Kennedy looked great, which is true, and Nixon looked terrible, which is also true—and that this visual difference had an unexpected electoral effect. As Theodore H. White described it in his hugely influential book The Making of the President 1960, which has set the model for campaign coverage ever since, “sample surveys” after the debate found that people who had only heard Kennedy and Nixon talking, over the radio, thought that the debate had been a tie. But those who saw the two men on television were much more likely to think that Kennedy—handsome, tanned, non-sweaty, poised—had won.
Details later, because I start very early tomorrow morning, but: in the history of debates I've been watching through my conscious lifetime, this was the most one-sided slam since Al Gore took on Dan Quayle and (the very admirable, but ill-placed) Admiral James B. Stockdale ("Who am I? Why am I here?") in the vice presidential debate of 1992.
Donald Trump rose to every little bit of bait, and fell into every trap, that Hillary Clinton set for him. And she, in stark contrast to him, made (almost) every point she could have hoped to make, and carried herself in full awareness that she was on high-def split-screen every second. He was constantly mugging, grimacing, rolling his eyes—and sniffing. She looked alternately attentive and amused.
The Donald J. Trump Foundation reportedly used $258,000, most of it other people’s money, to settle legal disputes for the Republican nominee.
For people at certain income levels, finding creative ways to avoid taxes is practically a leisure sport. Donald Trump, golf and casino magnate that he is, would never miss out on a leisure sport, would he?
In a new article, The Washington Post’s David Fahrenthold, who’s already collected a series of scoops on the Donald J. Trump Foundation, reports that Trump sometimes had people who owed him money pay his foundation instead—to the tune of at least $2.3 million. That’s legal, provided that the person who would have received the income still pays taxes on the money, which is where things get unclear. A Trump adviser initially denied that Trump had ever directed fees to his foundation, but when presented with evidence that he had a $400,000 fee for appearing on a Comedy Central roast (nice work if you can get it) sent to the foundation, the adviser said Trump had paid taxes on it. But he refused to say whether Trump had paid taxes on the rest of the $2.3 million.
During the debate, the Republican nominee seemed to confirm an accusation that he hadn’t paid any income tax, then reversed himself later.
In the absence of facts, speculation will flourish. For example, as long as Donald Trump declines to release his tax returns, his opponents will offer theories for why he has failed to do so.
Trump has claimed that he cannot release his returns because he’s being audited by the IRS. (He complained Monday that he is audited every year.) He repeated that claim during the debate, even though the IRS has said that Trump is free to release his returns even if he is being audited.
Harry Reid, the Democratic senator from Nevada who in 2012 claimed (falsely, it turned out) that Mitt Romney paid no income taxes, has speculated that Trump is not as wealthy as he claims and is a “welfare king.” Romney himself has gotten in on the act, writing on Facebook, “There is only one logical explanation for Mr. Trump's refusal to release his returns: there is a bombshell in them. Given Mr. Trump's equanimity with other flaws in his history, we can only assume it's a bombshell of unusual size.”