Avoiding Cognitive Dissonance

Conor Friedersdorf

Joe Keohane:

Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It's this: Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters -- the people making decisions about how the country runs -- aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

"The general idea is that it's absolutely threatening to admit you're wrong," says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon -- known as "backfire" -- is "a natural defense mechanism to avoid that cognitive dissonance."

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions.
Read the rest here.

