In the infamous fundraiser video uncovered by Mother Jones, Mitt Romney unveiled some controversial thoughts on the American electorate. The main point was this: People generally fall into two camps that they will not budge from, and that only 5 to 10 percent of the electorate could be swung either way. While Romney's comments were highly criticized for their candor, there's an aspect of this thinking that is shared in the conventional wisdom: America is unchangeably polarized.
But was Romney right in asserting that only a slim 5 to 10 percent of voters are willing to consider multiple sides?
In a paper titled "How the Polls Can Be Both Spot On and Dead Wrong," researchers in Sweden find that while polls accurately reveal what a person believes, they miss out on an equally important factor: How likely is that person to change his or her mind?
What they find tears the conventional wisdom wide open. Nearly half of the study's participants, in a matter of minutes, became open to accepting a political belief that they had initially opposed.
Here's how it worked. The researchers gave unwitting participants a political-opinion survey to fill out. While this was happening, a researcher watched the participants' pen strokes and filled out a survey with opposite responses: more conservative people were made liberal, and more liberal people were made conservative. In a sleight of hand, the researchers then replaced the individual's poll answers with their own bizarro versions. They gave the altered poll back to the participant for discussion.
And here's where things get interesting.
As the authors write, "An overwhelming majority of the participants accepted and endorsed a manipulated political profile that placed them in the opposite political camp." In all, a staggering 92 percent of those manipulated accepted the altered version.
It's not completely surprising that the participants were fooled like this. The phenomenon is called choice blindness, a topic this study's researchers have been experimenting with for a while. In basic terms, this is what happens: When presented with a choice we're told we've made, we'll find ways to support our "decision." For example, in a previous study, the research team asked people to choose the most attractive face from a pair of pictures. Later, they convinced the participants that they had chosen the picture that they originally did not prefer. But more than that, they got the participants to explain why they chose the picture they didn't actually choose.
That's all good and interesting for cognitive science. But here's the conclusion that is most important for the world of politics: People are more amenable to the other side of the political spectrum than conventional wisdom says.
After discussing their altered answers with the researcher, the participants were again asked to indicate how they would likely vote. "[What] we found was that no less than 48 percent of [participants] were being open for movement across the great partisan divide," the study concluded. That's a lot more than the 5 to 10 percent of voters Mitt Romney thought he could sway. Even people who came into the task saying they had strong beliefs were fooled by the trick, and there was "no relationship between level of corrections and self-rated political engagement or certainty. That is, participants who rated themselves as politically engaged, or certain in their political convictions, were just as likely to fail to notice a manipulation."
Petter Johansson, one of the study's authors, explains that just prompting (albeit through deception) someone to consider an alternative position makes their thinking on an issue much more nuanced.
"It's like they enter the conversation with themselves," Johansson said in a phone interview. "And then they construct the argument supporting the opposite decision. Then, they convince themselves that they actually hold a different position. But we don't say anything in this to push them or prompt them." No leading questions — and no in-your-face political ads — are necessary.
"A wrong conclusion would be to think that people are stupid," Johansson says. "It's rather this shows the capacity" to change one's mind. It's not that the deception reveals that people have no grounding in their beliefs; rather, the deception forces us to consider another side.
The researchers did debrief the participants in the experiment, so it's unclear whether the effect would have been the same in actual polls. However, "if you publicly argue for a position and you believe you hold it, it's likely to feed back into a decision," Johansson says.
But Johansson doesn't want this research to be used to manipulate others. The biggest take-home point is this: "There is a larger flexibility in the electorate than [we] often assume," he said.
This article is from the archive of our partner National Journal.