Inside the Political Brain
How we're built for bias in public life -- and why that makes it so hard to fix our irrational political structures.
Why is the American political system so irrational? Why is it that, even though a lot less partisanship and a lot more compromise would be good for the country, nobody can seem to get us there?
The good news is that science is starting to figure this out. The bad news is that it seems to be fundamentally rooted in who we are -- creatures who can detect bad and emotional reasoning when others are guilty of doing it, but not so much when we're doing it ourselves.
Of late, researchers in political science and psychology have been talking a great deal about an idea called "motivated reasoning" -- thought and argument that seems rational and dispassionate, but really isn't anything of the sort. Motivated reasoning is becoming a buzzword, but few have sketched out why it is such a powerful idea: Because it fits so nicely with everything we now know about evolution and the human brain.
Evolution built the human brain -- but not all at once. The brain has been described as a "confederation of systems" with different evolutionary ages and purposes. As a result of this tinkering on a geologic timescale, we find ourselves with an evolutionarily older brain lying beneath and enveloped by a newer brain, both bound together and acting in coordination.
The older parts -- the subcortex, the limbic regions -- tend to be involved in emotional or automatic responses. These are stark and binary reactions -- not discerning or discriminating. And they occur extremely rapidly, much more so than our conscious thoughts. The newer parts of the brain, such as the prefrontal cortex, empower abstract reasoning, language, and more conscious and goal-directed behavior. In general, these operations are slower and only able to focus on a few things or ideas at once. Their bandwidth is limited.
Thus, while the newer parts of the brain may be responsible for our species' greatest innovations and insights, it isn't like they always get to run the show. "There are certain important circumstances where natural selection basically didn't trust us to make the right choice," explains Aaron Sell, an evolutionary psychologist at Griffith University in Australia. "We have a highly experimental frontal lobe that plays around with ideas, but there are circumstances, like danger, where we're not allowed to do that." Instead, the rapid-fire emotions take control and run an automatic response program -- e.g., fight or flight.
How does this set the stage for motivated or emotional reasoning?
Mirroring this evolutionary account, psychologists have been talking about the "primacy of affect" -- emotions preceding, and often trumping, our conscious thoughts -- for three decades. Today they broadly break the brain's actions into the operations of "System 1" and "System 2," which are roughly analogous to the emotional and the reasoning brain. System 1, the older system, governs our rapid-fire emotions; System 2 refers to our slower-moving, thoughtful, and conscious processing of information. Its operations, however, aren't necessarily free of emotion or bias. Quite the contrary: System 1 can drive System 2. Before you're even aware you're reasoning, your emotions may have set you on a course of thinking that's highly skewed, especially on topics you care a great deal about. (Freud called this the subconscious.)
How do system 1's biases infiltrate system 2? The mechanism is thought to be memory retrieval -- in other words, the thoughts, images, and arguments called into one's conscious mind following a rapid emotional reaction. Memory, as embodied in the brain, is conceived of as a network, made up of nodes and linkages between them -- and what occurs after an emotional reaction is called spreading activation. As you begin to call a subject to mind (like Sarah Palin) from your long-term memory, nodes associated with that subject ("woman," "Republican," "Bristol," "death panels," "Paul Revere") are activated in a fanlike pattern -- like a fire that races across a landscape but only burns a small fraction of the trees. And subconscious and automatic emotion starts the burn. It therefore determines what the conscious mind has available to work with -- to argue with.
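The spreading-activation model described above can be sketched as a toy graph traversal. Everything here is illustrative rather than drawn from any published model: the node names, the decay factor, and the cutoff threshold are all assumptions, chosen only to show how activation fans out from a source concept and "burns out" before reaching distant nodes.

```python
from collections import deque

# Toy associative memory network. Nodes and links are illustrative
# assumptions, not empirical data about anyone's actual memory.
MEMORY = {
    "Sarah Palin": ["woman", "Republican", "Bristol",
                    "death panels", "Paul Revere"],
    "Republican": ["death panels", "Paul Revere"],
    "Bristol": ["woman"],
}

def spread_activation(start, initial=1.0, decay=0.5, threshold=0.2):
    """Fan activation out from `start`, weakening at each hop.

    Activation is halved (`decay`) every link it crosses, and a node
    stops passing activation on once the amount it would send falls
    below `threshold` -- like a fire that races across a landscape
    but only burns a fraction of the trees.
    """
    activation = {start: initial}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        passed_on = activation[node] * decay
        if passed_on < threshold:
            continue  # too weak to keep spreading from here
        for neighbor in MEMORY.get(node, []):
            # Only update a node if this path activates it more
            # strongly than any path already found.
            if passed_on > activation.get(neighbor, 0.0):
                activation[neighbor] = passed_on
                queue.append(neighbor)
    return activation
```

On this sketch, calling `spread_activation("Sarah Palin")` activates the directly linked nodes at half strength, and those are the concepts the conscious mind then has "available to work with." Which nodes light up first is exactly the lever an automatic emotional reaction would pull.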
And when we then proceed to argue, we use emotions to reinforce our beliefs -- making us even more ready to defend those beliefs the next time around. Our reasoning skills are not objective, but driven by motivations and recollections we can barely discern consciously.
Since it is fundamentally rooted in our brains, it should come as no surprise that this kind of motivated reasoning emerges when we're quite young. As children develop into adolescents, motivated reasoning develops along with them. In fact, one of the experiments showing this is memorable enough that it is worth describing in some detail.
Psychologist Paul Klaczynski of the University of Northern Colorado wanted to learn how well adolescents are capable of reasoning on topics they care deeply about. So he decided to see how they evaluated arguments about whether a kind of music they liked (either heavy metal or country) led people to engage in harmful or antisocial behavior (drug abuse, suicide, etc.). You might call it the Tipper Gore versus Frank Zappa experiment, recalling the 1980s debate over whether rock lyrics were corrupting kids and whether some albums needed to have parental labels on them.
Ninth and twelfth graders were presented with arguments about the behavioral consequences of listening to heavy metal or country music -- each of which contained a classic logical fallacy, such as a hasty generalization or tu quoque (a diversion). The students were then asked how valid the arguments were, to discuss their strengths and weaknesses, and to describe how they might design experiments or tests to falsify the arguments they had heard.
Sure enough, the students were found to reason in a more biased way to defend the kind of music they liked. Country fans rated pro-country arguments as stronger than anti-country arguments (though all the arguments contained fallacies), flagged more problems or fallacies in anti-country arguments than in pro-country ones, and proposed better evidence-based tests of anti-country arguments than for the arguments that stroked their egos. Heavy metal fans did the same.
Consider, for example, one adolescent country fan's response when asked how to disprove the self-serving view that listening to country music leads one to have better social skills. Instead of proposing a proper test (for example, examining antisocial behavior in country music listeners) the student instead relied on what Klaczynski called "pseudo-evidence" -- making up a circuitous rationale so as to preserve a prior belief:
As I see it, country music has, like, themes to it about how to treat your neighbor. So, if you found someone who was listening to country, but that wasn't a very nice person, I'd think you'd want to look at something else going on in his life. Like, what's his parents like? You know, when you've got parents who treat you poorly or who don't give you any respect, this happens a lot when you're a teenager, then you're not going to be a model citizen yourself.
Clearly this is no test of the argument that country music listening improves your social skills. "Adolescents protect their theories with a diverse battery of cognitive defenses designed to repel attacks on their positions," wrote Klaczynski. And their reasoning did not get better from ninth grade to twelfth grade. At both ages, students saw the flaws in the views of others, but not in their own.
The theory of motivated reasoning does not, in and of itself, tell us what drives us to interpret information in a biased way, so as to protect and defend our preexisting convictions. And, inevitably, humans have a great variety of motivations, ranging from passionate love to financial greed.
What's more, the motivations needn't be purely selfish. Even though motivated reasoning is sometimes also referred to as "identity-protective cognition," we don't engage in this process to defend ourselves alone. Our identities are bound up with our social relationships and affiliations -- with our families, communities, alma maters, teams, churches, political parties. Our groups. In this context, an attack on one's group, or on some view with which the group is associated, can effectively operate like an attack on the self.
That's where politics comes in. Our political, ideological, partisan, and religious convictions -- because they are deeply held enough to form core parts of our personal identities, and because they link us to the groups that bulwark those identities and give us meaning -- can be key drivers of motivated reasoning. They can make us virtually impervious to facts, logic, and reason. Anyone in a politically split family who has tried to argue with her mother, or father, about politics or religion -- and eventually decided "that's a subject we just don't talk about" -- knows what this is like, and how painful it can be.
And no wonder. If we have strong emotional convictions about something, then these convictions must be thought of as an actual physical part of our brains, residing not in any individual brain cell (or neuron) but rather in the complex connections between them, and the pattern of neural activation that has occurred so many times before, and will occur again. The more we activate a particular series of connections, the more powerful it becomes. It grows more and more a part of us, like the ability to play guitar or juggle a soccer ball.
So to attack that "belief" through logical or reasoned argument, and thereby expect it to vanish and cease to exist in a brain, is really a rather naïve idea. Certainly, it is not the wisest or most effective way of trying to "change brains," as Berkeley cognitive linguist George Lakoff puts it.
We've inherited an Enlightenment tradition of thinking of beliefs as if they're somehow disembodied, suspended above us in the ether, and all you have to do is float up the right bit of correct information and wrong beliefs will dispel, like bursting a soap bubble.
Nothing could be further from the truth. Beliefs are physical. To attack them is like attacking one part of a person's anatomy, almost like pricking his or her skin (or worse). And motivated reasoning might perhaps best be thought of as a defensive mechanism that is triggered by a direct attack upon a belief system, physically embodied in a brain.
What's most disorienting is how elaborate motivated reasoning can become, especially among those who are knowledgeable or sophisticated. If we like to pretend that politics is rational and based on reason, it's in part because we can't believe that a think tank scholar with a Ph.D. could actually be arguing from emotions as he or she rattles off facts and statistics, or composes an entire book.
However, we should be suspicious of seeming brilliance most of all. Scientists are also starting to home in on a way of explaining the elaborate heights of our capacity for rationalization -- our argumentative creativity -- and just how floridly idiotic we can be. We're not only capable of being wrong; we make quite the show of it.
One team of thinkers -- philosopher Hugo Mercier of the University of Pennsylvania and cognitive scientist Dan Sperber of the Jean Nicod Institute in France -- has recently proposed that we've been reasoning about reasoning all wrong: we've been trying to fix something that was never broken, because we misunderstood its original purpose. Contrary to the claims of Enlightenment idealists, Mercier and Sperber suggest human reason did not evolve as a device for getting at the objective truth. Rather, they suggest that its purpose is to facilitate selective arguing in defense of one's position in a social context -- something that, we can hardly dispute, we are very good at.
When thought about in the context of the evolution of human language and communication, and of cooperation in groups, this makes a lot of sense. There would surely have been a survival value in getting other people in your hunter-gatherer group to listen to you and do what you want them to do -- in short, a value to being persuasive. And for the listeners, there would have been just as much of a premium on being able to determine whether a given speaker is reliable and trustworthy, and should be heeded. Thus, everybody in the group would have benefited from an airing of different views, so that their strengths and weaknesses could be debated -- regarding, say, where the best place to hunt today might be, or whether the seasons are changing.
Considered in this light, reasoning wouldn't be expected to make us good logicians, but rather, good rhetoricians. And that's what we are. Not only are we very good at selectively cobbling together evidence to support our own case -- aided by motivated reasoning -- but we're also good at seeing the flaws in the arguments of others when they get up on top of the soap box, good at slicing and dicing their claims.
When lots of individuals blow holes in one another's claims and arguments, the reasoning of the group should be better than the reasoning of the individual. But at the same time, the individual -- or the individual in a self-affirming group that does not provide adequate challenges -- is capable of going very wrong, because of motivated reasoning and confirmation bias. Thanks to these flaws, the sole reasoner rarely sees what's wrong with his or her logic. Rather, the sole reasoner becomes the equivalent of a crazy hermit in the wilderness -- or, to quote the late Frank Zappa, the author of "that tacky little pamphlet in your Daddy's bottom drawer." And the unchallenged group member becomes like a cult follower.
Mercier and Sperber's "argumentative theory of reason" provides a strong case for supporting group reasoning processes like the scientific one, which are built around challenges to any one individual's beliefs or convictions. These processes may be the only reliable check on our going vastly astray. By the same token, the theory also suggests that if you insulate yourself from belief challenge, you leave yourself vulnerable to the worst flaws of reasoning, without deriving any of its benefits.
Knowing all of this, now look back at American politics again. The parties are like teams or tribes, with which we strongly and emotionally affiliate. And our information sources too often provide us with comfort and affirmation, rather than challenges. The game is perfectly set up for emotion and bias, with fewer and fewer checks in place to stop it.
We're still not doomed to reason poorly all the time -- to succumb to political emotions. But all the odds are stacked in favor of this outcome. It doesn't seem too bold to suggest that American politics would be a little bit healthier if we could all admit what is really going on.
This article is an adapted excerpt from The Republican Brain: The Science of Why They Deny Science and Reality (Wiley).