New research from London suggests we have different brain structures based on our political leanings
Last week's Congressional brinksmanship over the budget illustrated, once again, just how polarized the different camps in Congress have become. Granted, some amount of the distance between the public stances legislators took can be explained by a combination of maneuvering for votes back home and posturing for political gain in the constant power struggle that is Washington. But still. Watching the two sides argue, it was clear that they didn't just differ on details. There are entirely different worldviews behind each camp's budget proposals ... different enough that one might wonder if they're really all experiencing the same reality.
Well, according to neuroscientists in Britain ... they might not be.
In a report published last Thursday, neuroscience researchers from the Institute of Cognitive Neuroscience at University College London announced that they had found evidence that liberals and conservatives actually have different brain structures.
Cognitive neuroscientist Dr. Ryota Kanai and colleagues conducted MRI scans of 118 college students whose self-reported political views ranged from "very liberal" to "very conservative." Many areas of the subjects' brains showed no difference based on political orientation. But the subjects classifying themselves as "liberal" had a higher volume of gray matter in the anterior cingulate cortex of their brains than study participants who classified themselves as "conservative." The anterior cingulate cortex is believed to play a role in helping people cope with and sort through uncertainty and conflicting information, as well as affecting their levels of emotional awareness and empathy. The "conservative" participants, on the other hand, had a higher volume of gray matter in the right amygdala region -- which is thought to play a big role in identifying and responding to threats.
The brain is incredibly complex, of course, and we are still only in the baby stages of understanding how and why it works the way it does. But in theory, someone with a larger amygdala would very likely be quicker to see threats and feel fear, whereas someone with a smaller amygdala but larger anterior cingulate cortex, given the same stimuli, would be more likely to consider other possibilities or explanations for those stimuli. The "larger anterior cingulate cortex" group would also be more likely to look at people the first group saw as threatening and see, instead, people in need of a helping hand.
This is not the first time researchers have looked for physiological or psychological underpinnings for our political viewpoints or worldviews. In his 2009 Atlantic article about the longitudinal Grant Study that followed 268 Harvard students throughout their lives, Joshua Wolf Shenk reported that "personality traits assigned by the psychiatrists in the initial interviews largely predicted who would become Democrats (descriptions included 'sensitive,' 'cultural,' and 'introspective') and Republicans ('pragmatic' and 'organized')."
Indeed, Kanai said the MRI research was sparked by other recent psychological studies that found correlations between participants' functional behavior (accurately sorting through conflicting information, recognizing threats) and their stated political beliefs. In the MRI-based study, Kanai said, "We show that this functional correlate of political attitudes has a counterpart in brain structure."
But what does that mean? Are we hard-wired to disagree with each other from birth, because our brains process data from the world in fundamentally different ways? That question remains to be answered. It's possible that brain structure is set early, but it's also possible that it's influenced by experiences and environment. Kanai and his colleagues note in the report that other research efforts have already shown that brain structure "can exhibit systematic relationships with an individual's experiences and skills," and "can change after extensive training." And people certainly have been known to change their worldviews as they get older.
Clearly, Kanai and his colleagues are just scratching the surface of a very complex subject. But their research does raise some interesting questions. If experience does, in fact, influence brain structure, could a person exposed to high levels of legitimate threats over time develop a larger right amygdala to better respond to them? In other words, if you took someone who was a professed liberal and sent them to the front lines in Afghanistan for three years, would they return with a larger right amygdala, developed from an urgent need to identify and respond to threats every day? And along with that change in brain structure, would their political views shift to the right, as well? And what about children raised in a war zone? Do a great number of them end up with large right amygdalas? And, in turn, does that make them more likely to see the world in terms of threats and more absolute answers, with less tolerance for conflicting explanations or information, and less ability to feel empathy? If so, it might go a long way to explaining some of the entrenched positions in, say, the Israeli-Palestinian conflict.
Of course, that still doesn't explain people who've lived fairly secure lives but still see the world in terms of threats to be defended against, or people who've grown up in the middle of chaos and conflict and become peacemakers, overflowing with empathy and tolerance of conflicting complexity, even to a fault. What's more, few of us in mid-life see the world in as absolutely black-and-white clear terms as we did when we were 20. So another interesting follow-up would be to do a longitudinal study of brain structure over people's lifetimes, to see how those areas change. In fact, Kanai and his colleagues say as much in their report. "It requires a longitudinal study," the researchers conclude, "to determine whether the changes in brain structure that we observed lead to changes in political behavior or whether political attitudes and behavior instead result in changes of brain structure."
In any event, the University College study provides some biological evidence for an important point: namely, that all of us see the world through lenses. None of us has a completely objective view of reality or truth -- a point that all of us would do well to remember. Imagine, for example, the difference in tone the debates in Congress might have if every legislator began by saying, "I recognize that I may view the same data differently than my colleagues because of the particular lenses or biases I have. But this is what I believe..."
Would it make a difference in the outcome? Possibly not. But somewhere in the recognition that our take on any given situation is not the only view, or the "right" or "obvious" or "logical" or "objective" view, but only our point of view ... lie the seeds for a more open, civil, and productive discussion.
But then, of course, that's just my point of view.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 Mississippi's per-capita income had reached 58 percent of Connecticut's.
As the public’s fear and loathing surge, the frontrunner’s durable candidacy has taken a dark turn.
MYRTLE BEACH, South Carolina—All politicians, if they are any good at their craft, know the truth about human nature.
Donald Trump is very good, and he knows it better than most.
Trump stands alone on a long platform, surrounded by a rapturous throng. Below and behind him—sitting on bleachers and standing on the floor—they fill this city’s cavernous, yellow-beige convention center by the thousands. As Trump will shortly point out, there are a lot of other Republican presidential candidates, but none of them get crowds anything like this.
Trump raises an orange-pink hand like a waiter holding a tray. “They are not coming in from Syria,” he says. “We’re sending them back!” The crowd surges, whistles, cheers. “So many bad things are happening—they have sections of Paris where the police are afraid to go,” he continues. “Look at Belgium, the whole place is closed down! We can’t let it happen here, folks.”
Live in anticipation, gathering stories and memories. New research builds on the vogue mantra of behavioral economics.
Forty-seven percent of the time, the average mind is wandering. It wanders about a third of the time while a person is reading, talking with other people, or taking care of children. It wanders 10 percent of the time, even, during sex. And that wandering, according to psychologist Matthew Killingsworth, is not good for well-being. A mind belongs in one place. During his training at Harvard, Killingsworth compiled those numbers and built a scientific case for every cliché about living in the moment. In a 2010 Science paper, he and psychology professor Daniel Gilbert wrote that "a wandering mind is an unhappy mind."
For Killingsworth, happiness is in the content of moment-to-moment experiences. Nothing material is intrinsically valuable, except in whatever promise of happiness it carries. Satisfaction in owning a thing does not have to come during the moment it's acquired, of course. It can come as anticipation or nostalgic longing. Overall, though, the achievement of the human brain to contemplate events past and future at great, tedious length has, these psychologists believe, come at the expense of happiness. Minds tend to wander to dark, not whimsical, places. Unless that mind has something exciting to anticipate or sweet to remember.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
A Chicago cop now faces murder charges—but will anyone hold his colleagues, his superiors, and elected officials accountable for their failures?
Thanks to clear video evidence, Chicago police officer Jason Van Dyke was charged this week with first-degree murder for shooting 17-year-old Laquan McDonald. Nevertheless, thousands of people took to the city’s streets on Friday in protest. And that is as it should be.
The needlessness of the killing, captured on dash-cam video, is clear and unambiguous.
Yet that dash-cam footage was suppressed for more than a year by authorities citing an investigation. “There was no mystery, no dead-end leads to pursue, no ambiguity about who fired the shots,” Eric Zorn wrote in The Chicago Tribune. “Who was pursuing justice and the truth? What were they doing? Who were they talking to? With whom were they meeting? What were they trying to figure out for 400 days?”
Better-informed consumers are ditching the bowls of sugar that were once a triumph of 20th-century marketing.
Last year, General Mills launched a new product aimed at health-conscious customers: Cheerios Protein, a version of its popular cereal made with whole-grain oats and lentils. Early reviews were favorable. The cereal, Huffington Post reported, tasted mostly like regular Cheerios, although “it seemed like they were sweetened and flavored a little more aggressively.” Meanwhile, ads boasted that the cereal would offer “long-lasting energy” as opposed to a sugar crash.
But earlier this month, the Center for Science in the Public Interest sued General Mills, saying that there's very little extra protein in Cheerios Protein compared to the original brand and an awful lot more sugar—17 times as much, in fact. So why would General Mills try to market a product as containing protein when it's really a box full of carbs and refined sugar?
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
Bill Gates has committed his fortune to moving the world beyond fossil fuels and mitigating climate change.
In his offices overlooking Lake Washington, just east of Seattle, Bill Gates grabbed a legal pad recently and began covering it in his left-handed scrawl. He scribbled arrows by each margin of the pad, both pointing inward. The arrow near the left margin, he said, represented how governments worldwide could stimulate ingenuity to combat climate change by dramatically increasing spending on research and development. “The push is the R&D,” he said, before indicating the arrow on the right. “The pull is the carbon tax.” Between the arrows he sketched boxes to represent areas, such as deployment of new technology, where, he argued, private investors should foot the bill. He has pledged to commit $2 billion himself.
It was widely seen as a counter-argument to claims that poor people are "to blame" for bad decisions and a rebuke to policies that withhold money from the poorest families unless they behave in a certain way. After all, if being poor leads to bad decision-making (as opposed to the other way around), then giving cash should alleviate the cognitive burdens of poverty, all on its own.
Sometimes, science doesn't stick without a proper anecdote, and "Why I Make Terrible Decisions," a comment published on Gawker's Kinja platform by a person in poverty, is a devastating illustration of the Science study. I've bolded what I found the most moving, insightful portions, but it's a moving and insightful testimony all the way through.