Morality is “like the temple on the hill of human nature,” writes the social psychologist Jonathan Haidt. “It is our most sacred attribute.” People cherish this sacred sense of right and wrong, put it on a pedestal and surround it with spears, to defend it against attacks. The nearness and dearness of people’s morals means that conflict becomes particularly entrenched when morality gets involved—neither side wants to yield sacred ground.

It doesn’t help that most people think they are more virtuous than others. Many studies over the years have found what’s called a “better-than-average” effect—that when asked to compare themselves to the average person, most people will say they’re smarter, friendlier, more competent, etc. A recent study found that this effect is much more extreme for moral traits like honesty and trustworthiness.

“Everyone viewed themselves as though they were at the top of the scale,” says Ben Tappin, a graduate student in psychology at Royal Holloway, University of London, and an author of the study. The study went on to say that this makes people’s self-inflated morality more irrational than their bumped-up views of their intelligence or friendliness. In the latter two realms, there was more variability—one person might think they were a little smarter than average, another might think they were a genius, another might think they were a little below average.

“Most people consider themselves paragons of virtue; yet few individuals perceive this abundance of virtue in others,” Tappin and his co-author write in the study. Perhaps, they posit, this could be because it’s evolutionarily advantageous not to trust someone you don’t know—better to assume they wouldn’t act as morally as you would, to protect yourself.

Groups have their own sense of moral superiority. “I’m quite confident that we walk around all the time with a feeling that our group is morally superior to the other group,” says Haidt, a professor of ethical leadership at New York University. “We hate them. It’s important that we show constantly how much better our side is, and anything the other side does, we will take in the worst possible light.”

Moral superiority and moral tribalism were on full display in the recent U.S. presidential election. Who someone was going to vote for was often cast as a moral decision. Donald Trump did his best to make Hillary Clinton seem like an immoral choice, repeatedly calling her a liar and “Crooked Hillary.” The Democrats called Trump out on his many lies, but also demonized the people planning to vote for him, as in Clinton’s famous dismissal of them as a “basket of deplorables.” Many times, Clinton’s message seemed to be not only “if you’re not with us, you’re against us,” but “if you’re not with us, you’re a bad person.” Michelle Obama, in a powerful speech that was as much arguing against Trump as it was arguing for Clinton, summed up the campaign: “This isn’t about politics,” she said. “It’s about basic human decency. It’s about right and wrong.”

Part of why it’s easy for anyone to see themselves, or the groups they belong to, as super moral is that morality itself is a vague concept. “You can have one person, for instance, who cares very deeply for their friends and family and would go to the ends of the earth for these people,” Tappin says. “And yet they don’t, say, give a dime to foreign charity. And then you’ve got another person who spends their entire life donating money overseas, yet in their interpersonal life, perhaps they don’t treat their family members very well. In those cases, how do you compare who’s more moral? It seems quite impossible to judge and it’s just at the mercy of people’s preferences.”

Haidt’s work identifies six different moral metrics—liberty, fairness, loyalty, authority, care, and purity. Different groups and cultures prefer to emphasize these domains to different degrees. For example, people in Eastern countries tend to emphasize purity and loyalty more than people in Western countries. People who live in countries where there has historically been higher prevalence of disease also place a higher value on purity, as well as loyalty and authority. In the United States, liberals tend to focus mostly on care, fairness, and liberty, while conservatives generally emphasize all six domains. Other research shows that people rate the moral values a group holds as the most important characteristic affecting whether they’re proud to be a member of the group, or more likely to distance themselves from it.

It’s possible for groups with different moralities to get along. Marilynn Brewer, a social psychologist and professor emeritus at Ohio State University, says some groups adopt a posture of moral tolerance—a sense that “our ways are good for us, and their ways are good for them,” even if everyone privately thinks their ways are more moral. She’s observed this in research she’s done on ethnocentrism among tribal groups in East Africa.

But “as ingroups become larger and more depersonalized, the institutions, rules, and customs that maintain ingroup loyalty and cooperation take on the character of moral authority,” Brewer writes in a 1999 paper. “When the moral order is seen as absolute rather than relative, moral superiority is incompatible with tolerance for difference.”

“Different” gets coded as “immoral,” and that’s where the trouble begins. This intolerance can manifest as contempt, segregation, and avoidance. But it can also escalate, especially when it isn’t possible for groups to stay separated. “Social changes that give rise to the prospect of close contact, integration, or influence, are sufficient to kindle hatred, expulsion, and even ‘ethnic cleansing,’” Brewer writes.

Brewer also notes that it particularly behooves groups that are already advantaged or powerful to emphasize and exaggerate their supposed superiority so that they can remain in power. “Moral superiority becomes a mechanism of preserving advantage,” she says. This is true even if the feeling of moral superiority doesn’t lead to outright hostility to other groups. Preferring members of your own group means that they get the trust, generosity, and benefit of the doubt that outsiders don’t. Blatant hate isn’t always necessary. The absence of positive feeling toward a group leaves a negative space, and bigotry often rushes in to fill it.

Morality becomes a justification that fuels these broad divides between groups—political groups, religious groups, racial groups, even nations. But this also occurs on a more granular level. Humans don’t tend to carefully reason through scenarios before coming to a moral judgment about them, according to Haidt. Rather, their guts tell them something is right or wrong, and then they go back to use reason to justify that conclusion. And there’s a good amount of teamsmanship at play in that.

“Moral reasoning,” Haidt writes in his book The Righteous Mind, is “a skill we humans evolved to further our social agendas—to justify our own actions and to defend the teams we belong to.”

“During any sort of conflict, we go to battle stations,” he told me. “And the goal is not to find truth, it’s to knock down everything the other side throws at you, and to try to throw things at the other side.”

That makes any conflict where the two parties both feel morality is on their side almost impossible to solve. A strong moral conviction that something is right or wrong is experienced in the same way as a fact, according to research by Linda Skitka—as an objective, irrefutable truth.

“When you have a strong moral conviction about something, it really is pretty much akin to your belief that 2+2 = 4,” says Skitka, a professor of psychology at the University of Illinois at Chicago. “Can you imagine somebody being able to persuade you off of that conclusion?”

This can happen with any issue—“there are not, by definition, moral issues,” Skitka says. The debate about abortion, for example, is very divisive and may be moralized for many people, but not everyone has a strong moral conviction about abortion. But there are issues that tend to be more moralized, on average, than others, and some things only become moralized over time.

Smoking is a good example of this. Before the 1964 surgeon general’s report made the health hazards un-ignorable, whether someone smoked cigarettes or not wasn’t seen as much of a moral quandary. In the decades that followed the revelation that smoking can kill you, research shows that attitudes toward it shifted. People became more disgusted, liked smoking less, and saw smoking more and more as an immoral behavior.

For many issues, moral divides are also political divides. In the U.S., liberals and conservatives are equally morally convicted about most issues Skitka and her colleagues have looked at, including same-sex marriage, welfare, capital punishment, surveillance, social security, and taxes. (That’s not to say they feel the same way, just that they feel the same amount of moral conviction about these issues, on average.) There are a few issues that liberals feel more moral conviction about—climate change, the environment, health care, education, income inequality, and gender inequality—and a few that conservatives feel more moral conviction about—immigration, gun control, abortion, states’ rights, physician-assisted suicide, the federal deficit, and the federal budget.

How morally convicted someone feels about an issue (or a political candidate) predicts some political attitudes and behaviors. People who are more morally convicted about a cause are more likely to participate in activism for that cause. They are more likely to vote when there is a candidate they feel a strong moral tie to, or when an issue they’re morally convicted about is at stake.

What’s more, when an issue is moralized for someone, when they believe there is a right and a wrong outcome, they care more about getting the “right” outcome than how it is achieved. Authorities such as the Supreme Court are seen as less legitimate if they are out of step with someone’s moral ideology. If the system comes to the morally wrong answer, it’s taken as a sign that the system is broken.

In one study by Skitka and her colleagues, participants were put into small groups to discuss potential solutions to an assigned conflict—the death penalty, whether abortion should be legal, or whether universities should have mandatory testing as a graduation requirement. The first two topics were moral hot-buttons, while testing was a non-moral control. Some of the groups were constructed so that everyone had the same position on the topic, and some had a mix of positions. The groups that discussed a moral issue without already agreeing were the least cooperative and reported feeling the least goodwill toward each other. And those who discussed a moral issue, whether they were on the same side or not, were less likely to come to a consensus on a solution.

“People not only do not like morally objectionable policies or decisions,” Skitka writes in a review of the literature, “they do not trust democratic processes to decide these issues in the first place.” And if democratic processes and legal systems aren’t delivering the morally right answer, studies show that people are more willing to endorse the use of violence or vigilantism, if it leads to their desired outcome.

How, then, to bridge moral divides, whether they are the chasms between groups who feel morally superior to each other, or schisms over issues? The post-election United States feels like a parched landscape fissured with these divides.

“I think people have to give a certain amount of goodwill to others if they disagree,” Tappin says.

But “the trouble with perspective-taking is people have to be motivated to do it,” Brewer says. If both sides in a conflict aren’t willing to see the other’s perspective, tensions are unlikely to ease, particularly if a powerful group is disinclined to take the perspective of a marginalized group. In a recent piece reflecting on her time covering the 2016 U.S. election, the NPR reporter Asma Khalid wrote of hateful comments voters made to her because she’s a Muslim, people who shouted slurs or explained to her why they thought a Muslim ban was a good idea. “Everywhere I went, I tried to understand voters’ frustrations and empathize with their concerns,” she writes. “But the reality is — empathy isn’t always reciprocated.”

And as my colleague Vann Newkirk recently wrote: “Civility is not the highest moral imperative—especially in response to perceived injustices.”

The president-elect’s positions—denying climate change, supporting mass deportation of immigrants, appearing to consider requiring Muslims to register with the government—have many people locked in their battle stations already, fighting for their moral sacred ground. In the years to come, if supporters and opponents of these and other issues both feel morality is on their side, the research suggests there will be more conflict than solutions.

“How to de-moralize people’s political attitudes is probably one of the million dollar important questions that we need to investigate going forward,” Skitka says. “We have tried so many different things in the lab to get people to de-moralize something. It’s really, really hard.”

But once difference gets coded as “immoral,” the tension is nigh impossible to defuse. Or at least, research has yet to find a way. “I don’t know how one undoes it, once this moralizing of difference has happened,” Brewer says. “Other than, you know, a Martian invasion. Something more different comes along that makes you realize you have some similarities.”