Facebook’s greatest strength—its ability to identify and connect like-minded people—is also a major vulnerability. Over the past month, the company has revealed that Russia-linked accounts purchased thousands of fake political ads on its platform around the 2016 U.S. election. These ads “microtargeted” Americans based on their divisions along political, racial, and religious lines. Some, as CNN recently reported, specifically targeted voters in Michigan and Wisconsin, two of the most heavily contested states.
The apparent goal was to sow distrust among voters, perhaps even shape how they voted.
As an initial response, Facebook announced that it will close the loopholes that allow Russian-backed sources—or any other foreign powers—to open fake accounts. While a productive start, this doesn’t go after the underlying problem that Russian operatives capitalized on: the extreme polarization of Americans on political issues. Wittingly or not, Facebook has taken on a central role in American democracy. Now the company has to decide how proactive it wants to be to become “a force for good,” as Mark Zuckerberg has promised.
One step Facebook could take in this direction: reverse-engineer the very algorithms used by the Russians. Facebook could try an experiment of matching Americans across political lines to help bridge the country’s deep divide.
Key to understanding why the Russian operatives’ efforts worked is looking at the way in which people build social networks online and the value they get from them. In Bowling Alone, the Harvard professor Robert Putnam uses the phrase “social capital” to describe this process, which he explains happens in two ways: “Bonding” is social capital built by connecting within exclusive homogenous groups; “bridging” is social capital built by connecting with inclusive heterogeneous groups. Both are valuable—while bonding offers support and solidarity, bridging helps people expand their perspectives and creates trust across diverse groups.
“Bonding social capital constitutes a kind of sociological superglue,” Putnam writes, “whereas bridging social capital provides a sociological WD-40.”
Facebook is primarily a mechanism for bonding, not bridging. Studies show that in the vast majority of cases, people live in self-made echo chambers on Facebook that reinforce their existing views of the world. You need look no further than the “red feeds” and “blue feeds” on any given issue to see that in general, when people connect on Facebook, they are mostly connecting with others who have similar political beliefs, educational backgrounds, and religious outlooks.
Although bridging is possible—say, when your old high-school friend who stayed local while you flew across the country for college offers to connect with you—the ability to choose your network and “hide,” “unfriend,” or even “block” people with whom you no longer want to engage makes it essentially an exclusive network. Facebook further amplifies this segregation by using data from a user’s social network and activities on the platform to custom-tailor a News Feed that aggregates posts it knows that user wants to see, often reinforcing worldviews. This insularity allowed Russia’s $100,000 investment in “dark ads” to reach roughly 10 million Americans before and after the election in discrete demographic and geographic circles.
Facebook’s emphasis on bonding over bridging also has consequences for how people build trust. The relationship researcher John Gottman has found that successful romantic relationships depend on making frequent deposits in each partner’s “emotional bank account.” Consistent positive interactions increase levels of trust in the relationship, so that when conflict arises, there are enough “reserves” in place to make a withdrawal, but still leave the relationship in a net-positive place. In fact, Gottman estimates that a relationship needs at least five positive interactions to offset every negative one and maintain equilibrium.
Applying Gottman’s “bank account” model to social relationships can help explain why it’s difficult to have meaningful disagreements on political issues. Americans today spend an average of six-and-a-half hours each day online, with almost a third of that time on social media. If their social-media diets include relatively insular circles like Facebook, their daily positive interactions are likely occurring more with people they already agree with, and less with people from groups with different perspectives. In fact, in 2017, Americans are most likely to interact with someone who holds different political views while screaming at them from the other side of a protest line, or inside an angry internet forum.
Without a way to make regular, positive deposits in social relationships that bridge political lines, every civic debate is a withdrawal without social reserves, leaving people perpetually overdrawn.
Some research supports the idea that frequent and meaningful interactions between diverse Facebook users can promote the flow of new ideas across otherwise unconnected groups. Jonny Thaw, a spokesperson for the company, pointed out a 2014 study that looked at how the platform creates the “bridging social capital” described by Putnam. The study, which was conducted by researchers unaffiliated with Facebook, found that “weaker ties” in someone’s network (like a friend of a friend, or someone with whom you would not have other offline connections) offered the platform’s users the most potential to expand their worldview, because these connections opened the door to new information and diverse perspectives.
More important, however, was the finding that the users who benefited the most from their weak social ties—in terms of expanding their outlook—were those who actively engaged in what the study’s authors call “Facebook Relationship Maintenance Behaviors,” like “responding to questions, congratulating or sympathizing with others, and noting the passing of a meaningful day.”
In other words, simply being connected to Facebook users from different backgrounds isn’t enough to make people open to new perspectives and ideas; users need to actively make deposits in each other’s social bank accounts in order to truly benefit from those diverse connections. The study notes that facilitating bridging among its users “may lie in technical features of the site that lower the cost of maintaining and communicating with a larger network of weak ties.”
This study points to some creative ways that Facebook can promote political bridging among its users—and develop some WD-40 against threats to democracy in the process. Let’s say that Facebook created a new feature called “Friend Swap” for users interested in creating connections with people outside of their political bubble. The company could use its powerful algorithms to match users with someone who, based on their individual preferences and posts, they disagree with politically, but have some things in common with personally. What’s important is that the users don’t engage over political issues, at least until they’ve had time to build some social trust. If you’re a liberal, you might not be so open to being thrown whole-hog into a conservative stranger’s feed and reading their posts from Fox News. But you may find some common ground around, say, rooting for the same sports team, or shared musical tastes or experiences, like being a veteran.
A feature like Friend Swap would selectively share only the posts of each user’s feed in an area they have in common with their political counterpart, and allow them to interact on that topic. After a trial period, the “swapped” posts might include ones on another common interest, and so on, until the users, if they eventually choose, elect to actually become “friends.” By creating connections around common interests or experiences, users would make deposits in each other’s social bank accounts over time. If they do become full-on friends, they would be more likely, at least in theory, to be open to a dialogue on differing political viewpoints with someone they’ve come to trust based on bonding in other areas. Hopefully, at the very least, they could agree to disagree while maintaining their connection, which is still a win in today’s climate.
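To make the idea concrete, here is a minimal sketch of how such matching might work. Everything here is hypothetical—the `User` fields, the lean score, and the scoring heuristic are assumptions for illustration, not anything Facebook has described:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    political_lean: float  # hypothetical score: -1.0 (left) to +1.0 (right)
    interests: set         # non-political topics the user posts about

def friend_swap_match(user, candidates, min_divide=1.0, min_shared=1):
    """Pair a user with the most promising political counterpart.

    A hypothetical heuristic: require a large political divide, require at
    least some non-political common ground, and prefer matches that maximize
    both, so there is bonding material to draw on before politics comes up.
    """
    best, best_score = None, 0.0
    for other in candidates:
        divide = abs(user.political_lean - other.political_lean)
        shared = user.interests & other.interests
        if divide >= min_divide and len(shared) >= min_shared:
            score = len(shared) * divide
            if score > best_score:
                best, best_score = other, score
    return best

alice = User("alice", -0.8, {"packers", "jazz", "veteran"})
bob = User("bob", 0.9, {"packers", "fishing"})
carol = User("carol", -0.7, {"jazz"})

# carol is too politically close to alice; bob shares a team across the divide.
match = friend_swap_match(alice, [bob, carol])
```

The design choice mirrors the article’s point: the match is anchored on the shared interest (here, a sports team), and only posts on that topic would be swapped at first.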
Of course, Friend Swap won’t be a panacea for political differences. It requires people to view online relationships—with strangers—as being valuable enough to invest their time. And the self-selection of people who opt in to this kind of feature might be the same people who would be more open to different viewpoints anyway. But even as an experiment, Friend Swap would be an opportunity for Facebook to gather data on how it can bridge its red and blue silos over shared values like civility, openness, tolerance, and respect. It would also offer a new way to connect people from politically polarized geographic regions, like the Rust Belt and the coasts.
Trying to “socially engineer” relationships, even for the purposes of political cross-pollination, might go against the grain of a company that has been built upon a principle of fierce neutrality. But Russian operatives’ attempts to use Facebook to disrupt American democracy demonstrate that neutrality no longer seems to be an option, if it ever really was one in the first place. On Yom Kippur, the Jewish day of atonement, Zuckerberg acknowledged as much, asking for forgiveness for “the ways [Facebook] was used to divide people rather than bring us together.” Facebook has the talent and the resources to help unite people in defense of democratic values, if it has the will to do it.