There is no evidence for these accusations. There are no legitimate studies supporting these contentions. There is no documentation of company officials ordering up anti-conservative bias or policies.
But to say there is no evidence for these accusations is too weak. These complaints are just false. Coming from smart people who know better—smart people like Cruz, the first U.S. presidential candidate to hire Cambridge Analytica and try to use its trove of personal Facebook data on millions of Americans—this looks like an intentionally duplicitous move.
Cruz knows that conservatives need Facebook and Google and that they benefit greatly from the algorithmic amplification that occurs in both systems. Trump’s 2020 campaign manager is Brad Parscale, who ran digital operations for the president’s successful 2016 campaign. Parscale has declared that his mastery of Facebook, which he used for advertising, for amplifying pro-Trump videos and memes, and for fundraising, won the 2016 election.
Scholarship supports this conclusion. As the sociologist Jen Schradie demonstrates in great detail in her new book, The Revolution That Wasn’t: How Digital Activism Favors Conservatives, Facebook and Google work better for top-down, well-funded, disciplined, directed movements. Those adjectives tend to describe conservative groups more than liberal or leftist groups in the United States. In our current media ecosystem, right-wing sources of news and propaganda spread much further and faster than liberal or neutral sources do, according to a rigorous quantitative study of communication-network patterns by Yochai Benkler, Robert Faris, and Hal Roberts at Harvard’s Berkman Klein Center for Internet & Society. Internet platforms are demonstrably not silencing conservative ideas. If anything, the opposite is true.
No algorithm is neutral. Facebook and Google are biased, but in a way that has nothing to do with American political ideologies or parties. Instead, both of these global systems favor content that generates strong emotional reactions from users—clicks, shares, likes, and comments. There is a clear commercial reason for this design choice. It keeps users hooked, ready to click on more advertisements and thus generate more revenue for the platform.
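To make that design choice concrete, here is a deliberately simplified, hypothetical sketch, not Facebook's or Google's actual code, of what ranking for engagement rather than for ideology looks like: posts are scored purely on predicted clicks, shares, likes, and comments, so whatever provokes the strongest reaction rises to the top, regardless of its politics. All the names and weights below are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    predicted_clicks: float    # hypothetical engagement predictions
    predicted_shares: float
    predicted_likes: float
    predicted_comments: float


def engagement_score(post: Post) -> float:
    # Toy objective: weight each predicted reaction. Nothing here inspects
    # the post's ideology, only how strongly people are expected to react.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_shares
            + 1.5 * post.predicted_likes
            + 2.0 * post.predicted_comments)


def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed surfaces whatever is predicted to provoke the most reaction,
    # which is the commercial logic described above.
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("Calm policy explainer", 2.0, 0.1, 1.0, 0.2),
        Post("Outrage-bait rumor", 9.0, 4.0, 3.0, 6.0),
    ])
    for post in feed:
        print(f"{engagement_score(post):6.1f}  {post.text}")
```

In this toy model, the outrage-bait item wins not because anyone ordered up a bias, but because the only signal being optimized is reaction.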
The media scholar Zeynep Tufekci has explained how YouTube’s recommendation engine sends viewers down rabbit holes of extremism, because passive people like to be prodded to feel something, whether that’s anger, humor, joy, fear, or hatred. And I have described all the ways Google and Facebook move us (and we move them) to anger us, divide us, distract us, and undermine our ability to function as citizens of a democratic republic.
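A similarly stripped-down, hypothetical sketch of the dynamic Tufekci describes: if a recommender always offers something slightly more arousing than what was just watched, a passive viewer ends up at the most extreme content available without ever asking for it. The "intensity" scores and the greedy rule below are invented stand-ins, not YouTube's actual system.

```python
# Toy model of an engagement-driven recommender. Videos carry an "intensity"
# score standing in for how strongly they provoke viewers.

def recommend_next(current_intensity: float, catalog: list[float]) -> float:
    # Offer the video slightly more intense than the current one; once
    # nothing stronger remains, keep serving the most intense item.
    stronger = [v for v in catalog if v > current_intensity]
    return min(stronger) if stronger else max(catalog)


def simulate_session(start: float, catalog: list[float], steps: int) -> list[float]:
    path = [start]
    for _ in range(steps):
        path.append(recommend_next(path[-1], catalog))
    return path


if __name__ == "__main__":
    catalog = [0.1, 0.2, 0.35, 0.5, 0.7, 0.85, 0.95]  # mild ... extreme
    print(simulate_session(start=0.1, catalog=catalog, steps=6))
    # A viewer who starts with mild content ends the session at the most
    # extreme item, without ever choosing extremity directly.
```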
In short, Facebook is a remarkable tool for motivation. It’s a terrible platform for deliberation. Democratic citizenship demands both motivation of the like-minded and deliberation among those with different ideas and agendas. And Google is a terrible tool for discerning truth from falsity. It’s even worse at separating relevant information from trivia, and it’s not a good source for achieving depth of understanding about a diverse and changing world.