It’s November 6, 2016. The world is not in good shape.

After years of historic lows, oil prices have rebounded—in fact, they have rebounded too well. Gas is now fast approaching $4 per gallon. High energy costs have kicked the Chinese economy into a depression, and the United States begins hemorrhaging workers. With fear spreading, the South China Sea is getting testier. What’s more, it’s been a terrible tropical-cyclone season, and southern cities are ailing. Miami and its suburbs, specifically, might take a decade to recover from Hurricane Paula.

Amid this unease, some moderate, middle-aged white voters have started taking renewed interest in Donald Trump, the Republican candidate for president. To them, his once-ludicrous rhetoric is sounding more and more accurate. Their support still wouldn’t give him the popular vote, but it might let him take Ohio, Florida, and the electoral college.

With the election two days away, younger and urban Americans are terrified. Some are arranging ways for their Muslim friends to leave the country. That’s the atmosphere in which two senior Facebook engineers approach Mark Zuckerberg, the company’s CEO, and tell him that this whole mess can be stopped right now.

Could this happen? Would Facebook be able to single-handedly stop Donald Trump—or any other presidential candidate? It’s a question that some at Facebook appear to be asking.

* * *

At the end of every week, Zuckerberg holds an internal question-and-answer session for employees. Usually before these sessions, the company circulates a poll internally asking what concerns he should address. On March 4, as one of these polls circulated among workers, many employees voted to ask him: “What responsibility does Facebook have to help prevent President Trump in 2017?”

The poll’s existence was first reported by Gizmodo. Facebook has not commented on whether Zuckerberg addressed the question or what he said.

“Voting is a core value of democracy and we believe that supporting civic participation is an important contribution we can make to the community,” said a Facebook spokesman in response to the report. “We as a company are neutral—we have not and will not use our products in a way that attempts to influence how people vote.”

The world’s largest social network says it won’t avert a Trump presidency—but could it? In its story on the survey question, Gizmodo hypothesizes one way that the company could step in. By gradually wiping pro-Trump stories from its feed, Facebook could suffocate a campaign that has run on free media attention.

“Facebook wouldn’t have to disclose it was doing this, and would be protected by the First Amendment,” writes Michael Nunez, a Gizmodo editor.

It makes sense as a scenario, and it would be hard to track. Over the past two years, journalists have discovered the incredible power wielded by Facebook’s News Feed. The feature can divert massive amounts of money and attention to news sites. Detecting changes in how News Feed works is notoriously hard: In today’s New York Times, web publishers fret that they are rarely sure whether drops in traffic from News Feed are felt across the industry or affect only them.

But there is an easier way that Facebook (or a few rogue engineers) could change American history, and it would be even trickier to verify. Since 2008, Facebook has displayed an “I Voted!” button on every major election day.

If you tell Facebook you voted, your name and picture appear near the button when other friends view it. Facebook encourages your friends to go out and vote as well.

Social pressure like this can be quite potent, and the company has often deployed this button for experimental ends. In 2010, researchers at the University of California, San Diego, used the button and internal Facebook data to conduct a “61-million-person experiment in social influence and political mobilization.” They found that someone was 0.39 percent more likely to vote if Facebook told them that their friends had voted. Factoring in social ripple effects, they concluded that more than 340,000 additional votes were cast in that midterm election because of the “I Voted!” button.
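The scale of those numbers is worth a quick sanity check. A back-of-envelope sketch, using only the figures above (and, as a simplifying assumption, treating the full 61-million-user cohort as having received the message):

```python
# Rough arithmetic on the study's figures, as reported above.
# The 61 million users and 0.39% effect come from the article;
# applying the effect to the whole cohort is an assumption.
users_in_experiment = 61_000_000
direct_effect = 0.0039  # 0.39% more likely to vote

direct_extra_votes = users_in_experiment * direct_effect
print(f"Direct effect alone: ~{direct_extra_votes:,.0f} extra votes")
# → roughly 238,000; the study's >340,000 total adds the "ripple"
# effect on friends of users who saw the button.
```

The direct effect alone accounts for a couple hundred thousand votes; the gap up to 340,000 is what the researchers attribute to social contagion among friends.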

In the text of the study, the authors added that voter-turnout efforts like Facebook’s could have changed the outcome of the 2000 presidential election:

Voter mobilization experiments have shown that most methods of contacting potential voters have small effects (if any) on turnout rates, ranging from 1 percent to 10 percent. However, the ability to reach large populations online means that even small effects could yield behavior changes for millions of people. Furthermore, as many elections are competitive, these changes could affect electoral outcomes. For example, in the 2000 U.S. presidential election, George Bush beat Al Gore in Florida by 537 votes (less than 0.01 percent of votes cast in Florida). Had Gore won Florida, he would have won the election.

If Facebook’s effects on voter turnout are as large as this research suggests, then Facebook could easily skew the 2016 election. By selectively presenting the “I Voted!” button to some voters, for instance, it could juice turnout among reliably Democratic demographics without increasing it among their Republican counterparts. As my colleague Derek Thompson has detailed, “the single best predictor of Trump support in the GOP primary is the absence of a college degree.” Facebook knows many of our educational histories all too well. By only encouraging educated users to head to the polls—or by only inspiring urban voters in some states—it could change the contest.
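To see why even a modest, targeted nudge matters, compare it with the Florida margin the study itself cites. A sketch, where only the 537-vote margin and the 0.39 percent effect come from the article; the statewide turnout and the size of the targeted slice are illustrative assumptions:

```python
# Why a small targeted turnout boost can swamp a razor-thin margin.
# 537 votes and the 0.39% effect are from the article; the turnout
# figure and the 1-million-user slice are hypothetical.
florida_margin_2000 = 537
florida_votes_cast = 6_000_000   # assumed, consistent with "<0.01%"
targeted_users = 1_000_000       # hypothetical demographic slice
turnout_boost = 0.0039

extra_votes = targeted_users * turnout_boost
print(f"Margin as share of turnout: "
      f"{florida_margin_2000 / florida_votes_cast:.4%}")
print(f"Extra votes from a targeted nudge: ~{extra_votes:,.0f}")
```

Even under these conservative assumptions, a nudge shown to a single favorable demographic yields several times the votes that decided Florida in 2000.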

To be clear, the company has repeatedly said it has no appetite to do this. “Facebook would never try to control elections,” Sheryl Sandberg, its chief operating officer, told an Indian television network in 2014.

Jonathan Zittrain, a law and computer science professor at Harvard University who has previously written about Facebook’s electoral power, told me it was good that Facebook was now on the record about not tampering with the vote. He confirmed that no legal mechanism would prevent the company from trying it.

“Facebook is not an originator of content so much, it is a funnel for it. And because it is a social network, it’s got quite natural market dominance,” he said. With that power came a need for public concern and awareness.

Questions like these will keep coming up until Facebook and other major technology companies take their role as an “information fiduciary” more seriously, he said. “There’s no such thing as a neutral News Feed,” he told me. As a fiduciary, the platform would forswear “trying to advance its agenda over yours and over what you want to see happen in the world.”

“All of the mass media work of the ’70s and ’80s was about worrying that, ‘My god, there are only three major networks, and they’re kind of all the same!’ It’s funny—they were ringing the bell then, and it turned out not to be so much of a big deal, especially as cable came onto the scene. And now it’s like—you guys, you might want to come back!”

For now, Mark Zuckerberg seems content to intervene in the election as any other person would—by commenting on it. At Facebook’s developers conference last week, he condemned Trump without ever using his name.

“I hear fearful voices calling for building walls and distancing people they label as ‘others.’ I hear them calling for blocking free expression, for slowing immigration, for reducing trade, and in some cases even for cutting access to the Internet. It takes courage to choose hope over fear,” he said. “If the world starts turning inward, our community will just have to work harder to bring people together.”