Wallace and Farid announced the program to a group of reporters last week, and indicated that it would be ready to deploy imminently. Farid said his software would be ready within months, and Wallace said that the project’s leaders had “very collegial discussions” with social-media companies about adopting the new software. “I don’t want to get too over my skis here, but I think there’s a lot of interest,” he said.
But to hear those companies tell it, the proposal is far from the brink of adoption. Although there have been months of conversations among the platforms that are most likely to use the software, lingering questions and a history of resentment toward Wallace and his organization have thrown up roadblocks.
The conversation began in earnest in late April, when Monika Bickert, Facebook’s head of global policy management, organized a conference call with social-media companies to discuss how to deal with terrorist material on their sites. According to company representatives familiar with the discussion, Bickert shared details about a handful of tools that would flag extremist content online. Although Bickert never mentioned CEP, Wallace, or Farid by name, one of the proposals she circulated was identical to the one CEP introduced Friday.
That conversation was polite and productive, but in private discussions, some participants raised concerns about the plan—and about working with Mark Wallace, who has long been a gadfly circling social-media companies and pushing them to police their newsfeeds and timelines for extremism.
The companies are mainly concerned with how to decide which content counts as terrorist in the first place. While NCMEC’s database consists of hashes of child-pornography images that are illegal as a matter of law, it’s harder to establish exactly what constitutes extremist content. Many countries define it very broadly, using the label of “terrorist” to silence dissent or opposition.
The job of deciding what counts as extremist and what doesn’t would fall to NOREX. The images, videos, and audio clips that NOREX determines are extremist would be flagged on participating social-media sites, regardless of the context in which they were posted.
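The matching step itself is mechanical, which is the source of the context problem critics raise below. A minimal sketch, assuming a hypothetical database of hashes of known extremist media (real deployments use robust “perceptual” hashes that survive resizing and re-encoding; a plain cryptographic hash is used here only to keep the example self-contained):

```python
import hashlib

# Hypothetical hash database, analogous in structure to NCMEC's
# database of child-porn hashes. The entries here are invented
# placeholders, not real data.
flagged_hashes = {
    hashlib.sha256(b"known extremist video bytes").hexdigest(),
}

def is_flagged(upload: bytes) -> bool:
    """Return True if the uploaded content matches the database.

    The match is purely mechanical: it fires on identical content
    no matter who posted it or in what context -- a news report and
    a recruiting post containing the same clip look the same."""
    return hashlib.sha256(upload).hexdigest() in flagged_hashes

# A re-upload of flagged content matches; novel content does not.
print(is_flagged(b"known extremist video bytes"))  # True
print(is_flagged(b"an unrelated home movie"))      # False
```

Because the lookup sees only the bytes, distinguishing reporting or parody from endorsement has to happen somewhere else in the pipeline, which is the gap the ACLU’s objection targets.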
“Unlike child pornography—which is itself illegal—identifying certain types of speech requires context and intent,” said Lee Rowland, a senior staff attorney at the American Civil Liberties Union. “Algorithms are not good at determining context and tone like support or opposition, sarcasm or parody.”
(Wallace says the proposed system would have some sort of appeals process, whereby a user notified that his or her post was flagged could submit a counterclaim for human review. Farid also proposed exempting some accounts—those operated by media companies, for example—that could post extremist content without repercussion for educational or news purposes.)