"It's the trickiest question in some ways because informed consent is a really important principle, the bedrock of a lot of social science, but it can be waived when the intervention, the test, is minimally risky and below a threshold risk," Hancock told me. "Informed consent isn't the be-all end-all of how to do social science at scale, especially when corporations are involved."
Instead, Hancock suggests, perhaps it makes more sense to debrief users after an experiment has taken place. Such notice might link to more information about the study, and offer contact information for researchers or an ombudsman rather than "bombarding people upfront with requests for consent."
But beyond that, Hancock insists, opting out ahead of time simply may not be an option. When algorithms are everywhere, and when companies are constantly refining them in different ways and for various reasons, what would obtaining consent even look like?
"If you think about Google search and you say, 'I don't want to be experimented on,' then the question is, well, what does that mean?" Hancock said. "Google search is presumably being tested all the time ... They're constantly needing to tweak their algorithm. If I say, 'I want to opt out of that,' does that put me back to Google search 2004? Once you say 'I'm opting out,' you're stuck with that version of that algorithm.
"And that's the thing," he continued. "Once you start thinking about it, how does an opt-out situation even work? These companies are innovating on a weekly basis, maybe faster, and how do we allow people to opt out? Do we allow an inferior product to go out?"
* * *
Of course, there is a clear difference between the mysterious mechanics of Google's search algorithm and the deliberate filtering of a News Feed to manipulate emotions. How much does algorithmic intent matter?
"No, not all algorithms are equal," said Nicholas Diakopoulos, a computational journalism fellow at the Tow Center for Digital Journalism. Diakopoulos says he didn't have a problem with the Facebook study—the outrage was over the top, he says—but he sees "some things they could have done better."
"If we can agree that we're beyond the state where we can expect people to have informed consent for every little A/B test, I would like to see people have debriefing ... so that they know they were manipulated," Diakopoulos told me.
There is little consensus about this in the scientific community. Last month, Kate Crawford—a principal researcher at Microsoft—argued in these pages that users should be able to opt in to experimental groups. "It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science," Crawford wrote.
And arguments about the Facebook study have ranged far beyond questions of consent. Some of the research's defenders have said that insisting social publishers ask permission before showing you different content betrays a misunderstanding of how the Internet actually works. Everyone knows that filters are imposed on information streams online, goes the argument. Indeed, filters are part of the Internet. To alter them is to alter the web.