How Much Should You Know About the Way Facebook Works?

"It's the trickiest question in some ways because informed consent is a really important principle, the bedrock of a lot of social science, but it can be waived when the intervention, the test, is minimally risky and below a threshold risk," Hancock told me. "Informed consent isn't the be-all end-all of how to do social science at scale, especially when corporations are involved."

Instead, Hancock suggests, perhaps it makes more sense to debrief users after an experiment has taken place. Such notice might link to more information about the study, and offer contact information for researchers or an ombudsman rather than "bombarding people upfront with requests for consent."

But beyond that, Hancock insists, opting out ahead of time simply may not be an option. When algorithms are everywhere, and when companies are constantly refining them in different ways and for various reasons, what would obtaining consent even look like?

"If you think about Google search and you say, 'I don't want to be experimented on,' then the question is, well, what does that mean?" Hancock said. "Google search is presumably being tested all the time ... They're constantly needing to tweak their algorithm. If I say, 'I want to opt out of that,' does that put me back to Google search 2004? Once you say 'I'm opting out,' you're stuck with that version of that algorithm.

"And that's the thing," he continued. "Once you start thinking about it, how does an opt-out situation even work? These companies are innovating on a weekly basis, maybe faster, and how do we allow people to opt out? Do we allow an inferior product to go out?"

* * *

Of course, there is a clear difference between the mysterious mechanics of Google's search algorithm and the deliberate filtering of a News Feed to manipulate emotions. How much does algorithmic intent matter?

"No, not all algorithms are equal," said Nicholas Diakopoulos, a computational journalism fellow at the Tow Center for Digital Journalism. Diakopoulos says he didn't have a problem with the Facebook study—the outrage was over the top, he says—but he sees "some things they could have done better."

"If we can agree that we're beyond the state where we can expect people to have informed consent for every little A/B test, I would like to see people have debriefing ... so that they know they were manipulated," Diakopoulos told me.

There is little consensus about this in the scientific community. Last month, Kate Crawford—a principal researcher at Microsoft—argued in these pages that users should be able to opt in to experimental groups. "It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science," Crawford wrote.

And arguments about the Facebook study have ranged far beyond questions of consent. Some of the research's defenders have argued that insisting social publishers ask permission before showing you different content betrays a misunderstanding of how the Internet actually works. Everyone knows that filters are imposed on information streams online, goes the argument. Indeed, filters are part of the Internet. To alter them is to alter the web.

Writer Tim Carmody pushed back against these ideas in a blog post earlier this month:

[Arguments like this are] all too quick to accept that users of [Facebook and OK Cupid] are readers who've agreed to let these sites show them things. They don't recognize or respect that the users are also the ones who've made almost everything that those sites show. They only treat you as a customer, never a client. […] Ultimately, [they] ought to be ashamed to treat people and the things they make this way.

It's not A/B testing. It's just being an asshole.

If people can't agree on what constitutes harmless A/B testing versus a serious breach of basic human rights, how do we move forward?

Determining a workable ethical framework is complex enough in a single industry, let alone across several disciplines interacting within algorithmic systems that are hidden from almost everyone who encounters them.

"I just think it's fascinating, the conflation of all these different areas of society," Diakopoulos said. "Whether it's business, or data journalism, they're colliding with the scientific method and quantification. What are the boundaries? When do we have to respect one set of ethics or set of expectations over another?"
