How Much Should You Know About the Way Facebook Works?

An author of the infamous emotional-contagion study says it might not be reasonable to expect informed consent for social experiments online.

Every semester, Cornell professor Jeff Hancock asks his students to complete an experiment. First, he has them all Google the same search term. Then, he asks each student to turn to the classmate on their right or left and compare the results on their screens.

What his students inevitably find, and what stuns many of them, he says, is how feeding Google an identical phrase can yield wildly different results. "They think your Google search is an objective window into the world," Hancock told me. "And they don't have a sense that they're algorithmically curated."

Daily life is filthy with algorithms. Online, personalized ads and filtered streams of information change based on where we go and what we click. Offline, we receive coupons tied to our past shopping behaviors and credit-card offers based on financial history. The invisible systems that are built around our mined personal information and tracked activities are everywhere, generating data doppelgangers that can look frighteningly like us or hilariously divergent from the versions of ourselves we think we know.

But the traceless infrastructure that hides and surfaces the information all around us is only as scary as any major shift in technology that came before, argues Hancock, who found himself at the center of a research controversy earlier this summer. Hancock co-authored a now-infamous study about a secret Facebook experiment that he and other researchers constructed to study emotional contagion. The work involved changing what users saw in their News Feeds as a way to manipulate their emotional states.
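To make the mechanism concrete: the study worked by reducing how often posts with a particular emotional tone appeared in a user's feed. Below is a minimal sketch of that idea in Python. It is an illustration only, not the study's code or Facebook's system; the word lists, the sentiment scoring, and the drop rate are all hypothetical stand-ins for whatever real classifier and parameters were used.

```python
# Toy sketch of tone-based feed filtering: suppress a fraction of posts
# carrying one emotional tone before they reach a user's feed.
# Illustrative only -- not the study's code or Facebook's system; the
# word-list "sentiment" below is a crude stand-in for a real classifier.
import random

POSITIVE_WORDS = {"great", "happy", "love", "wonderful"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "awful"}

def sentiment(post: str) -> str:
    """Crude word-list label: 'positive', 'negative', or 'neutral'."""
    words = set(post.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filtered_feed(posts: list[str], suppress: str,
                  drop_rate: float = 0.5, seed: int = 0) -> list[str]:
    """Return the feed with a fraction of posts of one tone omitted."""
    rng = random.Random(seed)
    # Keep a post unless it has the suppressed tone AND a coin flip
    # (weighted by drop_rate) says to drop it.
    return [p for p in posts
            if sentiment(p) != suppress or rng.random() > drop_rate]

posts = ["What a wonderful day", "Feeling sad today",
         "Lunch was fine", "This traffic is terrible"]
print(filtered_feed(posts, suppress="negative"))
```

Run on the sample posts, the sketch drops negative items from the feed roughly half the time. That is the shape of the intervention the study describes, though nothing like its scale.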

When news of the study spread in June, people were outraged. (Hancock says he was still receiving physical threats in response to the study as recently as last week.) Hancock now says he's prioritizing conversations—with academics, policy makers, and others—to move forward so that people "don't feel wronged or upset" by this kind of work. It's a process he expects will take years. The emails he gets these days are still angry, but the stream of them has slowed.

I first asked Hancock to talk about the experiment back in June, but he wanted to wait until some of the media attention waned. We spoke for the first time this week.

* * *

"You have this algorithm which is a weird thing that people don't really understand," Hancock told me. "And we haven't discussed it as a society very much."

Hancock is still reluctant to draw too many conclusions about what he learned in the aftermath of the Facebook study. He declined to talk on-record about what he might do differently next time, or to detail advice he'd have for someone conducting a similar experiment.

One of Hancock's main areas of research has to do with "deception and its detection," according to his university website, a detail that people have asked him about, he says. "'You study deception and obviously you were super deceptive in this study.' That has come up in a few emails," he said.

"There is a trust issue around new technologies," he continued, "It goes back to Socrates and his distrust of the alphabet, [the idea that] writing would lead to us to become mindless ... It's the same fear, I think. 'Because I can't see you, you're going to manipulate me, you're going to deceive me.' There could be a connection there where there's a larger trust issue around technology." For now, Hancock says, he just wants to better grasp the way that people are thinking about algorithms. Understanding expectations will help him and others figure out the ethical ways to tinker with the streams of information that reach them.

"For me, since the Facebook study controversy and the reaction, we've just started asking in the lab, 'Well, what are people's mental models for how a News Feed is created? How a Google search list is created? How iTunes rankings for songs work?'" he said. "When you step back, it's almost like every large company that is consumer-facing has algorithms working to present its data or products to the users. It's a huge thing."

So huge, and so much a part of the way the Internet works, Hancock suggests, that we may have passed the point at which people can reasonably expect to be asked for consent before a corporation messes with the algorithmic filters that shape the information they see online.

That issue of consent was one of the biggest questions to emerge from the Facebook study. Is it enough to notify users only in a terms-of-service agreement that they might be subject to the experimental whims of a company? (It appears the Facebook study may not have even done that.)
