How Much Should You Know About How Facebook Works?

An author of the infamous emotional-contagion study says it might not be reasonable to expect informed consent about social experiments online.

Every semester, Cornell professor Jeff Hancock asks his students to complete an experiment. First, he has them all Google the same search term. Then, he asks each student to turn to the right or left and compare the results on their screens.

What his students inevitably find, and what stuns many of them, he says, is how feeding Google an identical phrase can yield wildly different results. "They think your Google search is an objective window into the world," Hancock told me. "And they don't have a sense that they're algorithmically curated."

Daily life is filthy with algorithms. Online, personalized ads and filtered streams of information change based on where we go and what we click. Offline, we receive coupons tied to our past shopping behaviors and credit-card offers based on our financial history. The invisible systems built around our mined personal information and tracked activities are everywhere, generating data doppelgangers that can look frighteningly like us or hilariously divergent from the versions of ourselves we think we know.

But the traceless infrastructure that hides and surfaces the information all around us is only as scary as any major shift in technology that came before, argues Hancock, who found himself at the center of a research controversy earlier this summer. Hancock co-authored a now-infamous study about a secret Facebook experiment he and other researchers constructed to study emotional contagion. The work involved changing what users saw in their News Feeds as a way to manipulate their emotional states.

When news of the study spread in June, people were outraged. (Hancock says he was still receiving physical threats in response to the study as recently as last week.) Hancock now says he's prioritizing conversations—with academics, policy makers, and others—to move forward so that people "don't feel wronged or upset" by this kind of work. It's a process he expects will take years. The emails he gets these days are still angry, but the stream of them has slowed.

I first asked Hancock to talk about the experiment back in June, but he wanted to wait until some of the media attention waned. We spoke for the first time this week.

* * * 

"You have this algorithm which is a weird thing that people don't really understand," Hancock told me. "And we haven't discussed it as a society very much."

Hancock is still reluctant to draw too many conclusions about what he learned in the aftermath of the Facebook study. He declined to talk on the record about what he might do differently next time, or to detail advice he'd have for someone conducting a similar experiment.

One of Hancock's main areas of research has to do with "deception and its detection," according to his university website, a detail that people have asked him about, he says. "'You study deception and obviously you were super deceptive in this study'—that has come up in a few emails," he said.

"There is a trust issue around new technologies," he continued, "It goes back to Socrates and his distrust of the alphabet, [the idea that] writing would lead to us to become mindless ... It's the same fear, I think. 'Because I can't see you, you're going to manipulate me, you're going to deceive me.' There could be a connection there where there's a larger trust issue around technology." For now, Hancock says, he just wants to better grasp the way that people are thinking about algorithms. Understanding expectations will help him and others figure out the ethical ways to tinker with the streams of information that reach them. 

"For me, since the Facebook study controversy and the reaction, we've just started asking in the lab, 'Well, what are people's mental models for how a News Feed is created? How a Google search list is created? How iTunes rankings for songs work?'" he said. "When you step back, it's almost like every large company that is consumer-facing has algorithms working to present its data or products to the users. It's a huge thing." 

So huge, and so much a part of the way the Internet works, Hancock suggests, that we may have passed the point where it's possible for people to reasonably expect they'd have to give consent before a corporation messes with the algorithmic filters that affect the information they see online.

That issue of consent was one of the biggest questions to emerge from the Facebook study. Is it enough to notify users only in a terms-of-service agreement that they might be subject to the experimental whims of a company? (It appears the Facebook study may not have even done that.)
