Facebook Promises to Manipulate Your Emotions 'Differently'

The new guidelines are unlikely to appease privacy advocates, who argue users should consent to experiments.

Facebook is trying to reassure its users after a controversial experiment that purposefully manipulated their emotions. But the company has a long way to go to win over privacy advocates.

Facebook announced new guidelines Thursday for future research projects. Any studies that focus on particular groups (such as people of a particular age) or concern issues that are "deeply personal" will have to be reviewed by senior-level managers, the company said. Facebook also promised to better train its employees on appropriate research practices and will post its published academic research online.

But in a blog post announcing the changes, the company was defiant about the importance of studying users and didn't promise to get their consent or even notify them if they are subjects of an experiment.

"We believe in research, because it helps us build a better Facebook," Mike Schroepfer, the company's chief technology officer, wrote. "Like most companies today, our products are built based on extensive research, experimentation, and testing."

In June, Facebook acknowledged that its data scientists conducted a massive experiment in January 2012 on nearly 700,000 users without their knowledge. For one week, the researchers manipulated the amount of positive and negative content in users' News Feeds to study how they responded. The researchers found that users who saw less positive content were more likely to post negative updates themselves.

In Thursday's post, Schroepfer said it was important for the company to investigate how reading Facebook posts makes users feel about themselves. But he said the company was "unprepared" for the backlash to its study and "there are things we should have done differently."

For example, Facebook should have considered nonexperimental ways to study the issue, and should have required a more extensive internal review, he said.

But the apology and new guidelines are unlikely to win much praise from privacy groups.

"This is typical Facebook behavior to announce a half step to get the political heat off," said Jeff Chester, the executive director of the Center for Digital Democracy.

He argued that Facebook should disclose more information to users about how it conducts research to better serve advertising and that users should be able to opt out of any experiments.

"Facebook routinely engages in research on its users to better perfect how it serves its advertisers," he said. "All of this should be disclosed and a user should decide whether they can be part of any research effort."

The Electronic Privacy Information Center has filed a complaint with the Federal Trade Commission, claiming Facebook's emotional-manipulation experiment illegally deceived users.

FTC scrutiny would come at an awkward time for Facebook, which is rolling out a major expansion of its advertising network. The company said this week it plans to use its troves of user data to target ads to people on other sites.