Facebook is trying to reassure its users after a controversial experiment that purposefully manipulated their emotions. But the company has a long way to go to win over privacy advocates.
Facebook announced new guidelines Thursday for future research projects. Any studies that focus on particular groups (such as people of a particular age) or concern issues that are "deeply personal" will have to be reviewed by senior-level managers, the company said. Facebook also promised to better train its employees on appropriate research practices and will post its published academic research online.
But in a blog post announcing the changes, the company was defiant about the importance of studying users and didn't promise to obtain their consent, or even notify them, when they become subjects of an experiment.
"We believe in research, because it helps us build a better Facebook," Mike Schroepfer, the company's chief technology officer, wrote. "Like most companies today, our products are built based on extensive research, experimentation, and testing."
In June, Facebook acknowledged that its data scientists had conducted a massive experiment in January 2012 on nearly 700,000 users without their knowledge. For one week, the researchers manipulated the amount of positive and negative content in users' News Feeds to study how they responded. They found that users who saw less positive content were more likely to post negative updates themselves.