ASPEN, Colo.—During a session on freedom of speech at the Aspen Ideas Festival, hosted by the Aspen Institute and The Atlantic, Facebook's Head of Global Policy Management, Monika Bickert, was asked about the emotion-manipulation study that has been a subject of controversy over the past few days.
"Do you see some regulation about this," an audience member asked, "and how free speech might be influenced by what users of social networks are shown?" What if, he continued, governments began asking Facebook to do that kind of manipulation not for science, but for politics—to affect, essentially, the moods of their citizens by having the company shape the content those people are shown?
"You point out a policy issue," Bickert replied. And it's one, she said, that gets at "the tension between legislation and innovation."
"I'm not really the best expert," Bickert noted by way of caveat, "and probably our public statements are the best source of information there." That said, though: "I believe that was a week's worth of research done in 2012."
And that research, she continued, was done in the name of platform improvement. As, essentially, customer service. "Most of the research that is done on Facebook—if you walk around campus and you listen to the engineers talking—is all about, 'How do we make this product better? How do we better suit the needs of the population using this product, and how do we show them more of what they want to see, and less of what they don't want to see?'"