This rebalancing means different things for the company’s many stakeholders—for publishers, it means they’re almost certainly going to be punished for their reliance on a platform that’s never been a wholly reliable partner. Facebook didn’t talk to publishers in Slovakia because publishers are less important than other stakeholders in this next incarnation of Facebook. But more broadly, Facebook doesn’t talk to you because Facebook already knows what you want.
Facebook collects information on a person’s every interaction with the site—and on many other actions online—so Facebook knows a great deal about what we pay attention to. People say they’re interested in a broad range of news from across the political spectrum, but Facebook knows they really want angry, outraged articles that confirm their political prejudices.
Publishers in Slovakia and in the United States may warn of damage to democracy if Facebook readers receive less news, but Facebook knows people will be perfectly happy—perfectly engaged—with more posts from friends and families instead.
For Facebook, our revealed preferences—discovered by analyzing our behavior—speak volumes. The words we say, on the other hand, are often best ignored. (Keep this in mind when taking Facebook’s two-question survey on what media brands you trust.)
Tristan Harris, a fierce and persuasive critic of the ad-supported internet, recently offered me an analogy to explain a problem with revealed preferences. I pledge to go to the gym more in 2018, but every morning when I wake up, my partner presents me with a plate of donuts and urges me to stay in bed and eat them. My revealed preferences show that I’m more interested in eating donuts than in exercising. But it’s pretty perverse that my partner is working to give me what I really crave, ignoring what I’ve clearly stated I aspire to.
Facebook’s upcoming News Feed change won’t eliminate fake news ... at least, it didn’t in Slovakia. People share sensational or shocking news, while more reliable news tends not to go viral. When people choose to subscribe to reliable news sources, they’re asking to go to the gym. With these News Feed changes, Facebook threw out your gym shoes and subscribed you to a donut-delivery service. Why do 2 billion people put up with a service that patronizingly reminds them that it’s designed for their well-being, while it studiously ignores their stated preferences? Many people feel like they don’t have a choice. Facebook is the only social network, for example, where I overlap with some of my friends, especially those from my childhood and from high school.
I don’t want Facebook to go away—I want it to get better. But increasingly, I think the only way Facebook will listen to people’s stated preferences is if people start building better alternatives. Right now, Facebook chooses which stories should top your News Feed, optimizing for “engagement” and “time well spent.” Don’t like the choices Facebook is making? Too bad. You can temporarily set Facebook to show you a chronological feed, but when you close your browser window, you’ll be returned to Facebook’s paternalistic algorithm.