Deciphering Facebook's Software Philosophy

Let’s overthink its “News Feed Values.”


Last week, Facebook offered a peek into the philosophy governing its News Feed algorithm, the piece of software that decides which posts are shown to people when they log into the platform’s app or homepage. The announcement was more than just academic. One in five adults worldwide uses Facebook, and 44 percent of Americans get their news from the platform. If traditional agenda-setting news barons like Rupert Murdoch count as powerful, then surely the News Feed algorithm wields influence, too. In fact, it may be one of the most powerful pieces of software in the world.

Which makes the ideas governing such a piece of software extra-important. These particular ideas came in a blog post entitled “News Feed Values,” written by Adam Mosseri, a Facebook vice president and the product manager of the News Feed. The post is a list of broad principles and vague promises that users should expect from their News Feed. It was at once a piece of marketing and—more interestingly—a set of operational ethics, a kind of guide to what Facebook values when it decides to alter the feed.

And Facebook really does seem to rely on some of these ideas in more than nominal ways. Near the beginning of the blog post, for instance, Mosseri asks: “If you could look through thousands of stories every day and choose the 10 that were most important to you, which would they be? The answer should be your News Feed.”

Underlying this question is the conviction that the News Feed shouldn’t just entertain users, but that it should entertain them against replacement. That is, it should provide significantly more meaning and entertainment than an average piece of entertainment. (In the U.S., that means it should be more fun than, say, watching an episode of NCIS.) I’ve heard this notion in conversations with other Facebook corporate leaders; it’s sabermetrics as editorial vision, deployed at scale worldwide, and it shapes how they think about one of their most important products. It’s also, I think, a convenient out for them, a way of handling the editorial burden of having the most popular app in the world.

Mosseri’s post is something else, too: a warning to news outlets. Many journalists have interpreted it as the final notification from Facebook that the News Feed algorithm will no longer emphasize news content, as it has since late 2013. For online publishers who have invested on the back of Facebook’s incredible audience gains, this is deeply worrying. (And everyone has invested on the back of these gains to some degree, as the Facebook-triggered traffic inflation of the last few years has in turn distorted the online advertising market.)

We’ll see whether those anxieties pan out. (The relevant changes may have already taken effect: News traffic from Facebook has been declining for months.) But in the meantime, I wanted to spend some time reading Facebook’s News Feed Values in the manner they purport to be written. Mosseri and his team claim to have created a set of unique, data-driven editorial ethics. They seem to govern one of the most powerful media companies in the world. They deserve our scrutiny.

So what does Facebook value? And more importantly, what isn’t it thinking about?

*  *  *

News Feed Values
By Adam Mosseri, VP, Product Management, News Feed

Our success is built on getting people the stories that matter to them most. If you could look through thousands of stories every day and choose the 10 that were most important to you, which would they be? The answer should be your News Feed. It is subjective, personal, and unique—and defines the spirit of what we hope to achieve.

Here is the core conceit of News Feed: A piece of software can reasonably approximate someone’s subjectivity and uniqueness, and guess what stories they will find most important. These guesses should be more than just reasonable, in fact. They should be the most relevant stories possible.

It feels almost tedious to say this, but this whole idea assumes that meaningfulness is something like an inherent, preexistent quantity, resident in words, photos, and friendships. To Facebook, meaningfulness doesn’t emerge unexpectedly from actions or connections—or, if it does, that’s not the kind of meaning that the company is interested in. Meaningfulness is stock, not flow. It’s a measurable commodity, with sources and origins that are ultimately predictable; the same kinds of people and posts will provide meaning over time.

Facebook was built on the idea of connecting people with their friends and family. That is still the driving principle of News Feed today. Our top priority is keeping you connected to the people, places and things you want to be connected to — starting with the people you are friends with on Facebook.

Here is the big (and widely reported) news in Mosseri’s post: that Facebook will now promote posts from users’ friends and family above all other kinds of content.

Though News Feed is an independent team in the company, this change seems related to a Facebook-wide decline in “original sharing.” That is, regular people are posting fewer and fewer statuses and photos—a potentially catastrophic issue for the social network. It’s plausible that showing more of those “original” posts would trigger more original posting.

But there’s other information in this paragraph too. For instance, how does Facebook know you “want to be” connected to the “people, places, and things” that it thinks you want to be connected to? The language in the post is kind of messy. Presumably, if you actually want to be connected to someone—or see their updates—you’ll type their name into the Facebook search bar and go to their page. (Or you’ll mark them as a top friend—or ask Facebook to put their stories at the top of your feed.) What News Feed is actually doing is sniffing out who it thinks you would want to be connected to. It’s a small difference, admittedly, but the gap between “want to be” and “would hypothetically want to be” is at the heart of the News Feed project.
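To make that gap concrete, here is a deliberately toy sketch of how a feed might infer “hypothetical wants” from past engagement rather than from anything a user explicitly asked for. Everything here is invented for illustration (the engagement log, the action weights, the friend names); it reflects nothing about Facebook’s actual signals or ranking system.

```python
from collections import defaultdict

# Hypothetical engagement log: (friend, action) pairs observed in the past.
# Entirely made up; Facebook's real signals are far richer and undisclosed.
ENGAGEMENT_LOG = [
    ("sister", "like"), ("sister", "comment"), ("sister", "like"),
    ("coworker", "like"),
    ("acquaintance", "hide"),
]

# Illustrative weights: positive actions raise affinity, "hide" lowers it.
ACTION_WEIGHTS = {"like": 1.0, "comment": 2.0, "hide": -3.0}

def learn_affinity(log):
    """Turn past actions into a per-friend affinity score."""
    affinity = defaultdict(float)
    for friend, action in log:
        affinity[friend] += ACTION_WEIGHTS.get(action, 0.0)
    return affinity

def rank_feed(posts, affinity):
    """Order posts by the inferred affinity for their author."""
    return sorted(posts, key=lambda p: affinity[p["author"]], reverse=True)

affinity = learn_affinity(ENGAGEMENT_LOG)
posts = [
    {"author": "acquaintance", "text": "vacation photo"},
    {"author": "sister", "text": "new puppy"},
    {"author": "coworker", "text": "article link"},
]
for post in rank_feed(posts, affinity):
    print(post["author"], "-", post["text"])
```

Note what the user never does in this sketch: state a preference. The “want” is reverse-engineered entirely from behavior, which is exactly the move that turns “want to be connected” into “would hypothetically want to be connected.”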

That’s why if it’s from your friends, it’s in your feed, period — you just have to scroll down. To help make sure you don’t miss the friends and family posts you are likely to care about, we put those posts toward the top of your News Feed. We learn from you and adapt over time. For example, if you tend to like photos from your sister, we’ll start putting her posts closer to the top of your feed so you won’t miss what she posted while you were away.

Our research has also shown us that, after friends and family, people have two other strong expectations when they come to News Feed:

  • Your feed should inform. People expect the stories in their feed to be meaningful to them — and we have learned over time that people value stories that they consider informative. Something that one person finds informative or interesting may be different from what another person finds informative or interesting — this could be a post about a current event, a story about your favorite celebrity, a piece of local news, or a recipe. We’re always working to better understand what is interesting and informative to you personally, so those stories appear higher up in your feed.

Facebook fully outsources its editorial judgment here, noting only that what “one person finds informative or interesting may be different from what another person finds informative or interesting.” Which makes some sense. But in doing so, it also offloads the burden of judging truth. It’s not making any promise that what one person finds informative will be accurate (nor do I think it could). It’s only promising that it will supply … content that … informs.

How does it figure out which posts are informative? What does it even mean by informative here, if it admits that informativeness can differ by person? Your guess is as good as mine.

  • Your feed should entertain. We’ve also found that people enjoy their feeds as a source of entertainment. For some people, that’s following a celebrity or athlete; for others it’s watching Live videos and sharing funny photos with their friends. We work hard to try to understand and predict what posts on Facebook you find entertaining to make sure you don’t miss out on those.

Obviously, there’s a tension between entertaining and informing—which points to why “infotainment” comedians like John Oliver and Samantha Bee do so well on the platform.

We are not in the business of picking which issues the world should read about. We are in the business of connecting people and ideas — and matching people with the stories they find most meaningful. Our integrity depends on being inclusive of all perspectives and viewpoints, and using ranking to connect people with the stories and sources they find the most meaningful and engaging.

We don’t favor specific kinds of sources — or ideas. Our aim is to deliver the types of stories we’ve gotten feedback that an individual person most wants to see. We do this not only because we believe it’s the right thing but also because it’s good for our business. When people see content they are interested in, they are more likely to spend time on News Feed and enjoy their experience.

This is a particularly dense section. Notice that Facebook again avoids the burden of deciding what is meaningful, leaving it all to the user. Mosseri alludes to how Facebook determines meaning without judging it (by “using ranking”), and explains why that approach is so important (“it’s good for our business”). He doesn’t go into details, though—even though the method is exactly how it figures out a user’s hypothetical wants.

Mosseri also reaffirms that Facebook wants to be an apolitical platform, something it promised during the “Trending Topics” bias scandal earlier this year.

It’s important to note that while we welcome a multitude of viewpoints, we also believe strongly that people should feel — and be — safe when they use Facebook, and we therefore have Community Standards that define the behavior that we think is out-of-bounds on the platform. We think it’s possible to be inclusive without making Facebook a place where people are subjected to attacks, hate, or other harmful behavior.

And this is how Facebook threads the needle: purporting to have no editorial vision while still disallowing hate speech.

The strength of our community depends on authentic communication. The feedback we’ve gotten tells us that authentic stories are the ones that resonate most. That’s why we work hard to understand what type of stories and posts people consider genuine — so we can show more of them in News Feed. And we work to understand what kinds of stories people find misleading, sensational and spammy, to make sure people see those less.

I don’t know where to start here. What does “authentic” mean? Does it mean something like: someone reflects their “truest” self in their posts on Facebook? If so, how does Facebook know what my truest self is? I don’t even know what my truest self is. Are News Feed algorithm writers reading Charles Taylor?

Ultimately, you know what’s most meaningful to you — and that’s why we’ve developed controls so you can customize what you see. Features such as “unfollow,” “hide” and “see first” help you design your own experience — and when you use them, we take your actions as feedback to help us better understand what content is most important to you. For example, if you hide a story from someone, that signals that you’re less interested in hearing from that person in the future. As News Feed evolves, we’ll continue building easy-to-use and powerful tools to give you the most personalized experience.

This actually reflects an evolution on Facebook’s part. The company used to hide most of the mechanisms that let people administer their own News Feeds. Now, it is increasingly open about revealing them.

Some thinkers, like Harvard’s Jonathan Zittrain, think Facebook should be asked to go one step further: It should let anyone run their own News Feed-like ranking algorithm on Facebook. In other words, Facebook would provide the raw material (the generic mass of posts from friends and pages), and users could bring their own algorithmic editor.
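Zittrain’s proposal can be sketched in miniature: the platform exposes the raw, unranked pile of posts, and each user supplies their own ranking function as the editor. The posts, field names, and the two sample rankers below are all hypothetical, invented to illustrate the shape of the idea rather than any real API.

```python
from typing import Callable, Dict, List

Post = Dict[str, object]
Ranker = Callable[[List[Post]], List[Post]]

# The "raw material": an unranked pile of posts, as the platform might expose it.
RAW_FEED: List[Post] = [
    {"author": "page", "kind": "news", "age_hours": 1},
    {"author": "friend", "kind": "photo", "age_hours": 5},
    {"author": "friend", "kind": "status", "age_hours": 2},
]

def chronological(posts: List[Post]) -> List[Post]:
    """One user's chosen editor: newest first, no other judgment."""
    return sorted(posts, key=lambda p: p["age_hours"])

def friends_first(posts: List[Post]) -> List[Post]:
    """Another user's editor: friends before pages, then by recency."""
    return sorted(posts, key=lambda p: (p["author"] != "friend", p["age_hours"]))

def render_feed(posts: List[Post], ranker: Ranker) -> List[Post]:
    """The platform applies whatever ranking algorithm the user brings."""
    return ranker(posts)
```

The design point is the separation of concerns: `render_feed` belongs to the platform, but the `Ranker` it accepts belongs to the user. Two people looking at the same raw feed would see two different front pages, chosen by them rather than for them.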

We view our work as only 1 percent finished — and are dedicated to improving along the way. As we look for ways to get better, we will continue soliciting feedback. We will be as open as we can — providing explanations in News Feed FYI wherever possible and looking for opportunities to share how we work.

Like love and liberalism, News Feed is a project that is never finished.