We often talk about algorithms like they're people. Algorithms take action. They think. They make decisions. Sometimes they even offend us and we tell them so: "Um, no thanks, Facebook." "Whoa, that's creepy, Google."
So why not try to mess with those "people"? They're messing with us, after all. So the other day I tried to trick Facebook, testing whether I could force a post to the top of my friends' News Feeds.
Part of the impetus was that Facebook had frustrated me. That morning I'd posted a story I'd written about the hunt for electric bacteria that might someday power remote sensors. After a few hours, the story had garnered just one like. I surmised that Facebook had decided that, for whatever reason, what I'd submitted to the blue ether wasn't what people wanted, and kept it hidden.
A little grumpy at the idea, I wanted to see if I could trick Facebook into believing I'd had one of those big life updates that always hang out at the top of the feed. People tend to word those things roughly the same way, and Facebook does smart things with pattern matching and sentiment analysis. Could I fabricate some social love?
I posted: "Hey everyone, big news!! I've accepted a position trying to make Facebook believe this is an important post about my life! I'm so excited to begin this small experiment into how the Facebook algorithm processes language and really appreciate all of your support!"
The first like and comment came almost instantly. I liked back. Then a few more. People were playing along. I liked them all back. Then momentum began to pick up: You could almost feel two great blue hands ratcheting the post up my friends' feeds. Then victory: Around the 39-minute mark after I published the status update, my friend Casey told me my status—rather than possible updates from about 1,000 friends—was at the top of his feed. Nine minutes later, another friend confirmed the same. More and more people said the post was firmly at the top of their feed—and not just (actual) friends, but former colleagues I hadn't talked to in years. After 90 minutes, the post had 57 likes and 25 commenters.
For the next two days, the likes and comments poured in, and people reported my status was still at the top of their feed. (Some even asked how to make it go away.) As of this writing, it has 134 likes and 62 comments.
But what had I really done?
Later, people asked me, "What do you think elevated the post?" Likes, comments, the number of people who made them, what they said, velocity, timing, my profile—all of that, in some way, played a part. The real answer, of course, is the algorithm. It is written in a wild array of numbers, symbols, and computer-speak that only a handful of Facebook employees understand. And as I watched my faux news spread, I felt how that combination of numbers and signals and people starts to resemble one giant interconnected neural network—like a vibrant MRI of a human brain as someone plays chess or has sex.
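Facebook has never published its ranking formula, but the factors above—engagement, who's engaging, and how fresh the post is—map loosely onto the "EdgeRank" model the company discussed publicly years ago: affinity times weight times time decay. Here is a toy sketch of that idea; every weight and function name is invented for illustration, not Facebook's actual code.

```python
import math

def toy_feed_score(likes, comments, affinity, hours_old,
                   like_weight=1.0, comment_weight=4.0, decay=0.1):
    """Illustrative ranking score: raw engagement, weighted by how
    close the viewer is to the author (affinity, 0..1), decaying as
    the post ages. All weights here are made up for demonstration."""
    engagement = likes * like_weight + comments * comment_weight
    return affinity * engagement * math.exp(-decay * hours_old)

# A fresh post with modest engagement from a close friend can outrank
# an older, more-liked post from a distant acquaintance.
fresh = toy_feed_score(likes=57, comments=25, affinity=0.9, hours_old=1.5)
stale = toy_feed_score(likes=200, comments=10, affinity=0.2, hours_old=24)
```

Under this toy model, `fresh` scores higher than `stale`—which matches the experience of seeing a 90-minute-old joke pinned above everything else in the feed.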
When I generated a signal in the status bar—Facebook's eyes and ears—its language processing likely assessed the sentiment of my words and quickly flagged it as potentially good. That signal was passed along to a small test group of Facebook users whose brains told them they liked the post and clicked accordingly. Facebook tracked their clicks and, in turn, decided to bump the stimulus to more users. This happened a few more times and, after enough validation, the signal was fortified at the top of many, many feeds as one of the seminal statuses of the day.
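That staged rollout—show the post to a small group, measure engagement, expand the audience when it clears a bar—can be sketched as a simple loop. This is a hypothetical model of the process described above; the group sizes, threshold, and doubling rule are all invented, since Facebook's real pipeline is not public.

```python
import random

def simulated_rollout(like_probability, seed=0,
                      test_size=20, threshold=0.3, max_rounds=5):
    """Hypothetical staged rollout: show a post to a small test group,
    count likes, and double the audience whenever the like rate clears
    a threshold. Every number here is invented for illustration."""
    rng = random.Random(seed)
    audience = test_size
    for _ in range(max_rounds):
        likes = sum(rng.random() < like_probability for _ in range(audience))
        if likes / audience < threshold:
            return audience  # engagement stalled; stop promoting
        audience *= 2  # validated: push to a wider audience
    return audience

# A post people genuinely enjoy keeps expanding each round;
# a dud never makes it past the initial test group.
viral = simulated_rollout(like_probability=0.8)
dud = simulated_rollout(like_probability=0.05)
```

The recursive feel of the real thing is right here in miniature: each round's clicks are both the output of the last decision and the input to the next one.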
Of course I'm guessing that's how it happened. Deciphering the exact levers and pulleys in Facebook's algorithms is impossible from the outside. The best you'll ever get in trying to experiment with successful posts on Facebook is feeble correlation. Cause—the software code—is under lock and key. We don't know why some posts go up or down other than what we can surmise. Like Google with search, tech companies don't share their secret sauce.

And that was the rub. At first I was pretty proud of myself for messing with Facebook's algorithms. But after a little reflection I couldn't escape the feeling I hadn't really gamed anything. I'd created a joke that a lot of people enjoyed. They signaled their enjoyment, which gave Facebook the confidence to show the enjoyable joke to more people. There was nothing "incorrect" about that fake news being at the top of people's feeds. The system—in its murky recursive glory—did what it was supposed to do. And on the next earnings call Mark Zuckerberg can still boast high user engagement numbers.