I’m Scared of the Person TikTok Thinks I Am

TikTok’s recommendation algorithm is known for its accuracy and even its “magic.” What does it mean if the videos it picks for you are totally disgusting?

[Illustration: a figure with a TikTok logo for a head, shrugging. The Atlantic]

Something is wrong with me, and TikTok knows it.

I can tell because its recommendation algorithm keeps providing me with videos that only a horrible person would like. One morning last week, the app recommended a video of a girl in a red dress saying slowly, “I’m officially at the age where I can date you … or your dad.” In the next video, a “doctor” tried to sell me some kind of coffee-based weight-loss drink. An “age reveal” came after, from a woman who looked like she was in her 20s but was actually 43. Then a drinking game involving White Claw. An “alpha female.” A cop. A woman taking a teeth-whitening device out of her mouth and letting drool run down her chin. An art tutorial demonstrating how to paint a picture of your “A$$” as a surprise for your “man.” An Instagrammable rooftop. A tarot-card reading.

This is disturbing because the recommendation algorithm that TikTok uses to pull videos for the personalized “For You” feed is known for being scarily precise, to the point where intelligent people have expressed concern that it might be used for mind control. TikTok users often refer to the process that produces the feed in front of them as “my” algorithm, as if it really were an extension of self. So I ask—though I do not want to—what does my TikTok algorithm say about me? (And can it please say something else?)

The horror of a ruined TikTok feed struck a bunch of users in March, thanks to a momentary glitch that made For You feeds surface only videos with high view counts. People who had spent months or years cultivating the algorithm’s understanding of them as alt, specific, and cool now found themselves presented with content that was merely popular. “Every video I see now has over a million likes and they’re genuinely so boring,” one person complained on Reddit. Once the issue was resolved, their feeds went back to normal, and at least that was a communal experience. For the most part, though, we suffer from our feed alone, just as we scroll alone. A co-worker who did not want to be named told me that the algorithm had mistakenly thought, for some time, that he was “all in” on “horny for Hagrid” memes. A woman on Twitter who did not return my request for an interview wondered how she’d gotten stuck in “incest” TikTok.

Amy Rooker, a communications specialist from Michigan, downloaded TikTok at the beginning of the pandemic and was surprised at how quickly it “perfectly pinpointed” who she was as a person. But she had a problem in January after she used TikTok to look for some quick job-interview tips—from then on, her feed was full of #girlbosses. “I go to TikTok as a reprieve for fun and often dumb content or my much-needed Harry Styles and cat content,” she told me. “I did not want it interrupted with a bunch of advice about how to get ahead in the business world.” She doesn’t remember how long it took for her feed to go back to normal, she said, but it required active work. She spent time searching for “Harry Styles” and “cats” and taking care to swipe past the girlbosses the second that she saw them.

Last summer, TikTok published an explanatory blog post titled “How TikTok Recommends Videos #ForYou,” seemingly in response to the widely held impression that its algorithm was magical—and to the anxieties that naturally arose whenever said magic fizzled. The recommendation system is designed “to continuously improve, correct, and learn,” the blog post said, and it values indicators such as whether a user watches a video from start to finish and whether they engage with it by sharing or adding a comment. If, like Rooker, you see content that is not for you on your For You page, the post explained, you should long-press on the offending video and tap “Not Interested.” Every single thing you do helps the system learn, it said, “so the best way to curate your For You feed is to simply use and enjoy the app.”

Okay, don’t tell me to do more work, but I guess I could do that if I really wanted to hone my feed. Maybe the more interesting question is why I felt so embarrassed about having a bad feed in the first place. And why do people speak possessively of the algorithm, as if it were a child or a pet or a brain scan? (Isn’t that incorrect? It sounds incorrect.)

“When you say ‘my algorithm,’ it’s kind of right,” says Julian McAuley, an associate computer-science professor at UC San Diego who specializes in recommender systems. “There’s one algorithm, but input into the algorithm is everything you’ve historically done. So everybody is getting different recommendations based on their historical actions.” I told him about my problem, taking care to explain the extreme absurdity of watching that one lady paint a giant picture of her “A$$” and then present it to her boyfriend, who was obviously moved. Of course it’s true, I added, that I have watched this video several times, saved it to my phone, and texted it to every person I know. He laughed at me a little bit. (Thank you!) “People love to complain about the content they’re recommended, but it’s exactly the content you engage with,” he said. (I’m frowning.)

Recommendation algorithms can be very effective, but they actually know nothing about you, McAuley explained. They’re discovering patterns in your history and the combined histories of other users. “It seems creepy,” he said, “because they’re leveraging such huge volumes of data.” But it’s not really creepy. That feeling is just a result of the fact that any given user will be very similar to another set of users. “It’s not that you feel your algorithm is wrong,” McAuley said. “You feel that your historical actions aren’t representative of who you are.”
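McAuley’s point—one shared algorithm, personalized only by each user’s history of engagement—can be sketched as a toy recommender. Everything below (the tags, the scoring, the data) is invented for illustration; TikTok’s actual system is proprietary and vastly more complex.

```python
from collections import Counter

# Toy illustration (invented data, NOT TikTok's actual system):
# one shared scoring function, personalized only by each user's
# engagement history. Note that a horrified share counts exactly
# the same as a delighted one -- the model only sees the engagement.
histories = {
    "author": ["cringe", "cringe", "tarot", "cringe"],
    "user_b": ["cats", "harry_styles", "cats"],
}

candidates = [
    {"id": 1, "tag": "cringe"},
    {"id": 2, "tag": "cats"},
    {"id": 3, "tag": "tarot"},
]

def recommend(user, k=2):
    """Rank candidate videos by how often their tag appears in this
    user's engagement history: one algorithm, many different inputs."""
    weights = Counter(histories[user])
    ranked = sorted(candidates, key=lambda v: weights[v["tag"]], reverse=True)
    return [v["id"] for v in ranked[:k]]

print(recommend("author"))  # the "horrible" feed: [1, 3]
print(recommend("user_b"))  # cats first
```

The same `recommend` function produces wildly different feeds for the two users—which is why “my algorithm” is, as McAuley says, kind of right and kind of not.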

Then McAuley started talking about machine learning. Algorithms are optimized to achieve a specific outcome, he told me. That outcome is usually a click, or a share, or more time spent in the app. It’s not going to be some nebulous quality, like my long-term happiness. So, from that perspective, TikTok’s algorithm is actually serving me perfectly. It has learned to keep me clicking and sharing and scrolling, and I am doing that, jaw perpetually dropped. (Yesterday I received a video—like a little present!—of a pretty young woman driving her car and smirking. The text said “He blocked me on everything … So I applied at his job. See you in the morning.”)

There might be a way for TikTok to differentiate between sincere engagement and horrified engagement, McAuley said, but it wouldn’t be as useful for the company as it would be for me. “Why would they want to solve those problems?” he asked.

Lazar Odic, a public-relations student from Canada, was trapped like me.

When the COVID-19 vaccine rollout started in the United States, he noticed a few anti-vaccination conspiracy-theory videos in his feed. He found them so stupid that he had no choice but to comment on them, which meant that he saw more of them, and then he saw more videos that people who are into anti-vaccination conspiracy theories would also like. “The more it came across my screen, the more I’d comment,” he told me. “It doesn’t matter what you comment; TikTok seems to only care that you’ve engaged and then tries to feed you more of what you ‘seem’ to be wanting.” His feed was destroyed—just pure political garbage. He said that it was his own doing and he had only himself to blame. After months of rebuilding, which meant a lot of time spent carefully engaging with the kinds of comedy posts that he actually prefers, he says his feed has gotten a lot better. “There’s hope out there to return your algorithm to its former glory,” he told me.

Well, time to admit the real problem: I’ll never fix my feed because I don’t want to. I like being trapped in an algorithmic loop of disgust and confusion. I can’t stop myself from watching unsettling content all the way to the end, and I can’t stop myself from sharing it. It’s a little embarrassing that things have gone this far, but the embarrassment is fun.

As much as I might insist that my algorithm has nothing to do with me or my personality, TikTok actually has a pretty accurate sense of what I want to see. I don’t go online to laugh; I go online to scream. I like things that were made by people whose motivations are completely confounding to me. And I have to assume that there are many other people on TikTok who are looking for the same kind of experience, which is why TikTok is so reliable at providing it. When people complain that they’ve somehow “ruined” their personal algorithm, they probably know exactly what they’ve done. We are all working so hard every day at destroying one another’s brains.

The horrible corner of TikTok in which I have wedged myself provides a vantage point I wouldn’t otherwise have, and I get a real kick out of trying to drag other people there with me. Speaking of which, does anybody want to see a video of a girl in a SpongeBob SquarePants tank top, dancing in her backyard, celebrating the fact that it has been one full week since she was bitten by a dog?