TikTok Has a Problem

Why is the app so focused on abusive “investigations,” and is there any way to make it stop?

Illustration of a vortex composed of TikTok share buttons (Katie Martin / The Atlantic)

When a person joins an online-dating app, and then starts texting some of the people they’ve met on that app, and then makes plans to hang out with some of those people in the hopes of making out, they have a reasonable, limited expectation of privacy.

Hardly anyone expects what happened to the mythological figure of “West Elm Caleb,” a bumbling villain of the New York dating scene and hapless victim of the internet. In January, after a couple of New York women with substantial TikTok followings discovered that they had been dating Caleb simultaneously, it quickly came out that he was guilty of other crimes—sending the same Spotify playlist to multiple people, for instance, and not returning text messages. One woman recalled how he had told her that he found it harder to go on dates in the winter, because of the cold. (She found this offensive.) Pretty soon brands were getting in on the West Elm Caleb conversation, as finding any excuse to talk about this pretty average dater in New York City became engagement-metric gold.

On TikTok, which is now the most popular web domain in the world, this phenomenon has become oddly repetitive. A few months before West Elm Caleb, the site-wide villain was Couch Guy—a guy who had been recorded sitting on a couch, looking sort of excited but not excited enough when surprised by a visit from his long-distance girlfriend. “During my tenure as Couch Guy,” Couch Guy later wrote in an essay for Slate, “I was the subject of frame-by-frame body language analyses, armchair diagnoses of psychopathy, comparisons to convicted murderers, and general discussions about my ‘bad vibes.’” Next up was Sabrina Prater, a trans woman from outside of Flint, Michigan, who a mob of TikTok users decided might be a “Buffalo Bill”–style serial killer based, similarly, on her “vibes.” On and on it goes: The platform generally known for dance trends and audio memes is also the site of serial “investigations,” in which users inflate the slightest signal into a source of outrage and obsession.

This problem is so rampant that it’s difficult to name. In just the past half year, TikTok mobs have dived headlong into engagement-baiting investigations of recent murders, online “pedophile rings,” and the legitimacy of popular creators’ neurological or psychiatric conditions. Emma Spiro, an associate professor in the Information School at the University of Washington, refers to these diverse events (with differing degrees of harm) as “mass convergences of attention.” Abbie Richards, a TikTok creator who also researches and writes reports about disinformation on the platform, described them to me as “the memeification of a person.” The internet-culture reporter Ryan Broderick calls TikTok a “witch hunt machine.”

A representative for TikTok did not acknowledge this well-documented tendency, but told me that the site prohibits harassment, bullying, and “hateful behavior,” while allowing users to filter comments, block accounts in bulk, and make their content private. These are useful features, but they can’t quite address the culture of “dogpiling on random people” (as I’m inclined to call it). One still might ask: How did TikTok get to be the dogpiling-on-random-people app, anyway—and is there any way to make it stop?

TikTok is not designed to be a social network. “It is designed to be an app that gives you entertainment content,” Daniel Klug, a systems scientist at Carnegie Mellon University, told me. That’s problem No. 1.

The platform’s videos are served with comment sections, which are chaotic and difficult to navigate. Users can also exchange private messages. But these features seem unrelated to the main functions of the app: watching videos; getting your videos to be watched. Because of this structure, experts such as Klug argue that TikTok is not primarily a social space, like Facebook, Instagram, or Twitter, where users define themselves through conversation and visible interaction with their friends, acquaintances, colleagues, and celebrities. TikTok was instead designed for iteration. Its users interact more heavily with their own content, and with the site’s algorithm, than with one another.

This setup has a natural outcome: As soon as content about some specific thing—or some specific person—trends, more content of that type will be produced. TikTok doesn’t want you to comment on someone else’s video. It wants you to make your own version of the same thing. Then your version might worm its way into the algorithmically generated “For You” feeds of other users and find its own success. The fact that TikTok pushes every single video out into these feeds, at least for a test run, means that any user, no matter how obscure, can audition for virality.

That leads to problem No. 2: Once a TikTok video starts to get attention, there are no checks on its spread. This may seem true of all kinds of viral content on any social platform, but there are subtle differences. A viral tweet or Facebook post rarely gains its reach without assistance: Tweets may blow up only after they’ve been retweeted by accounts with big followings, or by tight-knit clusters of accounts (such as those belonging to MAGA Twitter or K-pop fans); Facebook posts may not catch fire until they’ve been shared to big pages or in super-active groups. On TikTok, you don’t need a middleman. You just need to perform well in front of the test audience you’re granted by default. As a result, whenever a potential villain starts to surface, a pile-on can form even faster than it might on other platforms.

If anyone can make a viral hit at any time, the opposite is also true: Even a TikTok star with lots of followers can make a total flop. The platform’s legendary fickleness—and the aura of mystery around its “For You” filtering—creates a third problem. Sitting out a site-wide meme can be costly, so any event that the algorithm seems to be championing becomes too good to miss. The algorithm’s mystery leads users to create “folk theories” about it, Spiro, from the University of Washington, told me. They guess at which colors it likes, which times of day it’s most interested in new videos, and how many seconds it wants a clip to be. Users sense that they are “doing battle with this algorithm, and it’s sort of anthropomorphized,” Spiro said. This battle may feel winnable only when a trend appears. Andrew Downing, a 27-year-old social-media strategist based in New York, told me he felt this way about West Elm Caleb. “I knew that I was hopping on it at the right time,” he told me. “I’ve seen so many trends. So I know when it’s too late. I know when it’s too early.” His Caleb video got about 129,000 views, compared with his usual totals of fewer than 1,000.

These three problems, emerging from fundamental aspects of TikTok’s design, contribute to a highly volatile online culture in which it seems like almost anyone can become the target of some bizarre inquest. Paradoxically, the same platform structure also makes it harder to predict the level of distress that a post might cause. “If you’re somebody who has a small following and usually your videos get a couple hundred views, then what is your responsibility to assume that any video can be seen by millions of people?” Abbie Richards asked me. “It’s just, like, a weird new problem that we have to figure out.”

Obviously TikTok is not the first place on the internet where the unchecked actions of a crowd have produced finger-pointing, hostility, and harassment.

Michael Trice, a lecturer at MIT who is interested in the ways that platforms generate “amorality,” told me that the dogpiling on TikTok is a good example of what he calls the “bait and switch” of social media, where users feel as though they’re having one experience on an app but are actually creating a very different one for someone else. A TikTok creator may be focused on the one video they’ve made, and whether it will be popular and well received. The subject of that video, however, may be thrust into an inescapable virality, and become an “unwilling microcelebrity,” as one post merges into hundreds or even thousands more, all of which appear in a very short period of time.

This might just be a feature (or a bug) of life online. Not every negative outcome can be pinned to the features of a specific app, Trice said. When something big is happening on TikTok, there is a lot of incentive to post about it and hope that the algorithm is going to promote your content, but that’s not the only motive at play. “In participatory cultures, when something happens and people want to be a part of it, they find ways that mimic how people have been a part of things in the past,” he told me. On TikTok, there are traditions of callouts, of amateur sleuthing, and of mockery. “People are going to attempt to emulate the forms of communication that they have seen.”

Carrie Orozco, a Las Vegas cocktail waitress who joined TikTok to pass the time in the early months of the pandemic, quickly learned these social mores. She told me she felt totally lost when she came across the Couch Guy drama last fall. So when she saw the first wave of TikToks about West Elm Caleb, she knew what to do: “This Dirty John motherfucker is trying to get his cock-a-doodle-doo into whatever he can in New York City,” she explained in her video. “He is a smooth-talking master manipulator, and I am so grateful that this is going viral across the country for everyone to hear.” Her goal had been to “keep it light,” she told me, and the video got about 700,000 views in two days.

In a twist, just as the West Elm Caleb videos were reaching their peak, TikTok started taking some down. For users like Orozco, it seemed as though the site was disavowing the norms that it had earlier embraced. “TikTok was rewarding the West Elm Caleb hashtag and pushing those videos out,” she said. “And then all of a sudden, it kind of took a turn.” Orozco doesn’t feel bitter about the fact that her video was among the ones removed. She believes it was the right decision, and that TikTok had noticed its users were starting to feel uncomfortable. “Can we do a better job of not derailing people’s lives? Yes,” she told me. But still, Orozco isn’t sure the platform can or should control events like this in the future. “Those kinds of things that we all collectively seem to care about—that is the culture of TikTok.”

Maybe TikTok can do a better job of not derailing people’s lives. Trice told me that, on Twitter, users have become more articulate about harassment techniques since the days of Gamergate, the controversy over video-gaming journalism and an associated trolling campaign that began in 2014. Online communities are also more attuned to the miserable existence of a site’s daily “main character.” “I actually think users have been pretty good on Twitter and Reddit and even on Facebook at sort of self-educating over time,” Trice said. “It’s likely that this form of self-education will come more and more to TikTok.”

An app designed for iteration might even help new ethics of behavior to proliferate. TikTok has normalized a collaborative spirit among its users that encourages people to riff on one another’s work and respond to prompts with innovation. With some time, the same spirit could produce a better remix of the site-wide culture.