
When Alex, now a high-school senior, saw an Instagram account he followed post about something called QAnon back in 2017, he’d never heard of the viral conspiracy theory before. But the post piqued his interest, and he wanted to know more. So he did what your average teenager would do: He followed several accounts related to it on Instagram, searched for information on YouTube, and read up on it on forums.

A year and a half later, Alex, who asked to use a pseudonym, runs his own Gen Z–focused QAnon Instagram account, through which he educates his generation about the secret plot by the “deep state” to take down Donald Trump. “I was just noticing a lack in younger people being interested in QAnon, so I figured I would put it out there that there was at least one young person in the movement,” he told me via Instagram direct message. He hopes to “expose the truth about everything corrupt governments and organizations have lied about.” Among those truths: that certain cosmetics and foods contain aborted fetal cells, that the recent Ethiopian Airlines crash was a hoax, and that the Christchurch, New Zealand, mosque shootings were staged.

Instagram is teeming with these conspiracy theories, viral misinformation, and extremist memes, all daisy-chained together via a network of accounts with incredible algorithmic reach and millions of collective followers—many of whom, like Alex, are very young. These accounts intersperse TikTok videos and nostalgia memes with anti-vaccination rhetoric, conspiracy theories about George Soros and the Clinton family, and jokes about killing women, Jews, Muslims, and liberals.

Recent posts by @the.new.federation, which has more than 38,000 followers, include a post likening Representative Maxine Waters to an ape, one that labels an image depicting prison rape as “how socialism works,” and several suggesting that Ruth Bader Ginsburg died months ago and is “better off dead.” Yesterday, it asked, “If Muslims can behead Christians, why can’t we do the same to them?” That post has 2,088 likes.

A post from @unclesamsmisguidedchildren, which has more than 559,000 followers, implies that John Podesta is partially responsible for the New Zealand shooting. That post has more than 8,300 likes. A post made four days ago includes a video promoting the conspiracy that more than 22 Islamic terror camps operate in the United States and are likely responsible for the shooting in Parkland, Florida. It has been viewed more than 200,500 times.

In an email, an Instagram spokesperson told me that the company and its parent, Facebook, “continue to study trends in organized hate and hate speech and work with partners to better understand hate organizations as they evolve.” The spokesperson added, “We ban these organizations and individuals from Instagram and also remove all praise and support when we become aware of it. We will continue to review content, accounts, and people that violate our policies and take action against hate speech and hate organizations to help keep our community safe.”

Since 2016, social-media companies have come under fire for allowing white supremacy and other extremist ideologies to spread. YouTube’s algorithms have been shown to push people further toward the fringes; a New York Times headline called the site “The Great Radicalizer.” Facebook is notorious for allowing anti-vaxxers and other conspiracy theorists to organize and spread their messages to millions—the two most-shared news stories on Facebook in 2019 so far are both false. Twitter, too, has been criticized for being slow to police the misinformation that spreads on its platform.

But Facebook, Twitter, and YouTube are not where young people go to socialize. Instagram is.

Part of a video shared to @the_typical_liberal’s Instagram page jokes about a great “meme war.”

The platform is likely where the next great battle against misinformation will be fought, and yet it has largely escaped scrutiny. Part of this is due to its reputation among older users, who generally use it to post personal photos, follow aspirational accounts, and keep in touch with friends. Many teenagers, however, use the platform differently—not only to connect with friends, but to explore their identity, and often to consume information about current events.

Jack, a 16-year-old who asked to be referred to by a pseudonym to protect his identity, has learned a lot about politics through Instagram. In 2020, he’ll be able to vote for the first time, and so he recently started following some new Instagram pages to bone up on issues facing the country. “I try to follow both sides just to see what everyone’s thinking,” he said. While he’s struggled to find many compelling pages on the left, he said he’s learned a lot from following large conservative Instagram meme pages such as @dc_draino and @the_typical_liberal, which has nearly 1 million followers and claims to be “saving GenZ one meme at a time.” Recent posts include a joke about running over protesters in the street, an Infowars video posted to IGTV, and a meme about feminists being ugly. “It’s important to have The Typical Liberal and DC Draino to expose the [media’s] lies, so we can formulate our own opinions,” Jack told me.

@the_typical_liberal’s account is set to private, so you have to request to join.

Following just a handful of these accounts can quickly send users spiraling down a path toward even more extremist views and conspiracies, guided by Instagram’s own recommendation algorithm. On March 17, I clicked Follow on @the_typical_liberal. My account lit up with follow requests from pages with handles alluding to QAnon, and the app immediately prompted me to follow far-right figures such as Milo Yiannopoulos, Laura Loomer, Alex Jones, and Candace Owens, as well as a slew of far-right meme pages such as @unclesamsmisguidedchildren and @the.new.federation. Following these pages resulted in suggestions for pages dedicated to promoting QAnon, chemtrails, Pizzagate, and anti-vaccination rhetoric.

@q_redpillworld17, for instance, which requested to follow me after I followed @the_typical_liberal, has posted several videos and images claiming proof that the New Zealand shooting was a “false flag”; one post compares the mosque’s blood-spattered carpet with another image, implying that the carpets don’t match so the shooting was staged. Another is a graphic video of the shooting, with a caption claiming that the bullets disappeared mid-air. Another suggests 200 examples of proof that the Earth is flat. Another falsely claims that Twitter CEO Jack Dorsey is secretly connected to the Clintons, who feed baby blood to George Soros.

@activate_justice, another page that requested to follow my account, is littered with Kek memes and shared a screenshot of a YouTube video declaring that it has been “confirmed: Hillary died” and that Nancy Pelosi has been arrested. @mommy_underground, which Instagram itself suggested I follow, features a post falsely claiming that a new bill would “engrave Planned Parenthood’s abortion number” onto the back of all student-ID cards for girls over the age of 12.

While Alex Jones has been banned from Facebook, Twitter, and YouTube, his videos are thriving on Instagram, where they’re often reposted by meme accounts.

@the_grim_inquisitor, an extremist meme page that Instagram also suggested I follow, posted multiple videos from the Christchurch shooting, including one that shows people’s bodies being shot, and that claims the attack was a “false flag” and that the people being killed are actors. Another meme on the page brags that the administrator was “antisemetic before it was cool.” The caption of another post claims that “vaccines make us all sick and your kids autistic. They are spraying the sky with aluminum and barium to block the sun/nano dust ingestion. Research this for yourself and get the word out or you are part of the problem.”

By Monday, there were five videos of the Christchurch attack posted by meme pages in my feed. Four of them are still up, and on Tuesday, another was surfaced at the top of my feed. The captions on all the videos question the validity of the attack and claim that it was a false flag carried out by the U.S. government.

The top of my Instagram Explore page also featured a racist caricature of Alexandria Ocasio-Cortez, exaggerating her features and darkening her skin; a post about Hillary Clinton being a pedophile; a 4chan screenshot talking about a “beta Jew”; and yet another Christchurch false-flag post. My Explore page was littered with posts containing hashtags such as #PedoVore, #TheGreatAwakening, #WWG1WGA, #QAnon, #Spygate, #Pizzagate, and #TheStorm.

Given the velocity of the recommendation algorithm, the power of hashtagging, and the nature of the posts, it’s easy to see how Instagram can serve as an entry point into the internet’s darkest corners. Instagram “memes pages and humor is a really effective way to introduce people to extremist content,” says Becca Lewis, a doctoral student at Stanford and a research affiliate at the Data and Society Research Institute. “It’s easy, on Instagram, to attach certain hashtags to certain memes and get high visibility.”

Indeed, 344,000 Instagram posts currently include the hashtag #QAnon; 262,000 include the hashtag #WWG1WGA, a QAnon conspiracy phrase; 166,000 include the hashtag #Pizzagate. As of Tuesday afternoon, three of the top 12 Instagram posts featuring the hashtag #vaccines were promoting anti-vaccine messages—after Facebook announced last week that it would diminish the reach of anti-vaccine information on Facebook and Instagram. (Notably, these numbers don’t capture thousands of posts from private accounts. Many of Instagram’s biggest far-right meme accounts are private—a well-known tactic for fueling growth, but also a way to avoid scrutiny from outsiders and to prevent being reported.)

Here’s an example of how Instagram’s follow recommendations can end up recommending more extreme content. The moment you follow one anti-vaccination page, you’re prompted to follow more.

In December, Wired reported that Instagram had become the “go-to” social network for the Internet Research Agency, a Russian troll farm notorious for meddling in U.S. elections. A report commissioned by the Senate Intelligence Committee declared that “Instagram was perhaps the most effective platform for the Internet Research Agency” to spread misinformation. “Instagram has the power of Twitter to broadcast out, but the infrastructure of Facebook supporting it,” says Jonathan Albright, a researcher at Columbia University who directs a center on digital forensics. “It has the best of all platforms.”

And its mechanisms are more inscrutable. Last year, the company restricted API access—the service that processes requests for Facebook data from remote applications—following several Facebook data-breach scandals. According to Albright, this has stunted research efforts focused on the spread of misinformation and extremism. Just last fall, Albright’s research revealed that anti-Semitism on the platform was rising. He says he would be unable to carry out similar research today due to the recent API restrictions. “The ability for me to do a network analysis or look at how accounts are connected has basically gone away,” he says.

Meme pages aren’t the only types of accounts bent on leveraging Instagram to radicalize young people. As The Daily Beast reported last fall, the platform has become a haven for far-right figures such as Alex Jones. Jones’s Infowars videos have flourished on IGTV, Instagram’s home for more long-form video content, where networks of accounts repost them.

Far-right figures such as Candace Owens, Lauren Southern, and Brittany Pettibone are all also active on the platform, where they share personal photos and chat with fans, much in the same way celebrities and fashion bloggers might. “Far-right influencers are adopting Instagram-influencer strategies to normalize themselves,” Lewis says. Members of the American Identity Movement, also known as Identity Evropa, a white-supremacist group, are also active on the platform and use it to post relatable content. These posts do not technically violate Instagram’s terms of service, but Lewis says these groups are using the platform “to rebrand themselves as less violent than they actually are” and to attract young users to their extremist movements.

Extremist meme pages intersperse jokes with coded images referencing conspiracies and figures such as Pepe the Frog.

CJ Pearson, a 16-year-old conservative activist who is not affiliated with any extremist groups, says that “the role [Instagram] will play in 2020 is being slept on right now. The right has an advantage in that they’ve organized a huge network of meme accounts on the platform that reach millions of young people across the web who will be casting their first vote come 2020.”

And Pearson says he wouldn’t be surprised if a lot of those kids are already familiar with conspiracies such as QAnon. “There’s mainstream Insta pages that believe in QAnon. There are lots of creators who openly promote it and believe in it,” he says.

Nick, a 16-year-old who asked to use a pseudonym, said that he’s seen such content himself on the platform. He began following @the_typical_liberal after the page was suggested to him because so many of his friends from school follow it. “I noticed this whole network of personalities and pages because they all shout each other out,” he said.

But recently, content from those pages has become too extreme for him. Nick was particularly surprised when @the_typical_liberal began reposting Infowars videos. He stopped trusting it as a legitimate news source, but noted that 30 of his friends still follow the account, and said that many people he knows are “really into it.”

“I’ve seen talking points from these pages regurgitated up in class debates,” Nick said. “I know where they’re getting it.”
