The Secret Internet of TERFs


Updated at 1:56 p.m. ET on Dec. 8, 2020.

Mary Kate Fain, a 27-year-old engineer and writer living in Houston, has always considered herself a feminist. Growing up, she told me, she had a pretty standard set of progressive values—her primary focus was animal rights, and her feminism was reflexive, mainstream. In college, however, her ideas about feminism shifted. After volunteering at a domestic-violence shelter and experiencing an abusive relationship herself, she committed to some of the radical feminist ideology most often affiliated with the second-wave icon Andrea Dworkin, which is focused on the roots and prevalence of male violence. Eventually, her beliefs radicalized further: She became convinced that trans women are men and trans-rights activism is just another weapon of the patriarchy.*

Like many people’s, Fain’s political transformation was helped along by the internet. Though she’d never had much use for social media before, on Reddit she found a forum—or “subreddit”—where tens of thousands of members, predominantly women, were devoted to the insistence that trans women are not women. “I first found the community while I was still looking for answers,” Fain said. These women were asking the same questions that she was, going through the same uncomfortable situations with their friends, feeling the same moment of disenchantment. They had experienced the same guilt over breaking with their communities, and now they had one another.


Among other online feminists, the common name for this group Fain found is “trans-exclusionary radical feminists,” or TERFs. The name the community has chosen for itself is the somewhat more palatable “gender critical,” though, as other feminists often point out, that name means nothing; all feminism is critical of gender. TERFs constitute “a minority of a minority of feminists,” says Grace Lavery, a UC Berkeley literature professor and writer. Nevertheless, this tiny group has attracted a disproportionate amount of attention in the past several years, in large part thanks to social-media platforms. Anti-trans feminists have a presence in many mainstream online spaces, including Twitter, “radfem” Tumblr, the Black women’s beauty forum Lipstick Alley, and the British parenting forum Mumsnet.

On these sites and others, they use many of the same trolling tactics as other internet-based fringe political movements to disrupt conversation, skew reality, and make the internet another dangerous place for trans women through doxing and harassment. Anti-trans activists have used social media to call out specific trans women who use women’s bathrooms, for instance, labeling them “predators” and “pedophiles,” and promising to resist them by any means necessary—be it pepper spray or pistol. GLAAD has shown that these sorts of attacks have warped online discourse, turning focus away from discrimination and instead encouraging renewed debate about trans women’s bodies. (Fain insists that her views are “utterly mainstream” and “commonsense.” She denied that members of her community engage in doxing, harassment, or threats.)

For years, r/GenderCritical, the group Fain joined, was the internet’s largest and most recognizable anti-trans space, known on Reddit as a “major pipeline” into TERF ideology. That abruptly changed in June, however, when r/GenderCritical disappeared from Reddit. The cataclysmic events of 2020 had pushed all major social-media platforms into content-moderation crisis mode—compelling them to adopt a new dedication to removing misinformation and hate speech, adding friction to prevent harassment and viral conspiracy theories. Reddit responded to pressure from its users in the midst of the Black Lives Matter protests by introducing an overhauled content policy that contained specific rules about hate speech. Its implementation resulted in an automatic ban for r/GenderCritical. No warning. (The notorious pro-Trump forum r/The_Donald was removed the same day, along with about 2,000 other forums.)  

Fain framed the ban flatly as persecution. “They use the label hate speech to silence speech they don’t want,” she told me. “Radical feminism does not come from a place of hate, nor anything even remotely near it. Radical feminism comes from a place of love for women and girls.” Almost immediately, she joined a core group of r/GenderCritical members in an effort to rebuild what they lost. In about a month, they came up with Ovarit, a new, invite-only Reddit-inspired platform. They’ve transferred over archived threads that were preserved before the ban and started inviting women one by one to a more secluded space. Freed from the constraints of a major platform and unwanted attention from a broader internet public, the site was built not just as a safe space to “protect” themselves and “carry on as before,” one r/GenderCritical moderator wrote after the migration, but to become even bigger.

TERFs are far from the only banned communities that have taken matters into their own hands in this way. For years, the conversation about online moderation has been about pressing major social-media companies to take responsibility for what happens on their platforms. But now that these companies are finally doing so, reactionary alternative platforms such as Ovarit are popping up like mushrooms. Many of the exiled groups behind them have little in the way of shared ideology or politics, but they do share a fixation on the way they’ve been persecuted. And they raise a whole new set of questions about how to break down the internet’s structural penchant for hate.


The phrase online echo chambers generally refers to self-created silos on websites that are enormous. On Facebook, you can find yourself in a right-wing or left-wing bubble, but the other side is there, engaging with the same algorithmically accelerated trends, occasionally getting fired up enough to jump into a fruitless debate. Now, though, there are early signs that the bubbles are moving even further apart. Pundits and politicians on the right have been threatening to migrate en masse away from the Big Tech platforms they view as censorial, and set up shop on a “free speech” site such as Parler or Gab. Activists on the left, who have their own disdain for Big Tech, have long been at the forefront of the push for decentralized social networks such as Mastodon. Meanwhile, getting banned from a social-media platform and creating a knockoff of it is effectively a rite of passage for toxic groups at this point.

Groups of all kinds have created their own independent havens. When Reddit started moderating r/The_Donald, which had nearly 800,000 members at the time it was banned, the community created TheDonald.Win as a home for its racist memes and indecipherable blend of “irony” and hatred. The notoriously violent incel community was also banned and moved on to a hate site of its own. The men’s rights activists in r/TheRedPill weren’t banned—only quarantined, which means the group doesn’t show up in Reddit search—but they made a new site as well. Fain, who’s now an icon in the online TERF community, has made a whole constellation of radical-feminist platforms. She created the blogging platform 4W after she was banned from Medium. She created Spinster.xyz, which she says has about 14,000 users, as an alternative to Twitter, “in response to the many radical feminists who were being silenced or banned.”

To build Ovarit, Fain organized with the former moderators of r/GenderCritical and a handful of other collaborators in a Discord server. Making a new website from scratch would take too long, so they looked for a preexisting platform with open-source code. The team thought about using the open-source software behind Saidit, a popular Reddit alternative that hosts many banned Reddit communities—including the QAnon subreddit r/Pedogate and the snuff-film subreddit r/WatchPeopleDie—but it wasn’t secure enough.** (The group was worried about cyber attacks.) Another platform, Raddle, didn’t offer moderation tools, which would be important if outsiders ever came to Ovarit to cause trouble. They thought about Lemmy, a federated alternative to Reddit, which hosts the also-banned Reddit community dedicated to the popular left-wing podcast Chapo Trap House. Fain says that didn’t work out because the developers of Lemmy are “actively anti-feminist,” while the developers told me their code of conduct “contains a section against anti-trans bigotry, [which] means we wouldn’t help them in any way.”

Finally, Fain and the others settled on an open-source platform called Throat, run by the Argentinian developer Ramiro Bou. Throat was created in 2016 as an alternative to Voat, another Reddit alternative, which was hosting many of the most disgusting former subreddits and had already become unusably toxic—as might be expected of any site branded as a home for conversation too disgusting for 2015 Reddit. When I asked Bou about Ovarit’s use of his code, he told me, “They’re nice people,” and that they’re currently one of the most active communities on Throat.

So r/GenderCritical set up shop on a new instance of an alternative to an alternative to Reddit. Ovarit looks exactly like Reddit, except it’s purple, and subreddits are called “circles.” There is a circle called “Cancelled,” which is specifically for talking about “attempts … to silence those who speak out against the queercult.” There is a circle called “TransLogic,” which is specifically for talking about “misogynistic and illogical things trans activists say and do.” There are general-interest circles for talking about books, television, science, and knitting. There is a circle called “Radfemmery,” which is for memes and jokes about how much the people in it do not like trans women.

The tone of the discussions in most of the circles is insular and defensive. Much of it is about the way Big Tech is censoring radical-feminist thought by driving “wombyn”—a deliberately exclusionary term that prizes women with female reproductive organs—off their platforms, as well as the way the mainstream media has been taken over by a “tiny minority of men,” which is how Ovarit’s members refer to trans women. The plight of J. K. Rowling is revisited often.

In a practice carried over from Reddit, members are encouraged to share their conversion stories, which they confusingly call their “peak trans” moments. In a typical exchange, one woman explains that she came to Ovarit after dragging herself out of a trans-rights-oriented Tumblr community and falling down a YouTube rabbit hole; another replies that her story is extremely similar, right down to her discomfort with her previous social circle’s expectation that she be supportive of “men in lipstick.” (Many of these stories are told “with a sense of excitement, guilt, fear … it’s disturbing but thrilling,” Lavery, the UC Berkeley professor, told me. “All the usual stuff that people who get involved in extremist groups find.”) The users joke and bicker, as members of any political group do, and then they come back together—bonded by their shared experience of being unwelcome most anywhere else.

So far, the only major difference between Ovarit and r/GenderCritical is that here, nobody challenges the members. There are no outsider “trolls” butting into the conversation to tell them that they’re wrong. On Reddit, some women were uncomfortable being totally candid, Fain told me. But here, they can be themselves. “It was really hard to be on Reddit as a woman,” she said. “Now on Ovarit … It’s a big breath of fresh air.”


In its early years, Reddit was known as a platform for free-speech absolutism. (As was Twitter, which called itself “the free-speech wing of the free-speech party.”) But as the site got larger, it was pushed by its own users to realize what that really means. Completely unimpeded speech for some—those who want to express hate—inherently limits speech for others. Most hate speech is protected by the First Amendment, but that obligation doesn’t apply to social-media companies. Getting banned from Reddit is not a legal consequence of speaking; it’s a social one.

The goal of online moderation can be thought of in two ways. The point can be to reduce hate speech or extremism on Reddit, Facebook, or any other specific platform, which has gotten much easier to argue for in terms of business interests and consumer preference. Or it can be to limit the spread of these things across the internet more broadly, which is a much more abstract project. As radical communities multiply on the outskirts of the internet, whose responsibility is it to worry about them?

There is a case to be made that these communities should not be kicked off major sites in the first place. If you remove a group like r/GenderCritical from Reddit, that group will move on to a more lawless part of the web. The escalation of rhetoric there isn’t slowed by any platform rules, and it also isn’t hemmed in by any dissenting voices, says Luc Cousineau, an internet researcher at the University of Waterloo. There is not even the slightest pressure to dial back hateful speech in order to seem well intentioned and approachable. Keeping everyone near one another might come with a sort of “social content moderation,” Cousineau suggests. He hasn’t researched whether this works, but it’s an important point: You can’t de-radicalize anyone if they’re off in their own world.  

Lavery is sympathetic to this reasoning, but she emphasizes that keeping TERFs in close proximity to trans women comes with a severe price: continued harassment, bad-faith attacks, and implied or explicit threats of violence. Lavery has spent years observing the way anti-trans activists target and terrorize trans women, including herself. “Trans people deserve to be online,” she told me. This is often unbelievably difficult. The journalist Katelyn Burns wrote last year about how the internet is weaponized against trans people. She had personally been doxed, been harassed on Twitter, and watched members of r/GenderCritical dig up and mock pictures taken of her with her children before her transition.

So, the banning approach. The research about what happens when toxic groups are removed from Reddit is limited, but encouraging. Hate speech across the site went down after a purge of such communities in 2015, which made the site more usable for a more diverse group of people. A recent study of the new off-Reddit platforms for r/The_Donald and r/Incels found that the number of people who use those sites is substantially smaller than the number of people who participated in their respective subreddits, and growth is much slower. Without Reddit, these extremists struggle to recruit.

Still, there is reason to be concerned about what happens to the extremists who already exist when a group is banned. The collective identity of r/The_Donald was built around a shared fixation on external threats—mostly Muslims and leftists—according to a recent study. The “us versus them” attitude was requisite for feeling a sense of belonging there. After r/The_Donald was banned from Reddit, another group of researchers looked at how this attitude fared on their new site, TheDonald.Win, and found that it was exacerbated. The team traced the usage of words such as we and they, as well as instances of keywords they had added to a “fixation dictionary,” which could be used to approximate the toxicity of a community. “There are substantial increases in how toxic they became once they left the platform,” Manoel Ribeiro, a researcher at the Swiss Federal Institute of Technology and one of the study’s authors, told me. “There seems to be a trade-off.”

On Ovarit, us-versus-them language is everywhere. “They’ve been working for years now to censor and steer conversation on social media,” one recent comment read, underneath a post that warned “Be careful as they’re setting traps for us.” (The “they” refers to outsiders back on Reddit.) Ovarit may remain small, but the users who stay are spending more and more time talking about how they’ve been oppressed by popular opinion on trans women’s right to exist. As they spread out on the new site, there are entire discussion spaces reserved for these feelings. Many of the top-voted posts on the home page are screenshots cherry-picked from the outside world for ridicule or disdain. (“They’re self-aware lmao,” one user wrote recently above a screenshot of a playful meme ostensibly made by a trans woman about the transition experience.)  

If you spent hours a day on this site, it would be easy to forget what that broader world is really like. It would be easy to forget what other people are really like, too, and to lose any curiosity about what they experience.

After we spoke, I sent Fain a link to a thread on Ovarit, in which women were discussing their disdain for Transgender Day of Remembrance, an annual observance dedicated to the memory of people who were killed by anti-trans violence. In 2020, at least 40 such deaths have been recorded so far. “How many fucking invented holidays do they have at this point?” one asked. “They should change it to Every Day is a Trans Day because they don’t let us stop reading or hearing about them for even a minute,” wrote another.

I asked Fain if this kind of mocking, angry speech was concerning to her at all, and she wrote back to say no. “As I’m not a [moderator] on Ovarit, I don’t feel I’m in the best position to comment on specific content,” she said. “More generally, though, I think humor and anger are both very common ways for people to deal with pain and oppression.”


* This article has been updated to better reflect Fain’s beliefs.
** This article previously misstated that Fain and her team considered moving to Saidit. In fact, they considered using the open-source software behind Saidit.