The Many Ways Twitter Is Bad at Responding to Abuse

Contrast this to the procedure for reporting spam. To report spam, a user must click a button that says “This account is spam.”

That’s it. Twitter is oddly unconcerned about false or unauthorized reports of spam: There are no questions about the user’s involvement with the alleged spam, no requirement to provide links or explain how the content qualifies as spam, no requirement of a signature, no need to fear retaliation from the reported spammer.

As Amanda Hess wrote recently in Slate, Twitter essentially tells users who are being harassed “to shut up.” Twitter sagely advises users who are being harassed that “abusive users often lose interest once they realize that you will not respond.” Forcing victims to adjust their behavior is rarely the right response to acts of willful abuse by others. Indeed, Twitter seems to believe that abuse on its service primarily involves junior high schoolers sniping at each other. The service recommends:

If the user in question is a friend, try addressing the issue offline. If you have had a misunderstanding, it may be possible to clear the matter up face to face or with the help of a trusted individual.

This simplistic perception of abuse clearly underlies Twitter’s belief in the effectiveness of the block function. Blocking means that a person being harassed will no longer see abusers’ tweets in their mentions. That is all it does: It does not actually prevent an abuser from tweeting at his target, nor does it stop the abuse from being visible to anyone else; it just means the target can’t see it anymore. This is the equivalent of responding to someone yelling in your face as you walk down the street by putting on a blindfold and earplugs.

Twitter demonstrates a fundamental lack of understanding of the dynamics of social media abuse by touting block and mute functions as adequate measures to respond to abuse. People who use social media to abuse other people do so at least in part because of its public dimension: They want not only to force themselves into their targets’ line of vision, but to ensure that other people see them doing it. That’s part of the harasser’s game—not merely to attack an individual, but to attack, discredit, and humiliate a target in front of as large an audience as possible.

In her powerful article, “Why Women Aren’t Welcome on the Internet,” Amanda Hess detailed how online harassment is disproportionately targeted at women, and argued that:

this type of gendered harassment—and the sheer volume of it—has severe implications for women’s status on the Internet. Threats of rape, death, and stalking can overpower our emotional bandwidth, take up our time, and cost us money through legal fees, online protection services, and missed wages.

The impact of online harassment is not limited to the victims. Like sexual harassment in workplaces, in schools, and on the street, this abuse can drive women out of public spaces and inhibit their contributions to public discourse. All of society suffers from the loss of women’s and girls’ voices in professional, creative, and social life. Harassment directed at racial and sexual minorities has similar effects, depriving social spaces of the diversity and innovation those groups might offer.

This isn’t just a Twitter problem, of course. Gawker Media, Facebook, and Google have also come under fire for inadequate responses to abuse. The question on many people’s minds is why social media superpowers cannot—or will not—design their platforms to optimize creativity and exchange instead of being swallowed up by the dark noise of abusers and trolls. After all, we have seen what their power looks like when they choose to exercise it. After learning that “mug shot websites” were extorting money from individuals facing financial, professional, and personal ruin as a result of the prominent display of their arrest records in search engine results, Google changed its search algorithm to push the results down. Major credit card companies responded as well, terminating the accounts of the mug shot sites so that they could not receive payments.

Twitter might now devise structural changes to its platform to help facilitate meaningful interaction while discouraging mob mentality. It shouldn’t have taken a public attack on a beloved celebrity’s memory and family to rouse any of these powerful companies from their slumber, but if this is the proverbial straw, let us hope that they will respond with wise, thoughtful, and lasting corrections to the architecture of their platforms. As Lessig wrote in 2000:

Our choice is not between “regulation” and “no regulation.” The code regulates. It implements values, or not. It enables freedoms, or disables them. It protects privacy, or promotes monitoring. People choose how the code does these things. People write the code. Thus the choice is not whether people will decide how cyberspace regulates. People—coders—will. The only choice is whether we collectively will have a role in their choice—and thus in determining how these values regulate—or whether collectively we will allow the coders to select our values for us.

All of the major tech companies claim to be built around the interests of “users” to promote robust and diverse interaction. The question, then, is who counts as “users.” By rewarding abusers and isolating targets of abuse, these platforms are driving away creative, valuable users in favor of malicious, repressive users. This isn’t just a heartless response; it will likely turn out to be a self-defeating one.

Mary Anne Franks is an associate professor at the University of Miami School of Law and the vice-president of the Cyber Civil Rights Initiative.
