Not a Free Speech Issue—Milo Was Banned for Abuse

Last week Twitter took the rare step of permanently banning a prominent user. Abuse and bullying are common in many online communities, but this time Milo Yiannopoulos went too far for Twitter. Yiannopoulos, an advocate of #GamerGate, the harassment campaign against women in the gaming industry, had been temporarily banned in the past for similar behavior. This time his target was Saturday Night Live cast member Leslie Jones. The abuse started in response to her appearance in the newly released Ghostbusters movie. Once papa-troll Yiannopoulos attacked, all of his loyal minions joined in his campaign. (Warning: Twitter screenshots included in the hyperlinked Jezebel story contain offensive racial slurs.)

During a recent appearance on Late Night with Seth Meyers, Jones explained that the abuse itself didn’t worry her; she was used to that. What bothered her was the “injustice of a gang of people jumping against you for such a sick cause.” The amount of anger and aggression Jones received that day was indeed excessive and disturbing. But this particular incident also points to a broader problem across social media communities.

Most social media platforms don’t provide users with much, if any, protection against abuse. Anyone who’s experienced online harassment can probably agree that reporting online bullies is usually a waste of time, as they rarely face any repercussions for their behavior. In the debate over such protections, one side insists that bullying, unpleasant as it is, is a form of expression, and that people should have the right to express themselves however they’d like. Others believe that bullies should be taught a lesson in manners and communication by being banned from certain communities.

So, should social media communities be able to ban people and delete comments that are deemed abusive or violent? Yes. They should. Distinguishing when a comment crosses from tasteless nonsense into outright abuse can be tricky, but that discernment should be left to the discretion of the page and, ultimately, the platform. And while we should all protect our freedoms of speech and expression, others should be free to interact with communities without being hatefully attacked.

As a secular organization, the American Humanist Association is often seen as radical, ultra-liberal, or even a terrorist group. So, as you can imagine, we’re no strangers to daily abuse. While we are normally trolled by evangelical warriors, our new focus on social justice through the Black Humanist Alliance, the LGBTQ Humanist Alliance, and the Feminist Humanist Alliance has opened us up to a new world of abuse, some of it from within our own secular community. As a social media community manager, I can attest to this. It’s difficult to witness abuse on our page, especially abuse directed at another follower, and do nothing. In some cases, banning harassers is warranted.

Since the Jones incident, Twitter has decided to make changes to its abuse policy. In a statement made last week, Twitter officials explained their decision:

People should be able to express diverse opinions and beliefs on Twitter. But no one deserves to be subjected to targeted abuse online, and our rules prohibit inciting or engaging in the targeted abuse or harassment of others. Over the past 48 hours in particular, we’ve seen an uptick in the number of accounts violating these policies and have taken enforcement actions against these accounts, ranging from warnings that also require the deletion of Tweets violating our policies to permanent suspension.

We know many people believe we have not done enough to curb this type of behavior on Twitter. We agree. We are continuing to invest heavily in improving our tools and enforcement systems to better allow us to identify and take faster action on abuse as it’s happening and prevent repeat offenders. We have been in the process of reviewing our hateful conduct policy to prohibit additional types of abusive behavior and allow more types of reporting, with the goal of reducing the burden on the person being targeted. We’ll provide more details on those changes in the coming weeks.

Sadly, Jones wasn’t the first and she won’t be the last to experience such extreme online abuse. As huge online communities like Twitter and Facebook evolve and grow, their member conduct policies need to reflect the world around them. They need to protect the targeted individuals, not the bigots.