Published: Thu, April 26, 2018
Research | By Jennifer Evans

Facebook finally makes it clear what will get you banned


"Reducing the spread of false news on Facebook is a responsibility that we take seriously".

The guidelines encompass dozens of topics including hate speech, violent imagery, misrepresentation, terrorist propaganda, and disinformation.

The guidelines are global and will be released in 40 different languages.

"You should, when you come to Facebook, understand where we draw these lines and what is okay and what's not okay", Facebook's vice president of product policy and counter-terrorism Monika Bickert, a former U.S. federal prosecutor, told reporters on Tuesday. In 2016, Facebook removed a Pulitzer Prize-winning photo of a naked girl fleeing a napalm attack in Vietnam in 1972.

Reading through the guidelines gives you an idea of how hard a Facebook moderator's job must be. The company has faced backlash in the past over how difficult it is to get in touch with Facebook and argue that a takedown was perhaps a little harsh. Bickert said discussions about sharing the guidelines started last fall and were not related to the Cambridge Analytica controversy. The standards don't cover certain content around state-sponsored violence or political agendas, which would be removed under other policies, such as the graphic violence policy. They have ballooned from a single page in 2008 to 27 pages today.

Facebook announced the publication of the internal documents used by its moderators, as well as an expanded process for appealing decisions on banned posts.

Facebook representatives declined to respond to Ars' request for comment on the record, insisting that we speak to them only on background.

The content policy team at Facebook is responsible for developing its Community Standards. The appeals process, however, can sometimes take a very long time, if it happens at all.

Another big change to Facebook's user policy is that users can now approach the company if they feel that content has been removed unfairly. You can't post any form of official document that might reveal another person's identity (e.g. a driving licence or bank statement).

"At the time they told us they could not do it, they would not do it, and actually stopped engaging at that point", Cyril said. They can even get around ...

She added: "Everybody should expect that these [policies] will be updated frequently". The social network, which has around 2.2 billion users, doesn't want to jump the gun on posts that are not necessarily advocating real harm. Beyond user reports, additional posts are flagged by Facebook's automated systems. If Facebook agrees that a mistake has been made, the post will be restored. A Facebook executive said the teams were working on building more tools.

While Facebook aspires to apply its policies "consistently and fairly to a community that transcends regions, cultures and languages", it frequently fails to do so.

It's all part of Facebook's attempt to exercise better, more transparent control over what's going on across the site, particularly in the wake of controversies around its role in the 2016 U.S. presidential election.

Yesterday, the site shared the six categories of violations that can get you banned from Facebook. The company says 99% of the ISIS and al-Qaeda content it removes is taken down before any user reports it.

Facebook also removes all child exploitation imagery, pornography, and nude images in general. And Facebook isn't overly keen on content related to terrorism, human trafficking, or organized violence either. A policy team meets every two weeks to review potential additions or edits, and is required to list the organizations outside Facebook with which it consulted.
