Published: Tue, May 23, 2017
Economy | By Melissa Porter

Leak reveals how Facebook deals with controversial content
A new report from The Guardian shows how Facebook (FB) is trying to curb the spread of violent, abusive and objectionable posts on its sprawling social media network, which now has nearly two billion users worldwide.

Facebook Live is now a big part of the Facebook experience, but the Wall Street Journal reports that at least 50 acts of violence have been broadcast since the feature launched. A series of presentation slides details how Facebook deals with reports on posts, comments and videos that violate its sitewide rules.

However, generic posts stating someone should die are permitted as they are not regarded as credible threats, the newspaper claims.

Users are permitted to live-stream self-harm, because Facebook "doesn't want to censor or punish people in distress who are attempting suicide".

Speaking to The Guardian, Facebook head of global policy management Monika Bickert said it was hard to reach a consensus on what to permit.

Facebook told The Guardian that while it was using software to intercept some graphic content, "we want people to be able to discuss global and current events", so the context in which a violent image is shared sometimes matters.

Facebook CEO Mark Zuckerberg announced earlier this month that the company was hiring 3,000 more people to help "review the millions of reports we get every week".

"Keeping people on Facebook safe is the most important thing we do".

Statements of intent to commit violence against heads of state, such as "someone shoot Trump", should be deleted. Videos shared on Facebook have brought various cases of police brutality to the public's attention, and while such footage is disturbing, it may be helpful in exposing social problems.

"Generally, imagery of animal abuse can be shared on the site".

Facebook has said in the past that it is in a unique position to do more about the suicide epidemic. A leak of what has been dubbed The Facebook Files gives a fascinating insight into how the company moderates content, shedding light on just what its secret internal guidelines are. For example, statements like "someone shoot Trump" would be eligible for deletion; "let's beat up fat kids" would not.

Another example relates to violent language, which Facebook deems against the rules only if the specificity of the language makes it seem like it is "no longer simply an expression of emotion but a transition to a plot or design". However, such content would be deleted once there was "no longer an opportunity to help the person". Moderation guidelines for other countries will be required to follow local laws. Videos of violent deaths are also allowed on the premise that they may help inform or raise awareness about a certain issue. Handmade art showing nudity is fine, but digital art containing nudity might be removed.

However, the leaked documents also show that the company is taking measures to improve its policies, even if they are only implemented following public pressure.