Published: Mon, July 11, 2016
Tech | By Dwayne Harmon

Facebook Explains Live Video Standards After Minnesota

Until this week, the most-buzzed-about Facebook Live videos were a BuzzFeed experiment to blow up a watermelon with rubber bands and a Texas woman's hysterical reaction to a Chewbacca mask. That changed when Diamond Reynolds streamed the aftermath of the fatal police shooting of Philando Castile in Minnesota. But given how inconsistently Facebook already enforces its community standards, and the further leeway it appears to give live video, there is little indication of how the company will handle such footage going forward.

Facebook and Twitter both have standards limiting the violent content users can post on their sites.

Lower on the priority list, but still important, the video also forced Facebook to take a more active stance on the kind of content it allows on its social network.

The Community Standards are the set of guidelines Facebook uses to determine whether content is acceptable for the community at large. In cases involving violence, context and degree are everything. "For instance, if a person witnessed a shooting, and used Facebook Live to raise awareness or find the shooter, we would allow it. However, if someone shared the same video to mock the victim or celebrate the shooting, we would remove the video".

There is no option to report content as "graphic but newsworthy", or any other way to flag content as disturbing without also asking that it be taken down, the report said.

The Castile video quickly went viral, but its temporary disappearance mid-stream sparked claims that Facebook had pulled it in response to numerous complaints, or that police had deleted it after taking possession of Reynolds' phone. Although it was eventually restored with a warning banner, many questioned whether the site had originally removed the video on purpose.

In a recent blog post, Facebook executive Adam Mosseri wrote, "We are not in the business of picking which issues the world should read about".

"The images we've seen this week are graphic and heartbreaking, and they shine a light on the fear that millions of members of our community live with every day". On Facebook, there are over 1.8 billion users and these videos are by default public; at the very least your friends circle will see this "live video". Just as it gives us a window into the best moments in people's lives, it can also let us bear witness to the worst. Facebook and Twitter - corporations with shares traded on the NASDAQ and the New York Stock Exchange - are responsible first and foremost to their shareholders; their ultimate goal is capturing our attention and keeping it so they can show us advertisements.

But we do know the primary ways Facebook flags and takes down content: automated algorithms and human moderators.

Periscope has already demonstrated that some people are willing to commit atrocities just to live-stream them. Facebook, for its part, says it has a team on call 24 hours a day, seven days a week, dedicated to responding to these reports immediately. One improvement has been the ability to interrupt a flagged live stream if it violates the company's rules.
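To make that two-track process concrete, here is a minimal, purely illustrative sketch of a moderation pipeline of the kind described above - automated scoring first, ambiguous cases routed to an around-the-clock human review team, and the option to interrupt a flagged live stream. Facebook has not published its implementation; every name, threshold, and function below is hypothetical.

    # Purely illustrative: Facebook has not published its moderation system.
    # All names and thresholds below are hypothetical.
    from dataclasses import dataclass
    from queue import Queue

    @dataclass
    class FlaggedContent:
        content_id: str
        is_live: bool
        report_reason: str  # e.g. "graphic violence"

    def classifier_score(content: FlaggedContent) -> float:
        """Stand-in for an automated classifier; returns a fixed score here."""
        return 0.6

    def interrupt_stream(content_id: str) -> None:
        """Hypothetical hook for cutting off a live broadcast."""
        print(f"interrupting live stream {content_id}")

    # Queue drained by human moderators - the article says the team is
    # on call 24 hours a day, seven days a week.
    human_review_queue: Queue = Queue()

    def triage(content: FlaggedContent) -> str:
        score = classifier_score(content)
        if score > 0.95:
            # High-confidence violation: remove automatically and, if the
            # content is still streaming, interrupt the broadcast.
            if content.is_live:
                interrupt_stream(content.content_id)
            return "removed"
        if score > 0.50:
            # Ambiguous cases - where "context and degree are everything" -
            # go to human review rather than automated removal.
            human_review_queue.put(content)
            return "queued_for_human_review"
        return "no_action"

    # Example: a reported live video lands in the human review queue.
    print(triage(FlaggedContent("v123", is_live=True, report_reason="graphic violence")))

The split reflects the distinction Facebook's own quotes keep drawing: clear-cut violations can be removed mechanically, but judgment calls about context and intent go to people.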

The quotes became especially interesting when it came to the grounds on which users can report content.

The company noted that violent and graphic images often generate the most controversy and offense.

Reynolds' live video put Facebook in the position of delivering crucial information about a politically and emotionally charged moment, and the company did not handle it entirely smoothly. "We've learned a lot over the past few months, and will continue to make improvements to this experience wherever we can".
