Published: Wed, December 06, 2017
Economy | By Melissa Porter

YouTube to hire more screeners to seek out offensive video

Google is also taking a hard line on YouTube's comments section, rooting out abusive and hateful comments left by everyone from cowardly cretins spewing bile from behind the safety of their computers to sexual predators attempting to use YouTube as a means to manipulate people. The company's machine-learning technology is already used to remove violent extremist videos. Adidas has said the situation is "completely unacceptable", while Mars, along with other companies, has pulled advertising until safeguards are in place.

In an effort to tackle the issue, YouTube has developed software to identify videos linked to extremism. It's unclear, however, how the expansion of moderators announced on Monday might affect this kind of content, since YouTube said it was focused on hate speech and child safety. Some uploaders have raised concerns about the software flagging legitimate content; posters whose videos are flagged may become ineligible to earn ad revenue. "Since June, our trust and safety teams have manually reviewed almost 2 million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future", wrote YouTube CEO Susan Wojcicki.

She said adding more people to identify inappropriate content will provide more data to train, and potentially improve, the company's machine-learning software.

"It's important we get this right for both advertisers and creators, and over the next few weeks, we'll be speaking with both to hone this approach". "We've heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don't demonetize videos by mistake".

A Times investigation last month revealed that YouTube videos showing "scantily clad children" had attracted comments from hundreds of paedophiles, in some cases encouraging the children to perform sex acts.

The YouTube CEO also stated that the company had developed machine-learning technology capable of weeding out radical content on the platform, where hundreds of hours of video are uploaded every minute. To date, 98% of the videos removed for violent extremism have been flagged by machine-learning algorithms, and 70% are taken down within eight hours of upload.