Published: Sat, November 11, 2017
Research | By Jennifer Evans

YouTube is fighting a wave of disturbing videos aimed at kids


The company said it was introducing a feature that would stop flagged videos from being shown in the YouTube Kids app.

Back in August, the company rolled out a new policy barring users from earning advertising dollars for the inappropriate use of family-friendly characters, such as Elsa and Spider-Man.

But one search for Peppa Pig on YouTube turned up a video called "Cocaine pancakes" with almost 1 million views.

In tandem with YouTube's policy-review team, YouTube Kids relies in part on parents to identify content that shouldn't be in front of children. Users can also create a profile for each kid and choose between younger or older content levels to manage the types of videos they can watch. "But no system is flawless," Balaji Srinivasan, YouTube Kids engineering director, wrote in a blog post last week.

For now, YouTube is rolling out a new process that gives parents more control over, and information about, the content their kids can access. The disturbing videos appear to be part of the lucrative online world of ad-revenue farming, and many parents have criticized YouTube for failing to keep them off the YouTube Kids app, which is specifically meant to protect children from potentially troublesome videos.

Following revelations that YouTube is serving thousands of inappropriate and disturbing videos to kids, the company said it will step up its efforts to prevent children from seeing such content.

The videos included Peppa Pig in frightening situations, such as drinking bleach or getting teeth pulled out at the dentist.

The YouTube Kids app is now available in 37 countries and has more than 11 million weekly active viewers, according to the Google-owned video platform. Videos that are flagged are then reviewed by a team of human moderators.

The question of what defines "inappropriate" content is still an open one. In the past 30 days, the platform said, only 0.005% of videos that made it past automated filters were then flagged by users. But the company did not detail where exactly the line would be drawn, nor provide examples of previously unrestricted videos that would now be flagged.

But as Bridle and others noted, defining what, precisely, renders a video such as "BURIED ALIVE Outdoor Playground Finger Family Song Nursery Rhymes Animation Education Learning Video" disturbing and upsetting in a way that a Tom and Jerry cartoon isn't is hard for algorithmic and human moderators alike.

YouTube says it has been working on the policy for a while, and that practices were not revised due to scrutiny in the media.
