Updated YouTube Kids security policy lets viewers flag inappropriate content


The children’s branch of Google’s massively popular video streaming service now lets viewers participate in keeping inappropriate content away from kids.

A new YouTube Kids security policy makes it possible for viewers to flag inappropriate videos targeting children. Service users can now call YouTube’s attention to disturbing and otherwise child-unfriendly videos that use popular children’s characters among their keywords.

This feature is meant to let users help age-restrict certain kinds of videos so that they remain available only in the main YouTube app.

Once content is age-restricted, it remains viewable within the main app, but YouTube Kids security filters will keep it from showing up in the brand’s children’s application. “Age-restricted content is automatically not allowed in YouTube Kids,” said a statement from YouTube.

That said, participation in flagging videos for age restriction is not limited to the children’s app. Users of the main YouTube app can also flag inappropriate content to help keep it out of the Kids app entirely.

While some may believe that this is another step being taken by the video streaming platform in response to recent issues making media headlines, the company insists the two are not related.

The new YouTube Kids security feature has been in the works for quite some time, the company said.

Previously, YouTube had been using its algorithm to screen out as many of the inappropriate videos as it could. However, it rapidly became clear – particularly in recent weeks – that this was not adequately keeping inappropriate content off the children’s platform. For instance, one of the prime examples cited in recent media stories was a video featuring Peppa Pig – an extremely popular kids’ show character – drinking bleach. Similarly, another video showed that same character having her teeth violently extracted by a dentist.

Clearly, these were not official videos made by the creators of the Peppa Pig show. However, when these parodies were created and labeled with tags mentioning the character’s name, the previous YouTube Kids security screening wasn’t catching them. To the algorithm, they were indistinguishable from legitimate clips from the show. Since this summer, the platform has demonetized videos making “inappropriate use of family characters,” but this additional step is meant to ensure that children don’t see the videos that are still being made.
