YouTube Kids has ramped up its strategy to keep inappropriate videos away from children, and it involves the help of the platform’s users. A new age-restriction process will help to make sure children won’t be exposed to popular kids’ show characters portrayed in disturbing videos.
Not All Peppa Pig Is Real
Recently, a number of Peppa Pig-themed videos made their way into the YouTube Kids app. They were not produced by the official creators of the popular children’s show. They were parodies that featured the title character in violent scenes at the dentist or drinking bleach. Clearly, these videos are not suitable content for kids, but their tags and keywords included the character’s name, fooling the algorithm into thinking they were meant for children.
This type of issue has been a problem since the YouTube Kids app launched in 2015, but it has become increasingly prevalent of late. While the algorithm can screen out a large portion of violent, sexually explicit, and otherwise inappropriate content, some of those videos are still reaching kids through the family-friendly application.
YouTube Needs Your Help
Google’s video streaming service has, therefore, boosted its efforts to stop kids from having access to those videos by asking for viewer participation. The new policy allows YouTube and YouTube Kids users to flag inappropriate videos aimed at children for age restriction. Any video that has been age-restricted will remain on the platform and be permitted in the main YouTube app, but will be blocked from YouTube Kids.
“Age-restricted content is automatically not allowed in YouTube Kids,” explained YouTube in a statement. The platform has had this policy in the works for a while, and it arrives at a time when such a process is particularly relevant.
Age-Restricted Videos With Kids’ Characters Are Demonetized
This is not the first step the company has taken recently to address the problem. In August, YouTube announced that it would further support its algorithm’s screening process by demonetizing videos portraying popular kids’ show characters in scenes of violence, sex, drugs, or other inappropriate themes.
The goal was to discourage creators from sharing those videos in the first place. Even so, such videos continued to appear and were still finding their way onto YouTube Kids.
YouTube’s policy now sends new videos to the main app first, allowing viewers to watch them and, where appropriate, flag them for age restriction. That way, the algorithm has additional assistance from real viewers in deciding whether or not a video is child-friendly.