YouTube has announced that it will "age restrict" specific videos on the platform after inappropriate clips targeting children slipped through the site's filters.
According to The Verge, the clips “make use of popular characters from family-friendly entertainment, but it’s often created with little care, and can quickly stray from innocent themes to scenes of violence or sexuality.”
After The Verge reported on the matter in February, YouTube announced in August that it would bar creators from monetizing videos that "made inappropriate use of family-friendly characters."
As a follow-up, YouTube director of policy Juniper Downs said early this year: “We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged.”
“Age-restricted content is automatically not allowed on YouTube Kids.”
YouTube, however, said the new policy had been on the drawing board for some time and was not in response to reports of bizarre videos on the platform aimed at youngsters.
The video site launched YouTube Kids in 2015 to offer content suitable for children, but according to TechCrunch, "gruesome videos portraying sex, drugs, and violence have been sneaking their way in for some time."
YouTube had addressed the issue by relying on its algorithm to remove inappropriate content, "but that clearly hasn't been working," TechCrunch said.
One example, cited in a Medium post on the subject, showed the cartoon character Peppa Pig drinking bleach; another clip showed Peppa getting her teeth violently yanked out at the dentist. The clips were apparently not made by Peppa Pig's producers, TechCrunch added.
Meanwhile, YouTube said the new policy "would serve as an extra layer of protection beyond the filters already in place."
YouTube has cracked down on inappropriate videos aimed at children.