Like most kids, my toddler son loves Paw Patrol. He will sing the theme song randomly or call out "Chase" whenever he sees the police pup. And almost every night, my little dude will crawl onto the couch and wait for me to change the channel to Nick Jr. so he can watch the show. I can't imagine, though, how he'd react if we accidentally stumbled upon one of those creepy knock-off YouTube videos. Luckily, YouTube has introduced a new policy to deal with those fake Paw Patrol videos.
According to Parents, YouTube will now put age restrictions on videos that make inappropriate use of recognizable children's characters from shows and movies like Paw Patrol, Peppa Pig, Frozen, Thomas the Tank Engine, Spider-Man, and more. That means the company will age-restrict content that's flagged in the YouTube main app, automatically blocking it from sneaking into the YouTube Kids app, which is watched by millions of children under 13 years old. Earlier this month, the New York Times reported that YouTube videos featuring favorite kids' characters in disturbing situations continued to slip through the cracks and land in the kids' app.
Juniper Downs, YouTube's director of policy, said in an email statement to Romper:
We’re in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right.
In other words, as Mashable pointed out, YouTube is counting on its users to flag those fake — and creepy — videos so the content doesn't filter into the YouTube Kids app. (According to Mashable, videos on the main YouTube app may take several days to reach the kids' section.) So if a parent watching YouTube comes across a video of Peppa Pig being tortured, they would flag that video within that window, sending it to YouTube's policy review team.
According to the Verge, which first reported the story, YouTube's team of moderators will then review content flagged on YouTube proper and place age restrictions on videos found in violation of the new policy. This practice, YouTube told the Verge, will roll out in the coming weeks. Once flagged content is age restricted in YouTube's main app, users can't view the videos unless they're logged in to accounts registered to users age 18 or older.
But the age restriction isn't the only line of defense against these bizarre videos. YouTube told Romper that the company uses machine learning and algorithms to first filter out inappropriate content before it reaches the kids' app. The YouTube Kids app also has its own team of moderators who sift through flagged content.
YouTube told Romper that less than .005 percent of videos viewed in the YouTube Kids app were flagged for being inappropriate. The company also stressed that the policy change is an added layer of protection for kids' app content that goes beyond the algorithmic filters and parental controls.
The new policy also expands on actions the company has already taken against the makers of these disturbing videos. In August, YouTube began to restrict users from receiving advertising revenue for "content featuring inappropriate use of family entertainment characters," Downs said in the statement to Mashable.
As a parent, I appreciate the measures YouTube is taking to rectify what is really a horrifying and traumatizing situation. Yet I am also skeptical that this new policy will actually solve the problem. Relying on users to flag inappropriate content is idealistic at best, misguided at worst. And, of course, algorithms are far from perfect. But I guess I and other parents will have to wait and see what comes of this.
For now, give YouTube Kids a few days for the new policy to kick in, and hopefully some of the videos will be filtered out. In the meantime, the PBS Kids app is pretty cool, too, if you ask me.