What Every Parent Needs To Know About YouTube's Huge Changes To Protect Children

by Josie Rhodes Cook

If you're a parent who has let your kid watch videos on YouTube, you may have heard that some disturbing videos and comments have been getting past "family-friendly" filters on the site recently. Now, several major advertisers have stopped running ads on the site after learning that their ads were sometimes paired with offensive content directed at kids. That's a big reason YouTube is making some changes to better protect kids who may be using the site — changes that include pulling hundreds of accounts, a ton of videos, and numerous channels with "unacceptable" content.

Over the past week, YouTube reported that it has “terminated more than 270 accounts and removed over 150,000 videos from our platform,” according to Vice News. The company has also turned off comments on more than 625,000 videos targeted by child predators, The Verge reported.

A YouTube spokesperson told Romper in a statement:

In recent months, we've noticed a growing trend around content on YouTube that attempts to pass as family-friendly, but is clearly not. While some of these videos may be suitable for adults, others are completely unacceptable, so we are working to remove them. As a result, we’ve taken aggressive action, including removing over 150,000 videos from our platform that do not belong on YouTube, turning off inappropriate comments, terminating hundreds of accounts and adding age-gates for videos that are clearly geared towards mature audiences. Beyond that, to reduce the incentive to produce this content, we’ve made over 2 million videos, including 50,000 entire channels, ineligible for ads.

The move came after companies like Adidas, Mars, Hewlett-Packard, and other big names paused advertising on YouTube following reports that their ads were appearing "alongside sexually explicit comments under videos of children," according to Vice News. And YouTube has reportedly figured out how some of the offensive content slipped through. Apparently, the tools the site uses to screen inappropriate comments haven’t been working correctly for over a year, according to volunteer moderators who spoke with the BBC.

Because of that bug, an estimated 50,000 to 100,000 “predatory” accounts remained on YouTube, Vice reported.

In a statement posted to the official YouTube blog, the team behind the site laid out what it has done, and what it plans to do, to block unacceptable content that tries to pass as "family-friendly" in the future.

The steps include "tougher application of our Community Guidelines and faster enforcement through technology," including expanded guidelines for removing content featuring minors that may be endangering a child, "even if that was not the uploader’s intent," the blog read. YouTube is also working to remove ads from "inappropriate videos" that target families.

Since June, YouTube says it has removed ads from 3 million videos under that policy and has strengthened how the policy is applied. That's a start, at least.

YouTube is also beginning to be even more aggressive about turning off all commenting on videos of kids that attract "inappropriate sexual or predatory comments," the statement said.

Johanna Wright, the Vice President of Product Management at YouTube, has a personal stake in the effort because, as she wrote in the statement, she's a parent herself. At the end of the statement, she wrote:

We’re wholly committed to addressing these issues and will continue to invest the engineering and human resources needed to get it right. As a parent and as a leader in this organization, I’m determined that we do.

Hopefully, all these changes will lead to less "exploitative content" featuring children in distress or abusive situations on YouTube.

And while that is certainly an important and worthy goal, there's still the matter of other disturbing content on the platform that parents have brought to YouTube's attention in the past: namely, those creepy "knock-off" videos featuring characters from beloved kids' shows like Paw Patrol that end up on YouTube Kids.

That same blog post from YouTube says the company is working to "age-restrict...content with family entertainment characters but containing mature themes or adult humor," so that it's only available to logged-in users over 18. That content likely includes any videos in which characters from shows like Paw Patrol or movies like Frozen do things that could be disturbing to children.

A YouTube spokesperson told Romper:

Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization. We’re rolling out a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right.

When The New York Times reported on those creepy videos slipping past the YouTube Kids security filters, Malik Ducard, YouTube’s global head of family and learning content, said "parents are in the driver's seat" when it comes to finding inappropriate content. He told the Times that parents are encouraged to report inappropriate videos, which someone at YouTube will then manually review.

But if YouTube knows these things are a problem, especially on such a massive scale that millions of videos on the platform are affected, it should be up to YouTube to fix the issue, not parents. Parents can help, of course, but it's the company's responsibility to curb the problem as much as possible even without their help. At least these changes sound like a step in the right direction.
