As parents, we want our kids to remain as safe as possible online, whether they're watching YouTube videos, chatting with family members on Facebook's Messenger Kids, or looking up information for a research paper. Unfortunately, it's difficult for parents to protect them from everything, especially when it comes to websites and apps collecting personal data. That's why it's a bit unnerving as a parent to hear that Google has agreed to pay up to $200 million over allegations that it collected data on children.
But first, let's get a few things straight. The Children's Online Privacy Protection Act (COPPA), which the Federal Trade Commission (FTC) enforces, prohibits online services from collecting personal information from kids under 13 without parental consent. (This includes details like a child's contact information, birth date, photos, location, and more.) COPPA also forbids companies from collecting personal data from children and then feeding it into their advertising algorithms and programming, according to The New York Times, and it prohibits kids' apps from using "persistent identifiers" to target kids with ads based on their behavior.
So you can imagine how parents must have felt when it was revealed YouTube was under investigation for doing all of the above. (It's worth noting YouTube Kids doesn't allow ads based on behavior.) And now Google — which owns YouTube — is paying for it.
Google has agreed to pay between $150 million and $200 million to settle an FTC investigation involving an alleged COPPA violation by YouTube, Politico reported on Aug. 30. The FTC voted 3 to 2 to approve the settlement, which is the latest in a string of crackdowns on privacy violations. (In February, the FTC fined TikTok $5.7 million, according to CNN. And Facebook paid a whopping $5 billion in July.)
“They should levy a fine which both levels the playing field, and serves as a deterrent to future COPPA violations. This fine would do neither,” Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, which led a coalition of advocacy groups, said in a statement, according to Politico.
In recent years, the FTC has been taking a different approach to how it enforces COPPA, the Washington Post reported. Websites, video games, and other services that aren't specifically marketed to kids, but that still attract a significant number of young users, seem to be catching the FTC's eye. And it's not hard to see why: Some of YouTube's most popular channels, like Cocomelon - Nursery Rhymes, are clearly meant for young viewers.
It's worth noting that in order to set up an account on YouTube, users must technically agree to Google's terms of service — confirming that they're 13 or older. This is what allows the company to track users' activities, browsing, and more. YouTube also says it deletes accounts if it determines the user is younger than 13. Still, YouTube announced earlier in August — likely because of the FTC investigation — that it would stop allowing targeted ads in videos that children are more likely to watch. So that's definitely a step in the right direction.
One thing's for sure: Companies like YouTube and Facebook must be held to a higher standard when it comes to kids' privacy. And in the grand scheme of things, this $200 million slap on the wrist from the FTC isn't sending a strong enough message.