
Here’s What Parents Need To Know About Apple’s New Child Abuse Detection Tool

Apple will soon scan iPhones and iCloud for images of child abuse.

by Morgan Brinlee

In an effort to protect young children, Apple has announced plans to roll out a series of new features, including one that will scan photos stored in iCloud to detect known images of child sexual abuse. Developed with the help of child safety experts, the tools center on Messages, iCloud Photos, and Siri and Search, with the goal of helping children and parents better navigate online communication and internet browsing as well as limiting the spread of what’s called Child Sexual Abuse Material (CSAM).

“We want to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of CSAM,” Apple said in a press release when unveiling its new set of tools late last week. “This program is ambitious, and protecting children is an important responsibility. These efforts will evolve and expand over time.”

As the new features begin to roll out, here’s what parents need to know about Apple’s new child abuse detection tool and other upcoming safety features.

Boosting Safety For Kids & Parents In Messages

The first of Apple’s new features adds a suite of tools to the Messages app that will warn both children and their parents if a child receives or sends a sexually explicit image through Messages. According to Apple, images of this type will be blurred, and a warning will appear on the child’s phone asking whether they really want to view the image and pointing them to helpful resources. Additionally, the feature can be set up to warn parents any time a child opts to view or send an image Apple has flagged as sexually explicit.

Apple says the feature works entirely on-device, applies only to images sent or received via Messages, and won’t impact the app’s privacy guarantees. What’s more, the feature can only be enabled for child and parent accounts set up as families in iCloud with Family Sharing, and parental notifications can only be turned on for accounts belonging to children age 12 or younger. According to the Associated Press, children 13 and older can opt to unenroll themselves from the parental notification feature.

While Apple has yet to announce a specific roll-out date, the feature is expected to become available later this year for accounts set up as families in iCloud.

Child Sexual Abuse Material (CSAM) Detection

To limit the spread of CSAM, Apple will introduce new cryptography features to iOS and iPadOS that allow the tech giant to detect known CSAM images among photos stored in iCloud Photos. To detect known images, however, Apple isn’t exactly scanning the photos themselves. Instead, it’s looking for matches against unreadable hashes, strings of numbers that the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations have confirmed correspond to known CSAM. When images are flagged, Apple will conduct a human review, meaning an actual person will confirm the images are a match, and then file a report with NCMEC.

But Apple won’t be combing through images once they’re sitting in iCloud. Instead, the comparison happens on-device: photos a user has selected to upload to iCloud are checked against a database of known CSAM image hashes before they leave the phone. In an FAQ, Apple emphasized that it will not be scanning all the photos a customer has stored on their iPhone or iPad, just those users decide to upload to iCloud. Apple has also said its CSAM detection feature will not work on users’ private iPhone photo libraries.
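Apple hasn’t published the code that does this matching, but conceptually the on-device check amounts to comparing a fingerprint of each photo queued for upload against a list of known fingerprints. The short Python sketch below is purely illustrative: it uses an ordinary SHA-256 digest and a made-up hash list, whereas Apple’s actual system relies on a perceptual hash it calls NeuralHash plus additional cryptographic protections, so every name and value here should be read as hypothetical.

import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of known-CSAM hashes, which in
# Apple's description is supplied by NCMEC and stored on the device in an
# unreadable form. The value below is made up for illustration.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(path: Path) -> str:
    # Illustration only: this uses a plain SHA-256 digest. Apple's real
    # system uses a perceptual hash (NeuralHash) so that resized or
    # re-encoded copies of the same image still produce a match, which an
    # ordinary file digest would not.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_csam(upload_queue: list[Path]) -> list[Path]:
    # Compare only the photos a user has chosen to upload to iCloud Photos
    # against the known-hash list; everything else is never inspected. In
    # Apple's described design, matches are additionally wrapped in
    # cryptographic "safety vouchers," and nothing reaches a human reviewer
    # until an account crosses a threshold number of matches.
    return [photo for photo in upload_queue
            if image_fingerprint(photo) in KNOWN_HASHES]

The practical upshot is the one Apple describes: a photo is only ever examined further if its fingerprint matches a known CSAM entry, so ordinary family photos never enter the review pipeline.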

Apple has yet to announce when it will roll out its CSAM detection tool.

Will CSAM Detection Flag Innocent Photos Of My Own Kid?

According to Apple, no. First, the CSAM detection tool is designed only to flag images that are already in NCMEC’s database. Second, the human review step serves to weed out any potential “false flags,” which Apple has said should be extremely rare.

“The system is only designed to report photos that are known CSAM in iCloud Photos,” the tech giant said in an FAQ about the feature. “The system is designed to be very accurate, and the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year.”

That means any innocent photos you snap of your own kids, whether they’re swimming or in the bath, aren’t likely to set off Apple’s CSAM detection tool.

Expanding Tools To Help Kids Stay Safe Online With Siri And Search

The final feature announced by Apple aims to help children and parents navigate the internet more safely. Users who search or ask Siri for information about reporting CSAM or child exploitation will be pointed to resources that help them file a report. Siri and Search will also intervene when a user searches for queries or keywords related to CSAM, displaying warnings that explain why interest in that topic is harmful and problematic, along with resources for getting help.

Users with iOS 15, iPadOS 15, watchOS 8, and macOS Monterey can expect to see these new Siri and Search features roll out later this year.