Apple introduces a detection mechanism to flag child sexual abuse material in the United States

A child looks at her phone in India.

AT A GLANCE

  • Starting with the US, Apple will soon be rolling out a detection mechanism for images of child sexual abuse.
  • The new system uses cryptography to detect known child sexual abuse material and report it to the National Center for Missing and Exploited Children.
  • Though the tension between children’s safety and user privacy is ongoing, this new identification system will undoubtedly help flag – and remove – known child sexual abuse material.

Starting with the US, Apple will soon be rolling out a detection mechanism for images of child sexual abuse, a landmark move to protect children’s safety online. In 2020, reports of online enticement increased by nearly 98 per cent – illustrating the need for technology companies to do more to prevent and end child sexual abuse online. According to Apple, this feature is coming in an update later this year to accounts set up as families in iCloud for iOS 15, iPadOS 15, and macOS Monterey.

“If we truly want to end online child sexual abuse, this is the type of engagement from the technology sector that we need,” said Marija Manojlovic, the Director of the End Violence Partnership’s Safe Online initiative. “In the future, we would also welcome a higher degree of engagement between the technology sector and organizations who are working in this field, especially those that are supporting vulnerable children and victims of abuse.”

The new technology works by scanning images as they are uploaded to iCloud Photos. In the process, an automated detection system flags images that match material previously confirmed to be child sexual abuse by the National Center for Missing and Exploited Children (NCMEC).
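To make the matching step concrete, here is a minimal Swift sketch of the idea: each photo is reduced to a fingerprint (a perceptual hash) and checked against a set of fingerprints of known material. All names and values below are hypothetical, not Apple’s actual API; Apple’s published design performs the match on-device with a system it calls NeuralHash and uses private set intersection so the server learns nothing about non-matching photos, details this sketch deliberately collapses into a plain set lookup.

```swift
// Illustration only: every name and value here is hypothetical.
// Apple's published design uses an on-device perceptual hash ("NeuralHash")
// plus private set intersection; this sketch simplifies that to a set lookup.

/// Stand-in for a perceptual image hash (assumed type, not Apple's API).
typealias ImageHash = String

/// Hashes of images previously confirmed as abuse material by NCMEC
/// (placeholder values; the real database is distributed in blinded form).
let knownAbuseHashes: Set<ImageHash> = ["a1b2c3", "d4e5f6"]

/// Returns true when a photo's hash matches the known-material database.
func matchesKnownMaterial(_ hash: ImageHash) -> Bool {
    knownAbuseHashes.contains(hash)
}

// Example: check a photo's hash before it is uploaded to iCloud Photos.
let photoHash: ImageHash = "a1b2c3"
if matchesKnownMaterial(photoHash) {
    print("Match: flag this image for the review process described below.")
}
```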

A match triggers a multi-step process in which the flagged content is reviewed by a human. If the material is confirmed to feature child sexual abuse, the user’s account will be disabled, NCMEC will be notified, and the content will be removed.
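The review flow can be sketched in the same hedged way. In Apple’s published design, matched images generate encrypted “safety vouchers”, and human review becomes possible only after an account crosses a threshold number of matches; the threshold value and the account actions below are assumptions for illustration, not figures from this article.

```swift
// Hedged sketch of the multi-step review flow described above. The threshold
// is an assumed value; Apple's design keeps vouchers unreadable until enough
// matches accumulate (threshold secret sharing).

let reviewThreshold = 30  // assumption for illustration, not a confirmed figure

struct Account {
    var matchedCount = 0
    var disabled = false
}

/// Records one matched image and escalates once the threshold is crossed.
func recordMatch(for account: inout Account) {
    account.matchedCount += 1
    guard account.matchedCount >= reviewThreshold else { return }

    // Step 1: a human reviewer confirms the flagged material.
    let confirmedByReviewer = true  // placeholder for manual review

    if confirmedByReviewer {
        account.disabled = true  // Step 2: disable the account
        print("Step 3: notify NCMEC. Step 4: remove the content.")
    }
}

// Example: simulate an account accumulating matches until review is triggered.
var account = Account()
for _ in 1...reviewThreshold { recordMatch(for: &account) }
```

The point of a threshold in a design like this is that no single match, which could be a false positive, is ever enough on its own to expose an account to human review.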

“At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe,” Apple said in a recently released statement. “We want to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of child sexual abuse material.”

Additional safety measures include communication tools that help parents support their children’s digital safety, warnings about sensitive content through iMessage, and updates to Siri and Search that help parents and children access expanded information about online sexual abuse.

Though the tension between children’s safety and user privacy is ongoing, this new identification system will help flag – and remove – known child sexual abuse material, pointing to a future where the privacy of users’ content is preserved while children are also protected from online sexual abuse.

Through Together to #ENDviolence, we are calling for technology companies, governments, organisations and institutions to do more to protect children online. The internet holds great potential for children to connect, explore, and learn in creative ways – but we all have a duty to reimagine what a safe internet for children looks like. Together, we can proactively create policies, technologies and spaces that put children’s health and safety first.

Learn more about our work to keep children Safe Online.

Photo: UNICEF/UN0491424/Vishwanathan