
Apple’s new measures to protect children online on hold 

Published on 04 October 2021
Updated on 05 April 2024

The spread of child sexual abuse material (CSAM) has plagued the internet for decades. While new material continues to appear, existing photos and videos keep resurfacing, mostly on the dark web.

Scanning images

Apple’s recently announced measures for scanning images for CSAM were meant to detect CSAM in iCloud Photos and to build safety features into the Messages app (read our explainer of the tools). Instead, the measures have been ‘temporarily’ shelved, mainly because they involved scanning images on users’ devices.

Apple posted an update on 3 September. It said: ‘Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.’

The feedback the company refers to is the criticism directed at Apple immediately after the measures were announced.

Privacy concerns

The 90+ policy organisations that objected to Apple’s plan warned of at least two main issues. The first is that Apple’s ability to scan iCloud Photos is a privacy breach in itself and could ‘bypass end-to-end encryption, thus introducing a backdoor that undermines privacy for all Apple users’. The second is that governments could strong-arm Apple into using the tool for their own undemocratic purposes, such as identifying other types of content on people’s phones. Both are very legitimate concerns.

Apple’s response explains how the tool was designed with privacy in mind: it does not provide information to the company about any photos other than those that match known CSAM images. Yet, the response did not manage to assuage the concerns. A ‘backdoor is still a backdoor’, privacy group EFF explained. In simple terms, ‘Apple is planning to build a backdoor into its data storage system and its messaging system… At the end of the day, even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.’
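To make the idea of ‘matching only against known images’ a little more concrete, here is a minimal, hypothetical sketch of hash-based matching in Python. It is deliberately simplified and is not Apple’s actual design: the real system relies on a perceptual hash (NeuralHash) and cryptographic techniques such as private set intersection, whereas the blocklist, file paths, and function below are purely illustrative.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-image digests. In reality this would be a
# database of perceptual hashes distributed to the device, not plain SHA-256
# digests of file bytes.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_matches(photo_dir: str) -> list[Path]:
    """Return only the files whose digest appears in the blocklist.

    Files that do not match contribute nothing to the result, mirroring the
    claim that the system reveals no information about other photos.
    """
    matches = []
    for path in Path(photo_dir).glob("*.jpg"):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in KNOWN_HASHES:
            matches.append(path)
    return matches

if __name__ == "__main__":
    print(flag_matches("./photos"))
```

The point of the sketch is the narrow output: non-matching photos produce no information at all, which is the property Apple points to, while critics argue that any such scanning capability built into the device is still a backdoor.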

What will it cost?

The debate on encryption will not be an easy one. Apple has since promised to revisit the measures and make improvements. If the company manages to do so without creating a backdoor, it will be a win for everyone, including users and human rights defenders. Yet, it’s unlikely that the measures will be as effective without weakening encryption in some form.

Ultimately, there’s a price to pay to ensure that children are protected. As things stand, that protection would come at a high price for overall user privacy, and even revised measures will likely carry some cost. So the question is: Is it a price worth paying? Are we ready for a trade-off in order to protect children?

Who decides?

The people who should decide are the victims of CSAM themselves. Yet, since the victims are minors, they rely on adults to decide for them. Adults who have a responsibility to protect them. Adults who need to seriously consider putting children’s best interests before their own, if it comes down to that.

So, put yourself in the shoes of any kid who’s fallen victim to sexual abuse: Every time those photos re-emerge, you’re reliving the trauma again. And again, and again. So I ask again: Are we ready for a trade-off in order to protect children?

Based on analysis published in September’s Digital Watch newsletter. For more digital policy analysis, subscribe to our weekly and monthly publications.
