Apple has apparently been listening to its critics. On Friday, it announced it would delay a controversial technology that would scan users’ iPhone photos for known child sexual abuse material (CSAM) before they were uploaded to iCloud.
In its statement sent to various media outlets, a spokesperson said: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple had laid out plans in which it would use code to scan image files and check them against databases of known CSAM images. It would do this by having code on an iPhone look not at the photos themselves, but at hashes, compact digital fingerprints of an image. If hashes on the phone matched those of known CSAM images, Apple staff would then check the photo to see whether it really was child exploitation material, before deciding whether or not to pass the information on to police investigators.
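The matching logic described above can be sketched in a few lines. This is an illustrative toy, not Apple’s actual system: Apple’s NeuralHash is a perceptual hash designed to survive resizing and recompression, whereas the sketch below uses an ordinary SHA-256 for simplicity; the database contents, function names, and the stand-in "known" images are all hypothetical.

```python
import hashlib

# Hypothetical database of known-image hashes. In the real system this
# would hold perceptual (NeuralHash) values supplied by child-safety
# organisations; here we use SHA-256 of placeholder byte strings.
KNOWN_HASHES = {
    hashlib.sha256(data).hexdigest()
    for data in (b"known-image-a", b"known-image-b")
}

# Apple said it would only ever manually review an account once more
# than 30 matches were found on a device.
MATCH_THRESHOLD = 30

def count_matches(photos: list[bytes]) -> int:
    """Count how many photos hash to a value in the known database."""
    return sum(
        hashlib.sha256(photo).hexdigest() in KNOWN_HASHES
        for photo in photos
    )

def should_flag_for_review(photos: list[bytes]) -> bool:
    """Flag for human review only above the match threshold."""
    return count_matches(photos) > MATCH_THRESHOLD
```

The key design point is that the device never reveals which photos matched below the threshold; only once the count exceeds it would human review occur.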
But Apple faced a significant backlash over the rollout of the feature, despite its good intentions of catching those sharing child abuse images and videos on its phones and servers. The main cause of anxiety was that Apple was doing the scanning on the device without the user’s permission, and that a government could demand that Apple start scanning for other, non-CSAM content. China, for instance, could pressure Apple to start looking for images that expressed anti-government sentiment, critics feared.
Apple sought to allay fears by releasing a number of FAQs, in which it said it would never bow to such demands, noting that the scanning, though it occurred on iPhones rather than in iCloud, would only cover photos that were set to be uploaded to iCloud. Users could stop Apple from scanning their images simply by turning off iCloud Photos.
It then went further, saying it would only ever manually look at photos if more than 30 matches were found on a device.
And now, a month after it was first announced, the feature is being delayed. It was due to roll out with iOS 15; it’s now unclear when it will be pushed out to users’ iPhones.