In a video posted by the Wall Street Journal, Apple’s vice president of software engineering, Craig Federighi, offers more context regarding his company’s stance on the new photo scanning feature that looks at iCloud photos for instances of child sexual abuse.
The first point Federighi stresses is that Apple isn’t technically scanning the photos on your iPhone or iPad. Instead, it’s scanning photos stored in iCloud. This means that if you opt out of using iCloud, your photos are not scanned.
In the video, Federighi explains that the scanning technique is designed to protect user privacy while also stopping sexual predators from using iCloud photo storage.
He also mentioned that when images are uploaded to the cloud, they’re not actually analyzed as photos; instead, each one is converted into a number. If that number matches an entry in a database derived from known child sexual abuse images, the picture is flagged as it gets uploaded. Apple then runs several more software verification passes before human reviewers sort through the remaining flagged photos to confirm them.
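The flow Federighi describes — convert each upload to a number, compare it against a database of known-image numbers, and only involve human reviewers once a threshold of matches is crossed — can be sketched roughly as follows. This is an illustrative simplification, not Apple’s implementation: the function names, the threshold value, and the use of a cryptographic hash in place of Apple’s NeuralHash perceptual hash are all assumptions made for the example.

```python
import hashlib

# Hypothetical database of numbers derived from known abuse images
# (illustrative placeholder values only).
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Illustrative threshold: human review happens only after this many matches.
MATCH_THRESHOLD = 2

def image_hash(image_bytes: bytes) -> str:
    # Stand-in: a cryptographic hash. The real system uses a perceptual
    # hash so visually identical images still match after re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(uploads: list[bytes]) -> int:
    # Each photo is checked against the database as it is uploaded.
    return sum(1 for img in uploads if image_hash(img) in KNOWN_HASHES)

def needs_human_review(uploads: list[bytes]) -> bool:
    # Only accounts crossing the threshold reach human reviewers.
    return count_matches(uploads) >= MATCH_THRESHOLD
```

The threshold step mirrors the article’s point that several verification layers sit between a single match and any human ever looking at an account.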
It should also be noted that Google, Microsoft and Facebook scan photos for child abuse imagery once the photos are uploaded to their servers. The key distinction is that Apple begins this process on your iPhone.
While critics worry that this gives Apple an opening to erode user privacy, Federighi argues that performing this first step on your phone is actually crucial to keeping Apple out of your business if you have nothing to hide.
Towards the end of the video, Federighi mentions that Apple has also added several layers to the photo scanning process so that if things do go wrong, independent third parties can audit it.
It’s important to note that this feature is currently only available in the U.S. and it’s unclear if it will make its way to Canada.