Comments On News: Apple To Scan Users' iCloud Photos To Identify Child Abuse

By ISBuzz Team, Writer, Information Security Buzz | Jan 09, 2020 06:23 am PST

Apple has started scanning photos uploaded from iPhones to check for child sexual abuse images, as tech companies come under pressure to do more to tackle the crime.

Jane Horvath, Apple’s chief privacy officer, revealed at CES 2020 that the company automatically screens images backed up to the company’s online storage service, iCloud, to check whether they contain the illegal photos.

Paul Bischoff, Privacy Advocate | January 9, 2020 2:26 pm

Apple is being very opaque about how it goes about scanning users' photos without breaking encryption. Apple says all data uploaded to iCloud is encrypted both in transit and on the cloud storage server. iPhones themselves are encrypted as well. That means any photos uploaded to iCloud should also be encrypted and in turn cannot be viewed. So at what point is Apple using the image-matching algorithm to scan photos? Does it happen on users' devices before photos are encrypted, or are the photos unencrypted at some point when being sent to the server?

Here\’s what I think is happening: Apple has access to a law enforcement database of child abuse photos. Apple hashes or encrypts those photos with each users\’ security key (password) to create unique signatures. If the signatures of any encrypted photos uploaded from an iPhone match the the signatures from the database, then the photo is flagged and presumably reported to authorities. This allows Apple to match photos uploaded to the cloud against the law enforcement database without ever breaking encryption or actually viewing the photos.

This method is notably distinct from Apple simply decrypting a user's phone or iCloud at the behest of law enforcement, or giving law enforcement a backdoor to decrypt users' files. It is more targeted and doesn't allow for much third-party abuse. It also targets child pornography, a category of speech not protected by the First Amendment.

Although computationally intensive, the tactic is actually quite simple. However, there is one drawback: if a child abuse photo is cropped or edited, converted to another image file type, or compressed, then the encrypted signatures won't match up. The upload has to be an exact, byte-for-byte copy of the image from the law enforcement database.
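That fragility is easy to demonstrate. In the short sketch below (placeholder bytes, standard-library hashing only), changing even a single byte of an image, as any crop, re-encode, or recompression would do at scale, produces a completely unrelated digest, so an exact-match signature scheme no longer detects it.

```python
import hashlib

# Illustrative only: one changed byte simulates any edit, format
# conversion, or recompression of the original image.
original = b"<image bytes>"
modified = b"<image bytes>" + b"\x00"

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(modified).hexdigest())
# The two digests share no relationship; only byte-identical copies match.
```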

