Apple to Announce Client-Side Photo Hashing System to Detect Child Abuse Images in Users’ Photos Libraries

By ISBuzz Team
Writer, Information Security Buzz | Aug 09, 2021 02:24 am PST

Apple is reportedly set to announce new photo identification features that will use hashing algorithms to match the content of photos in users' photo libraries with known child abuse materials, such as child pornography. In the name of privacy, Apple's system will run on the client, that is, on the user's device: the iPhone would download a set of fingerprints representing illegal content and then check each photo in the user's camera roll against that list. Presumably, any matches would then be reported for human review.
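As a rough sketch of that flow, the snippet below models the downloaded fingerprints as a set of hex-encoded SHA-256 digests and checks each on-device photo against them. Everything here is illustrative: Apple has not published an interface for this, the names are hypothetical, and a production system would use a perceptual hash that tolerates resizing and re-encoding rather than an exact cryptographic hash.

```swift
import CryptoKit
import Foundation

// Illustrative sketch of the client-side flow described above. All names
// are hypothetical; Apple has not published this interface.

/// Fingerprints of known illegal images, downloaded to the device,
/// modeled here as hex-encoded SHA-256 digests.
typealias FingerprintDatabase = Set<String>

/// Hex-encoded SHA-256 digest of a photo's raw bytes.
func fingerprint(of photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Check every photo on the device against the downloaded fingerprint
/// list and return the indices of any matches for human review.
func scanLibrary(_ photos: [Data], against database: FingerprintDatabase) -> [Int] {
    photos.enumerated()
        .filter { database.contains(fingerprint(of: $0.element)) }
        .map { $0.offset }   // only match indices leave the device, never photos
}
```

Note that in this design the photos themselves never leave the device; only the fact of a match would be surfaced for review.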

2 Expert Comments
Chris Hauk, Consumer Privacy Champion
August 9, 2021 10:34 am

While I am all for clamping down on child abuse and child pornography, I do have privacy concerns about the use of this technology. A machine learning system such as this could crank out false positives, creating unwarranted issues for innocent citizens. The technology could also be abused if placed in government hands and used to detect images containing other types of content, such as photos taken at demonstrations and other gatherings. This could lead to governments clamping down on users' freedom of expression and suppressing "unapproved" opinions and activism.

Paul Bischoff, Privacy Advocate
August 9, 2021 10:31 am

Apple hinted that it was scanning iCloud images for child abuse content some months ago, so the announcement that they're now scanning users' phones doesn't come as a surprise. Although there are privacy implications, I think this is an approach that balances individual privacy and child safety. The important thing is that this scanning technology is strictly limited in scope to protecting children and not used to scan users' phones for other photos. If it is not, the door opens to broader searches: if authorities are searching for someone who posted a specific photo on social media, for example, Apple could conceivably scan all iPhone users' photos for that specific image.
The hashing system allows Apple to scan a user's device for any images matching those in a database of known child abuse materials. It can do this without actually viewing or storing the user's photos, which maintains their privacy except when a violating photo is found on the device. The hashing process takes a photo and runs it through a one-way function to create a unique string of letters and digits, called a hash. Apple has hashed all the photos in the law enforcement child abuse database. On users' iPhones and iPads, that same hashing process is applied to photos stored on the device. If any of the resulting hashes match, then Apple knows the device contains child pornography.
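The snippet below demonstrates the two properties this comment relies on, with SHA-256 standing in for whatever hash function Apple actually uses: the digest is deterministic (identical bytes always match) and one-way (the photo cannot be reconstructed from the digest), yet flipping even a single bit produces an entirely different digest.

```swift
import CryptoKit
import Foundation

// Demonstration of the matching property described above, with SHA-256
// as a stand-in for the real hash function.

/// Hex-encoded SHA-256 digest of arbitrary bytes.
func hexDigest(_ data: Data) -> String {
    SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

let original = Data("example photo bytes".utf8)
var altered = original
altered[altered.startIndex] ^= 1   // flip one bit of the "photo"

print(hexDigest(original) == hexDigest(original)) // true:  deterministic match
print(hexDigest(original) == hexDigest(altered))  // false: any change breaks the match
```

That exactness cuts both ways: a cryptographic hash makes coincidental false positives essentially impossible, but a re-encoded or cropped copy would evade an exact match, which is why deployed systems lean on perceptual hashing and accept some false-positive risk in exchange.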

