Apple has started scanning photos uploaded from iPhones to check for child sexual abuse images, as tech companies come under pressure to do more to tackle the crime.
Jane Horvath, Apple’s chief privacy officer, revealed at CES 2020 that the company automatically screens images backed up to its online storage service, iCloud, to check whether they contain such illegal material.
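Apple has not published the technical details of this screening. Industry systems such as Microsoft’s PhotoDNA typically work by comparing a hash of each uploaded image against a database of hashes of known abuse material supplied by clearinghouses like NCMEC, rather than by inspecting image content directly. Below is a minimal sketch of that general hash-matching pattern; every name and value in it is illustrative, and a real system would use a perceptual hash that survives resizing and re-encoding, not an exact cryptographic hash.

```python
import hashlib

# Hypothetical database of hashes of known abuse images. In practice such
# lists come from clearinghouses like NCMEC; the value below is a placeholder.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_material(image_bytes: bytes) -> bool:
    """Hash the uploaded image and look it up in the known-hash set.

    SHA-256 is a simplified stand-in here: real screening systems use
    perceptual hashes (e.g. PhotoDNA) that tolerate resizing and
    re-encoding, which an exact cryptographic hash does not.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# Example: screen a photo as it is backed up.
if matches_known_material(b"...image bytes..."):
    print("flag for review")
```

The relevant privacy point, which the expert comment below picks up, is that matching uploads against known hashes is a narrower intrusion than decrypting a user’s account wholesale.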
Expert Comments
Paul Bischoff, Privacy Advocate, commented:

"This method is notably distinct from Apple simply decrypting a user's phone or iCloud at the behest of law enforcement...."