Apple has started scanning photos uploaded from iPhones to check for child sexual abuse images, as tech companies come under pressure to do more to tackle the crime.
Jane Horvath, Apple’s chief privacy officer, revealed at CES 2020 that the company automatically screens images backed up to its online storage service, iCloud, to check whether they contain the illegal photos.