UK And Australia Open Joint Data-Privacy Investigation Into Clearview AI

As reported by CNET, the governments of the UK and Australia are investigating a facial recognition company that grabbed billions of people’s pictures from across the internet for use in its database. The inquiry will look at Clearview AI and whether its scraping and handling of data violated the UK Data Protection Act and the Australian Privacy Act.

The joint investigation comes three days after the Office of the Privacy Commissioner of Canada said that Clearview AI would leave Canada in response to a separate investigation by that agency. Canada's privacy commissioner said it is still investigating how Canadian police used the facial recognition tool, and how Clearview AI would delete data belonging to Canadians.

Expert Comments

July 10, 2020
Tim Mackey
Principal Security Strategist, Synopsys CyRC (Cybersecurity Research Center)
Synopsys
It really hasn’t been a good few months for facial recognition companies. Starting with the revelation of a data breach at Clearview AI, jurisdictions around the world have put in place moratoriums on the use of facial recognition technologies by law enforcement.

Facial recognition is a form of artificial intelligence, meaning that building a business around it requires a source of training data. Most professional photographers know that if you take a picture of someone’s face, you are going to need that person’s permission if you intend to publish or otherwise use the photo. That process is known as obtaining a release, and the signed document effectively states how the image may be used. Image websites will of course sell stock photos containing faces, but there too a license is associated with each photo.

Training any artificial intelligence system requires large quantities of data, so in the context of facial recognition that means large quantities of faces. If you have a limited set of photos, then any conclusions from the AI are suspect due to the bias present in that training set. Clearview AI is asserted to have over 3 billion images in its database. That would imply a very large set of training and reference data – something that could in theory increase accuracy and limit intrinsic bias. Increased accuracy and limited bias would then translate into more accurate identification, which would be a prime selling point for law enforcement agencies.

Obtaining the legal rights to such a large dataset would be expensive, and it’s asserted that Clearview AI bypassed image licenses and simply scraped the data from websites. This process would reduce the cost of image acquisition, but could also have allowed the Clearview AI team to identify weaknesses in social media applications. It will be interesting to see how Clearview AI responds to the ICO’s investigation and what is discovered.
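The article doesn’t describe Clearview AI’s actual pipeline, but the first step of any image-scraping operation is harvesting image URLs from public pages. A minimal, hypothetical sketch using only Python’s standard library (all page content and URLs below are invented for illustration):

```python
from html.parser import HTMLParser


class ImageSrcCollector(HTMLParser):
    """Collects the src attribute of every <img> tag on a page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)


# Stand-in for a fetched public profile page (hypothetical markup).
page = """
<html><body>
  <img src="https://example.com/photos/alice.jpg" alt="profile">
  <img src="https://example.com/photos/bob.png">
</body></html>
"""

collector = ImageSrcCollector()
collector.feed(page)
print(collector.sources)
```

A real scraper would fetch these URLs at scale, which is exactly the step that can conflict with site terms of service and image licenses – the crux of the regulators’ inquiry.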
I predict it will eventually exit the UK and Australian markets as it has done in Canada. Of note to consumers: it’s important to understand what your rights are when you share photos on the internet or within apps. For example, that cool image-aging app could in reality be a simple way to obtain your image and profile data for use in a training set for facial recognition software.
July 10, 2020
Jake Moore
Cybersecurity Specialist
ESET
Facial recognition systems have evolved at such a rapid pace that they can now identify an individual remarkably quickly. We need to stay mindful that, when correlated with social media data, such systems can rapidly profile us and learn far more than just our movements – sometimes without explicit consent from the people in the imagery. Furthermore, facial recognition is still in an early, immature phase and needs more technological advancement before it is used in daily life.