Clearview AI Leaves Internal Data Exposed, Including 70,000 Videos Of A Residential Building – Expert Insight

By ISBuzz Team
Writer, Information Security Buzz | Apr 20, 2020 06:13 am PST

Clearview AI, the facial recognition startup, left a misconfigured server exposed to the internet, giving anyone access to the company's internal files, apps, and source code, enough to run the apps from scratch. In addition, 70,000 videos from a residential building's security camera were left in one of Clearview's cloud storage buckets, depicting residents entering and leaving the building.

Chris DeRamus, VP of Technology, Cloud Security Practice
April 26, 2020 7:25 pm

Clearview AI has gained a lot of attention not only from critics who are concerned about the privacy implications of its facial recognition technology, but also from hackers. Regardless of your personal feelings about the company, Clearview’s second security lapse in just two months demonstrates how common misconfigurations are when companies lack proper cloud security strategies, and how easily threat actors can exploit these vulnerabilities. A misconfigured server opened a window for cybercriminals to steal Clearview’s intellectual property, including its source code and the credentials used to access the cloud storage buckets that held various versions of its app. The hackers could potentially sell the exposed information to the company’s competitors, leverage the information to find and exploit other weaknesses in its app, or commit any number of devious acts.

From 2018 to 2019, the number of records exposed by misconfigurations rose by 80%. The challenge is that many organizations struggle to adopt and enforce cloud security best practices consistently, and only 100% consistency protects against a devastating data leak or breach. The risk of a breach is far too high to rely solely on human intervention. Security tools that detect misconfigurations, alert security teams, and remediate the issues automatically can safeguard corporate data from falling into the wrong hands. Additionally, this particular misconfiguration incident highlights the need for enterprises to adopt least-privileged access across cloud environments, including a robust approach to identity and access management (IAM). In these environments, everything has an identity – users, applications, services, and systems. Organizations must implement multi-factor authentication (MFA) for all users, securely manage service accounts and their corresponding keys, enforce least-privileged access, and enforce best practices for the use of audit logs and cloud logging roles.
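As one illustration of the automated checks described above, a scanner can walk each storage bucket's access control list and flag any grant made to the public. The following is a minimal sketch in Python, assuming S3-style ACL documents; the `public_grants` helper and the sample ACL are hypothetical, though the group URIs are the ones AWS documents for "everyone" and "any authenticated AWS user":

```python
# Grantee URIs that make an S3-style ACL grant effectively public.
PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}


def public_grants(acl: dict) -> list:
    """Return the grants in an ACL document that expose the bucket publicly."""
    return [
        grant
        for grant in acl.get("Grants", [])
        if grant.get("Grantee", {}).get("URI") in PUBLIC_GRANTEES
    ]


# Hypothetical ACL of the kind returned by an S3 GetBucketAcl call:
# one public READ grant, one private owner grant.
sample_acl = {
    "Grants": [
        {
            "Grantee": {
                "Type": "Group",
                "URI": "http://acs.amazonaws.com/groups/global/AllUsers",
            },
            "Permission": "READ",
        },
        {
            "Grantee": {"Type": "CanonicalUser", "ID": "owner-canonical-id"},
            "Permission": "FULL_CONTROL",
        },
    ]
}

risky = public_grants(sample_acl)
```

A real tool would fetch ACLs for every bucket on a schedule and page an on-call engineer (or auto-remediate) whenever `public_grants` returns a non-empty list.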

James Carder, Chief Information Security Officer & Vice President
April 20, 2020 2:14 pm

Clearview AI’s cloud data buckets were left vulnerable, and unfortunately, this oversight left its facial recognition apps and private data open on the internet for anyone to access. Additionally, thousands of videos from a residential building were left open on the server, a violation of privacy and a potential danger to those on camera.

For companies like Clearview AI that store and manage facial recognition software and data, it is crucial to implement the necessary authentication and authorisation, security monitoring, detection, intelligence and response capabilities. Real-time monitoring and clear visibility are essential to mitigating threats like this one and could have easily prevented this security lapse. This unfortunate instance is another case of bad IT practice: lax security controls without monitoring and alerting. Furthermore, the lack of two-factor authentication allowed anyone to register and gain access to the database, circumventing password protection altogether. Overall, the protections Clearview had in place do not match the critical data it is responsible for protecting.
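The missing second factor described above can be illustrated with a minimal time-based one-time password (TOTP, RFC 6238) verifier, the mechanism behind most authenticator apps. This is a generic sketch of the standard algorithm, not a description of Clearview's stack:

```python
import hashlib
import hmac
import struct
import time


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the big-endian counter, dynamically truncated."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # low nibble of last byte picks the 4-byte window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP computed over the current 30-second time window."""
    if for_time is None:
        for_time = int(time.time())
    return hotp(secret, int(for_time) // step, digits)
```

With a shared secret enrolled per user, a registration or login endpoint would require a fresh `totp(secret)` code alongside the password, so a leaked or guessed password alone no longer grants access. The code above reproduces the RFC 6238 test vector: for the ASCII secret `12345678901234567890` at Unix time 59, the expected 6-digit code is 287082.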


