The security researcher Jeremiah Fowler discovered two folders of medical records, in the possession of artificial intelligence company Cense AI, available for anyone to access on the Internet. The data was labeled as “staging data” and is believed to have been temporarily hosted online before being loaded into the company’s management system or an AI bot. The medical records are quite detailed and include names, insurance records, medical diagnosis notes, and payment records. The data appears to have been sourced from insurance companies and relates to car accident claims and referrals for neck and spine injuries.
Sensitive insurance claims processing data, which appears to describe the data in question, is regulated under HIPAA, GLBA, and various state security and privacy mandates in the US. Yet this data interchange clearly lacked the data security controls needed to meet such rules. To receive such information, organizations must at least operate under a HIPAA Business Associate Agreement (BAA) with the data provider. The BAA outlines mandatory data security controls, including data de-identification, encryption, and auditing. While the benefits of third-party AI services are clear, to avoid breaches like this, the data owner as well as the AI service should also consider protecting the data set before sharing and use, for example with modern data-centric tokenization. This technology balances insight and utility against exposure risk, enabling data to be analyzed and used even in low-trust IT environments. In this case, there is likely to be a significant regulatory response cost that could have been avoided with simple, very low-cost data-security investments that pale in comparison to the cost of remediation.
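To illustrate the idea, here is a minimal sketch of data-centric tokenization in Python. The field names, the `Tokenizer` class, and the in-memory "vault" are all hypothetical, invented for illustration; real tokenization products keep the token vault in a hardened, access-controlled service rather than in process memory:

```python
import secrets

class Tokenizer:
    """Hypothetical tokenizer: swaps sensitive values for random tokens.

    The vault mapping tokens back to originals stays in a trusted zone;
    only the tokenized records travel to low-trust systems (e.g. a
    staging server or an AI bot).
    """

    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, non-reversible token
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only callable inside the trusted zone that holds the vault.
        return self._vault[token]


def protect_record(record: dict, sensitive_fields: set, tokenizer: Tokenizer) -> dict:
    # Replace sensitive values with tokens before sharing the record;
    # non-sensitive fields pass through so the data stays usable.
    return {
        key: tokenizer.tokenize(value) if key in sensitive_fields else value
        for key, value in record.items()
    }


tokenizer = Tokenizer()
claim = {"name": "Jane Doe", "insurer": "Acme Health", "injury": "cervical strain"}
safe = protect_record(claim, {"name"}, tokenizer)
# `safe` can now be staged or processed with no patient name exposed,
# while the trusted side can still call detokenize() when needed.
```

Had the staging data been protected this way, exposure of the folders would have revealed opaque tokens rather than patient identities.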
Cybercriminals could use the information exposed in this breach for health insurance fraud and phishing. Criminals could use the information to get treatment or prescriptions in someone else’s name. Affected patients should also be on the lookout for scammers posing as their insurance company or a related organization.
Sadly, incidents like this one, and many others, are a sobering reminder that our personal medical information is always at risk of being exposed. Medical information is among the most valuable information for bad actors, and in these days of COVID-19, that has never been more true.
Companies need to learn to secure data, even if it is only being stored temporarily before being moved to a secure system. Any data that is not properly secured is up for grabs.
I feel like I am forced to say this on a daily basis in recent months, but here goes: consumers need to be on their toes, staying alert for any bad guys who may have gotten their hands on this data and are using it in an attempt to glean more information or perpetrate monetary fraud by posing as a member of a billing firm or, even worse, a member of medical staff.