Another huge leak of government information: 3 terabytes of unprotected data from the Oklahoma Securities Commission was uncovered by Greg Pollock, a researcher with cybersecurity firm UpGuard. It amounted to millions of files, many relating to sensitive FBI investigations, all of which were left wide open on a server with no password, accessible to anyone with an internet connection.
Expert Comments below:
Kevin Bocek, Vice President, Security Strategy & Threat Intelligence at Venafi:
“Sensitive data is often shared in vulnerable places, so Oklahoma’s potential breach of 3TB of FBI data isn’t especially shocking.
However, if we examine securities.ok.gov, it appears that the state is not using trusted machine identities, like TLS keys and certificates. Today, browsers are marking this site as ‘not secure’ because it is not using HTTPS encryption. This means that browsers do not trust the machine identities used to identify Oklahoma’s servers.
In 2015, the Obama administration required all US federal agencies to use machine identities and HTTPS. But studies have shown that 26% of public US federal government servers are still not using TLS keys and certificates or HTTPS. Unfortunately, using machine identities effectively is often so challenging that many organizations just don’t take the steps needed to keep their data and private communications safe.
Ultimately, until organizations automate the use and protection of machine identities, we’ll continue to see examples like this of machine identity failures that weaken security.”
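Bocek’s observation about securities.ok.gov can be reproduced with a quick check. The sketch below is a minimal illustration (not the commenter’s tooling; the helper names are my own): it attempts a TLS handshake against a host and reports how long the presented certificate remains valid. An untrusted or missing certificate raises an error, which is the condition browsers surface as ‘not secure.’

```python
# Minimal sketch: verify a host's TLS machine identity with Python's
# standard library. Function names here are illustrative assumptions.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after):
    """Parse the 'notAfter' field as returned by ssl.getpeercert(),
    e.g. 'Jun 1 12:00:00 2025 GMT', and return days remaining."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

def check_tls(host, port=443):
    """Handshake with the server using the default trust store.
    Raises ssl.SSLCertVerificationError if the chain is untrusted --
    the same failure browsers flag as 'not secure'."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return {"subject": cert["subject"],
            "days_left": days_until_expiry(cert["notAfter"])}

# Usage: check_tls("securities.ok.gov") either returns certificate
# details or raises, revealing whether the machine identity is trusted.
```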
Pravin Kothari, CEO at CipherCloud:
“The trend is not our friend in 2019. Cyberattacks and data breaches continue to be announced at record rates. Administrative and management errors at the Oklahoma Securities Commission exposed 3 terabytes of data on their servers going back to 2012. To make it worse, most of the data across these many millions of files was not encrypted.
The moral of the story is the same. Automation is required to manage and check configurations and administration, both on-premises and in the cloud. Even if data is accidentally exposed, it is not by definition breached if it is encrypted. Encryption must be used from the enterprise “edge.” All data available to online access should be encrypted: not only in the database, but also in transmission, in middleware, in API transit and in use. Tools like data loss prevention (DLP) can make sure sensitive data is *always* encrypted. Finally, identity and access management (IAM) technologies such as two-factor authentication (2FA) and single sign-on (SSO) further harden your infrastructure and protect access to your systems and networks.
Yes, there are sophisticated advanced threats that are sometimes impossible to stop. But most victims, like the Oklahoma Securities Commission, are falling prey to attackers exploiting very basic weaknesses in their cyber defenses.”
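Kothari’s DLP point can be made concrete with a small sketch. The pattern and function below are hypothetical illustrations, not CipherCloud functionality: a scan that flags files containing SSN-shaped strings sitting in plaintext, the kind of finding a DLP tool would escalate for encryption.

```python
# Illustrative DLP-style scan: find files holding plaintext US SSNs.
# The regex and file handling are simplistic assumptions for the sketch.
import re
from pathlib import Path

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scan_for_plaintext_ssns(root):
    """Return (path, match_count) pairs for files under `root` that
    contain SSN-shaped strings in plaintext."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        count = len(SSN.findall(text))
        if count:
            hits.append((str(path), count))
    return hits
```

A real DLP product would match many more data classes (credit cards, case numbers) and trigger encryption or quarantine rather than just reporting, but the detection step works on this principle.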
Matan Or-El, Co-Founder and CEO at Panorays:
“Data security is not necessarily always about protecting from attackers; quite often it’s about protecting against mistakes. The Oklahoma data leak is the latest in a long series of incidents in which sensitive data was exposed to the internet by mistake, where anyone could access it. By continuously monitoring an organization’s attack surface, one can learn a lot about its security and data hygiene practices. This is what is needed to detect mistakes and assess the overall cyber posture of enterprise data and third-party data protection practices.”
Sam Curry, CSO at Cybereason:
“The agency needs a high-level set of answers and updates. They can appear to be heroes or villains in this, but they don’t have the luxury of being victims. We the people need to know the root cause, how it won’t happen again and the contamination: who used the data, what they did with it and the implications for the cases affected. The process of discovery isn’t comfortable for anyone, but the FBI and DOJ need to take the same medicine they frequently dispense in the name of justice. The rule of law requires it, and we the people require it. Quis custodiet ipsos custodes: who watches the watchmen? Now is the chance to be a hero by being open, honest, organized and clear.”
Anurag Kahol, CTO at Bitglass:
“What is especially troubling about this data leak is the seemingly blasé response from the Oklahoma Securities Commission. Leaving a database containing highly sensitive information unprotected and publicly accessible is careless and irresponsible; additionally, the agency is worsening the situation by failing to address the issue directly with the public. While all organizations need to defend their data, government agencies, in particular, must adhere to the highest of security standards – the type of information that they collect, store, and share demands it.
These kinds of leaks can have lasting consequences for all parties involved. To prevent such breaches, all organizations, including government agencies, must adopt modern security technologies. Dynamic identity and access management solutions, for instance, can verify users’ identities, detect potential intrusions, and enforce multi-factor authentication in a real-time, step-up fashion.”
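Kahol’s mention of step-up multi-factor authentication can be illustrated with a short sketch. This is a from-scratch, standards-based TOTP verifier (RFC 6238), not Bitglass’s product; a service enforcing step-up authentication would demand a code like this when it detects risky access.

```python
# Illustrative step-up MFA building block: an RFC 6238 TOTP verifier
# implemented with the standard library only.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute a time-based one-time password (HMAC-SHA1, RFC 6238).
    `secret_b32` is the shared secret, base32-encoded; `at` is a Unix
    timestamp (defaults to now)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify_step_up(submitted, secret_b32, at=None):
    """Constant-time comparison of a user-submitted code."""
    return hmac.compare_digest(submitted, totp(secret_b32, at=at))
```

In a real deployment the verifier would also accept adjacent time steps to tolerate clock drift and rate-limit attempts; this sketch shows only the core check.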
Carl Wright, CCO at AttackIQ:
“Data leaks are often caused by gaps in security programs that can be easily prevented. The Oklahoma Securities Commission’s leak of three terabytes of FBI data could have been avoided if they had visibility into the state of their defenses.
All organizations, including government agencies, must take a proactive approach to protecting sensitive data through continuous evaluation of their security controls, processes and people to uncover and remediate gaps that could be compromised by threat actors.”
Jonathan Bensen, Interim CISO and Senior Director of Product Management at Balbix:
“Leaking three terabytes of the FBI’s data by leaving a server unsecured without a password is a critical error, and it indicates the need for the Oklahoma Securities Commission, as well as other government agencies, to strengthen their current security measures so that future breaches can be avoided in the first place.
Leaving a database containing such critical information unsecured is an elementary mistake for which there is no excuse. That said, organizations are increasingly struggling to maintain continuous visibility of all of their assets and successfully monitor the growing number of potential threats. Monitoring and analyzing the attack surface and improving enterprise security posture are simply no longer human-scale problems. To best combat these threats, agencies must implement security tools that use machine learning and automation to monitor their enormous attack surfaces and vast IT asset landscapes, proactively identifying and addressing security vulnerabilities to mitigate the risk of future breaches.”
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.