Online privacy is no longer simply a matter of staying away from prying eyes. Encryption on the web plays a key role in affording us our privacy, and it is constantly changing.
Once reserved for login and checkout pages, cryptographic protocols like Transport Layer Security (TLS) have risen to prominence in recent years, providing a way for endpoints to authenticate each other and communicate confidentially. Over the last two years, the TLS standards have been updated too. Browsers will soon block old and weak versions of the protocol, such as TLS 1.0 and 1.1. New protocols are also being introduced to lock down the insecure legacy protocols still widely used today.
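To illustrate what retiring TLS 1.0 and 1.1 looks like in practice, the sketch below uses Python's standard ssl module to build a context that refuses those deprecated versions. It is a minimal example, not a complete hardening guide.

```python
import ssl

# Build a context with modern defaults, then explicitly refuse the
# deprecated TLS 1.0 and 1.1 protocol versions.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.0 and 1.1
```

Recent Python releases already default to TLS 1.2 as the floor, but setting it explicitly documents the policy and survives changes in library defaults.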
It is easy to see how businesses might struggle to keep up. Early last year, for example, security researchers uncovered the first-ever malware sample making use of the emerging Domain Name System encryption protocol, DNS-over-HTTPS (DoH).
The TLS landscape isn’t what it used to be, and organizations need to stay on top of the latest developments to ensure that websites are securely deployed and maintained over their lifetime.
Keeping Pace with Change
For the most part, businesses understand the benefits new TLS updates can bring, and they appear receptive to change. For example, F5 Labs’ latest TLS Telemetry report found that almost a third of the Alexa top one million sites were accepting TLS 1.3 connections. In many ways, it makes good business sense: most popular web browsers already support the new standard, and it brings with it a range of security and performance benefits.
However, adopting the latest version of TLS is not yet an option for everyone.
If this is the case, companies should consider disabling RSA key exchange altogether by removing the cipher suites that use it. F5 research found that over a third of the world’s most popular websites still offer RSA as their preferred key exchange algorithm—even though security researchers keep finding ways to attack it. These attacks include ROBOT, a revival of a then 19-year-old flaw, which allows an attacker to perform RSA decryption and signing operations with the private key of a TLS server. According to F5 Labs research, just over 2% of the world’s top sites are likely still vulnerable to this exploit.
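As a sketch of what removing RSA key exchange can look like, the example below configures a Python server context whose cipher list offers only ECDHE suites, so no suite performs the static RSA key transport that ROBOT-style attacks target. (TLS 1.3 suites, which always use ephemeral key exchange, remain enabled.)

```python
import ssl

# Server context restricted to ECDHE key exchange only. The OpenSSL
# cipher string keeps forward-secret AES-GCM and ChaCha20 suites and
# leaves out every static-RSA key transport suite.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.set_ciphers("ECDHE+AESGCM:ECDHE+CHACHA20")

# Inspect the resulting list: every name should start with ECDHE
# (TLS 1.2 suites) or TLS_ (TLS 1.3 suites).
for cipher in context.get_ciphers():
    print(cipher["name"])
```

The same intent translates directly to web server or load balancer cipher configuration; the exact directive varies by product.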
As with other software, security researchers often discover vulnerabilities in TLS libraries. This is why the onus is on organizations to ensure they are alerted when their web server, load balancer or application delivery controller has updates to its TLS stack. Policies for rapid patching are vital too.
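A small first step is knowing which TLS library a given runtime actually links against, so it can be checked against vendor advisories. In Python, for instance, the ssl module exposes this directly:

```python
import ssl

# Report the TLS library this Python runtime is linked against,
# in both human-readable and tuple form.
print(ssl.OPENSSL_VERSION)       # e.g. an OpenSSL version banner
print(ssl.OPENSSL_VERSION_INFO)  # 5-tuple, convenient for comparisons
```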
It is also important to note that any Certificate Authority (CA) can issue a certificate for absolutely any domain on the web. For this reason, it is prudent to restrict issuance to only two or three well-known and trusted CAs. This can be achieved by creating DNS Certification Authority Authorization (CAA) records. Additionally, applying the HTTP Strict Transport Security (HSTS) header to web apps provides an extra layer of security, as browsers will then only ever attempt to load the site over HTTPS. It is a crucial step that can help prevent network attacks that force insecure pages to load, allowing attackers to snoop on, rewrite and redirect network traffic.
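As a sketch, using example.com and two illustrative CAs, the CAA records might look like this in a zone file:

```
; CAA records: only the two listed CAs may issue for example.com,
; and violation reports go to the iodef address
example.com.  IN  CAA  0 issue "letsencrypt.org"
example.com.  IN  CAA  0 issue "digicert.com"
example.com.  IN  CAA  0 iodef "mailto:security@example.com"
```

And the HSTS response header, here with a one-year lifetime that also covers subdomains:

```
Strict-Transport-Security: max-age=31536000; includeSubDomains
```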
Although DNS CAA records were created to prevent mis-issuance of certificates for valid domains, fraudsters rarely attempt to create a certificate for an existing domain, such as mybank.com. Instead, they create certificates for domains which they own using the brand name ‘mybank’ as a subdomain, such as mybank.attacker.com.
Thankfully, every time a certificate is created by any CA, it gets recorded in a globally distributed database (the Certificate Transparency logs). The monitoring of CT logs is a useful way to be alerted when a domain or brand is being impersonated by threat actors.
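A monitor of this kind can be sketched in a few lines. The example below is a simplified, hypothetical filter: given hostnames extracted from CT log entries, it flags those where a protected brand name appears as a label under a domain outside an owned allow-list. (The naive last-two-labels split stands in for proper public-suffix handling, which a production monitor would need.)

```python
def suspicious_hosts(hostnames, brand, owned_domains):
    """Flag CT-logged hostnames where `brand` appears as a label
    under a registered domain we do not own, e.g. mybank.attacker.com."""
    flagged = []
    for host in hostnames:
        labels = host.lower().split(".")
        registered = ".".join(labels[-2:])  # naive eTLD+1; fine for a sketch
        if brand in labels and registered not in owned_domains:
            flagged.append(host)
    return flagged

hosts = ["www.mybank.com", "mybank.attacker.com", "login.mybank.evil.net"]
print(suspicious_hosts(hosts, "mybank", {"mybank.com"}))
# ['mybank.attacker.com', 'login.mybank.evil.net']
```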
With HTTPS now everywhere, there are also more ciphers, keys and certificates to manage. Combined with the increasing adoption of DevOps methodologies, this means that the speed of change and deployment are constantly increasing.
Just as security tooling and testing are being integrated into the automation toolchain, so too must the configuration of HTTPS. This means looking at the orchestrated creation of digital certificates and developing internal policies that define the standards, such as minimum key length and cipher suites. Automation of this nature will also help with certificates that are due to expire, renewing them automatically before a service interruption can occur.
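The renewal decision itself is simple enough to sketch. The function below, an illustration rather than a production policy, takes a certificate's notAfter field in the string format returned by Python's ssl.getpeercert() and reports whether it falls inside an assumed 30-day renewal window.

```python
import ssl
from datetime import datetime, timedelta, timezone

def needs_renewal(not_after: str, window_days: int = 30) -> bool:
    """Return True if the certificate expires within `window_days`
    (or has already expired). `not_after` uses the format produced
    by ssl.getpeercert(), e.g. 'Jan 01 00:00:00 2024 GMT'."""
    expiry = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(not_after), tz=timezone.utc
    )
    return expiry - datetime.now(timezone.utc) < timedelta(days=window_days)

print(needs_renewal("Jan 01 00:00:00 2024 GMT"))  # True: already expired
```

An orchestration job would run a check like this on a schedule and trigger issuance (for example, via an ACME client) for anything that returns True.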
Minding the TLS Security Gap
Unfortunately, many privacy and security gaps still exist, even when TLS is deployed correctly. Protocols such as DNS-over-HTTPS (DoH) are emerging to help close these gaps and, while they improve privacy for users of the web, they can also make it harder for enterprise security teams to identify and block malicious traffic. In some instances, this calls for disabling DoH on enterprise networks or deploying internal DoH services for users; such services work alongside the corporate web proxy and help filter out unwanted traffic.
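For teams standing up an internal DoH service, RFC 8484 defines how DNS queries travel over HTTPS. The sketch below builds the base64url-encoded "dns" parameter for a DoH GET request using only the Python standard library; the resolver URL in the comment is a placeholder, not a real endpoint.

```python
import base64
import struct

def doh_query_param(name: str, qtype: int = 1) -> str:
    """Build the RFC 8484 'dns' GET parameter: a minimal DNS query in
    wire format, base64url-encoded without padding. qtype 1 = A record."""
    # 12-byte header: ID 0, flags 0x0100 (recursion desired), 1 question.
    header = struct.pack("!HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # QNAME: length-prefixed labels, terminated by a zero byte.
    question = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in name.rstrip(".").split(".")
    ) + b"\x00"
    question += struct.pack("!HH", qtype, 1)  # QTYPE, QCLASS=IN
    raw = header + question
    return base64.urlsafe_b64encode(raw).decode().rstrip("=")

# e.g. GET https://doh.internal.example/dns-query?dns=<param>
print(doh_query_param("example.com"))
```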
At the end of the day, even the best TLS deployment in the world cannot prevent malicious code from being injected by client-side malware or by compromised third-party scripts. That is why we always recommend gaining an understanding of the limits of HTTPS and the gaps that remain.
One thing is certain: encryption is always evolving. Key lengths are increasing, certificates are becoming automated, governments are imposing restrictions, and new protocols are emerging. It is this constant change that poses a new degree of risk to many organizations and their customers. Poor TLS deployment will not go unnoticed by hackers, regulators, and cybersecurity insurance companies, and it can raise serious questions about the rest of an organization’s infrastructure.
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.