The amount of encrypted traffic on the Internet has at least doubled in the past year[1], and it will continue to grow even faster as new protocols such as SPDY and HTTP 2.0 are adopted. Web application traffic is encrypted by SSL (Secure Sockets Layer) or by its successor protocol, TLS (Transport Layer Security). With every email message, Tweet, Facebook post, and one-click order encrypted in flight, a question emerges: how effective are the encryption methods used to relay data across the Internet?
The effectiveness of SSL or TLS encryption is largely dictated by the chain of trust established by root Certificate Authorities (CAs), by browser capabilities, and by the behavior of users. Many have argued that the CAs have failed us, since we have no practical way to verify whether a CA has been compromised. Some proposals have surfaced, such as Moxie Marlinspike’s TACK project[2] and CA/Browser Forum initiatives[3], but at present we are stuck with a system in which the authenticity of a website’s identity depends squarely on the trustworthiness of a few CAs.
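To put that in concrete terms, the trust a client extends is ultimately just a bundle of root CA certificates it ships with. Here is a minimal sketch (Python standard library only) that counts the trust anchors a default client will accept; the exact numbers printed will vary by system:

```python
import ssl

# A default client context verifies servers against whatever root CAs
# the operating system (or the Python build) happens to trust.
context = ssl.create_default_context()

# cert_store_stats() counts the loaded trust anchors; any one of these
# root CAs can vouch for the identity of any site on the Internet.
print(context.cert_store_stats())
# e.g. {'x509': 147, 'crl': 0, 'x509_ca': 147}
```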
The problems of user behavior and browser capabilities are intertwined. Most citizens of the Internet have become habituated to the certificate warnings browsers frequently raise. Many ignore those warnings on a site they “trust”, presuming some transient system error, and phishing attacks lean heavily on this: the average user is at this point utterly desensitized. Worse, the warnings browsers raise typically cover only the obvious errors, such as a site hostname not matching the certificate or an expired certificate, not the more insidious ways authenticity can be undermined.
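For illustration, those “obvious” checks are visible in any TLS client, not just browsers. A short Python sketch follows; it assumes the public test host expired.badssl.com, which deliberately serves an expired certificate:

```python
import socket
import ssl

# Connecting to a site with an expired certificate fails verification,
# the same class of error a browser would surface as a warning page.
context = ssl.create_default_context()
try:
    with socket.create_connection(("expired.badssl.com", 443)) as sock:
        with context.wrap_socket(sock, server_hostname="expired.badssl.com"):
            pass
except ssl.SSLCertVerificationError as err:
    print("verification failed:", err.verify_message)
```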
One example of this more insidious type of error relates to the recent Heartbleed vulnerability in the OpenSSL encryption implementation, whereby a site’s private key could be forcibly exposed. One of the Heartbleed remediation steps was certificate revocation, whose status is traditionally published in a Certificate Revocation List (CRL). CRLs are being superseded by the Online Certificate Status Protocol (OCSP). In practice, however, both CRLs and OCSP have proven ineffective, mostly because browsers either do not check them at all or because the revocation information is not updated in a timely fashion.[4] So even where a CA has mechanisms to assert the authenticity (or revocation) of a particular site, user behavior and browser shortcomings render them pointless.
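The endpoints a browser would have to consult are embedded in the certificate itself. Here is a minimal sketch, assuming the third-party cryptography package, that pulls the CRL distribution points and OCSP responder URL out of a live site’s certificate; the hostname is a placeholder:

```python
import socket
import ssl

from cryptography import x509
from cryptography.x509.oid import AuthorityInformationAccessOID, ExtensionOID

def revocation_endpoints(hostname, port=443):
    # Fetch the server's leaf certificate via a normal TLS handshake.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der_cert)

    # CRL distribution points: where the CA publishes its revocation list.
    crl_urls = []
    try:
        ext = cert.extensions.get_extension_for_oid(
            ExtensionOID.CRL_DISTRIBUTION_POINTS)
        for point in ext.value:
            if point.full_name:
                crl_urls.extend(name.value for name in point.full_name)
    except x509.ExtensionNotFound:
        pass

    # OCSP responder: where a client can ask about this one certificate.
    ocsp_urls = []
    try:
        ext = cert.extensions.get_extension_for_oid(
            ExtensionOID.AUTHORITY_INFORMATION_ACCESS)
        ocsp_urls = [
            desc.access_location.value
            for desc in ext.value
            if desc.access_method == AuthorityInformationAccessOID.OCSP
        ]
    except x509.ExtensionNotFound:
        pass

    return crl_urls, ocsp_urls

print(revocation_endpoints("www.example.com"))
```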
In time, browser implementations will improve so that the status of a certificate is more heavily scrutinized and more transparent to the average Internet citizen, whether via OCSP, CRL, or the aforementioned TACK proposal. Some browsers, such as Google Chrome, already employ the tactic of “certificate pinning,” which ensures that certificates for certain sites (such as all those under *.google.com) come only from a pre-validated source defined in a bundle included with the browser itself. This mechanism sidesteps the lack of trust in some CAs, but it obviously has scalability challenges in terms of covering the full breadth of Internet sites.
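A browser-style pin can be approximated in a few lines: hash the server’s public key and compare it against a value shipped ahead of time. The following sketch assumes the cryptography package; the pin value is a placeholder you would pre-compute for a site you control:

```python
import base64
import hashlib
import socket
import ssl

from cryptography import x509
from cryptography.hazmat.primitives import serialization

# Hypothetical pre-validated pin: the base64 SHA-256 of the site's
# SubjectPublicKeyInfo, baked into the client ahead of time.
PINS = {"www.example.com": {"replace-with-real-spki-sha256-base64"}}

def connection_matches_pin(hostname, port=443):
    # Complete a normal, verified TLS handshake and grab the leaf cert.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der_cert)

    # Pin the public key rather than the whole certificate, so the site
    # can renew its certificate without breaking the pin, provided the
    # underlying key is reused.
    spki = cert.public_key().public_bytes(
        serialization.Encoding.DER,
        serialization.PublicFormat.SubjectPublicKeyInfo,
    )
    fingerprint = base64.b64encode(hashlib.sha256(spki).digest()).decode()
    return fingerprint in PINS.get(hostname, set())
```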
Many browser implementations are also catching up to modern versions of TLS, with most supporting TLS 1.0 and moving rapidly toward broad adoption of TLS 1.1 and 1.2. TLS is less vulnerable than SSL 3.0 and below. However, most browsers offer users no easy way to discern which protocol was negotiated, nor a deterministic option to prefer the stronger ones. The burden therefore rests squarely on the web site to enforce “good encryption” of the connection.
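On the server side, enforcing such a floor can be a one-line policy decision. Here is a minimal sketch of a Python-based service refusing anything older than TLS 1.2; the ssl.TLSVersion attribute assumes Python 3.7 or newer, and the certificate paths are placeholders:

```python
import ssl

# Server-side context; the certificate and key paths are placeholders.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")

# Refuse SSL 3.0, TLS 1.0, and TLS 1.1 outright: clients that cannot
# speak TLS 1.2 or better simply fail the handshake.
context.minimum_version = ssl.TLSVersion.TLSv1_2
```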
Ironically, the most important trend in browser technology might be the simplest: auto-updating. With the undead corpse of IE6 nearly eradicated, modern browsers are free to advance security features for the bulk of the Internet population in a non-disruptive fashion.
However, browsers and users can only do so much to ensure trusted connectivity. Organizations must also maintain and enforce standards that strengthen authenticity on their web properties. Here, too, all is not lost. The financial services industry and government agencies have long lived under the Federal Information Processing Standards (FIPS), which, among other facets, describe what “good encryption” is, with guidance on elements such as cipher selection, TLS version, and private key storage. For example, using Hardware Security Modules (HSMs) for key storage prevents vulnerabilities such as Heartbleed from being exploited, even if the SSL stack is based on a vulnerable version of OpenSSL. There are also many useful tools for evaluating the encryption used by any given web site, such as the SSL Pulse tool found on SSLLabs.com (a do-it-yourself probe is sketched below). With rising awareness of encryption standards and improving browser support, maintaining a “trusted Internet” is not as impossible a task as it might seem.
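Short of a full SSL Labs scan, even a few lines of Python will reveal what a given site actually negotiates with a default client; the hostname below is a placeholder:

```python
import socket
import ssl

def probe(hostname, port=443):
    # Report the protocol version and cipher suite the server negotiates
    # with a default client: a quick first look at its encryption policy.
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.version(), tls.cipher()

print(probe("www.example.com"))
# e.g. ('TLSv1.3', ('TLS_AES_256_GCM_SHA384', 'TLSv1.3', 256))
```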
Footnotes
[2]
[4] http://news.netcraft.com/archives/2013/05/13/how-certificate-revocation-doesnt-work-in-practice.html
About Brian A. McHenry
As a Security Solutions Architect at F5 Networks, Brian McHenry focuses on web application and network security. McHenry acts as a liaison between customers, the F5 sales team, and the F5 product teams, providing a hands-on, real-world perspective. Prior to joining F5 in 2008, McHenry, a self-described “IT generalist”, held leadership positions within a variety of technology organizations, ranging from startups to major financial services firms.
Twitter: @bamchenry
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.