Successful attempts by hackers like Lizard Squad to take down major websites have become so common that barely a week goes by without another victim. Recently, the Thai government, Reddit and the National Crime Agency have all been targeted by Distributed Denial of Service (DDoS) attacks.
The number and intensity of DDoS attacks are rising in 2015. For businesses reliant on their websites, this is an alarming trend. But it is not a surprising one, given that web application security is often poorly understood and incorrectly deployed.
A DDoS attack can take different forms, but essentially it involves a website’s servers being hit by a large volume of ‘requests’, which disable the website or significantly reduce its performance. To co-ordinate and deploy sufficient requests, an attacker will have access to an army of compromised systems, made up of the machines of ‘normal’ internet users infected with malware. A recent and concerning development is the availability of DDoS armies for hire by the hour.
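To make the flood mechanics concrete, here is a minimal sketch in Go (with illustrative thresholds, not any vendor’s implementation) of the naive per-IP rate limiting a server might apply as a first line of defence. A single flooding source trips it quickly; a botnet of thousands of machines, each sending only a handful of requests, passes straight through, which is precisely why attackers distribute the load.

```go
package main

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

// limiter is a naive per-IP sliding-window counter. The max and window
// values below are arbitrary examples.
type limiter struct {
	mu     sync.Mutex
	seen   map[string][]time.Time
	max    int
	window time.Duration
}

func (l *limiter) allow(ip string) bool {
	l.mu.Lock()
	defer l.mu.Unlock()
	cutoff := time.Now().Add(-l.window)
	// Drop timestamps outside the window, then record this request.
	kept := l.seen[ip][:0]
	for _, t := range l.seen[ip] {
		if t.After(cutoff) {
			kept = append(kept, t)
		}
	}
	kept = append(kept, time.Now())
	l.seen[ip] = kept
	return len(kept) <= l.max
}

func main() {
	lim := &limiter{seen: map[string][]time.Time{}, max: 100, window: time.Minute}
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		// Each bot in a distributed army stays under max, so this
		// check alone never fires against a well-spread attack.
		if !lim.allow(r.RemoteAddr) {
			http.Error(w, "too many requests", http.StatusTooManyRequests)
			return
		}
		fmt.Fprintln(w, "ok")
	})
	http.ListenAndServe(":8080", nil)
}
```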
Law enforcement agencies have not had much success fighting this growing area of cybercrime. DDoS attackers commonly use ‘bulletproof hosting’ for the control servers that command their attack armies. Such services often operate from countries that offer effective immunity from Western law enforcement.
Businesses therefore need to make sure they are taking appropriate steps to protect their websites from a DDoS attack. For most, this will mean buying in a security solution, and there are some common pitfalls that can easily be avoided.
Many businesses make the most fundamental mistake of all: attempting to secure their web applications with the wrong technology. A network firewall can protect Layer 4 protocols and even perform deep packet inspection. But truly protecting against web application layer attacks generally requires terminating the HTTP or HTTPS protocols and often rewriting traffic to identify and mitigate threats. Just as a network firewall is not designed to stop spam, it is not designed to stop web application attacks either. This misunderstanding leaves the web application exposed and gives the administrator a false sense of security. A web application firewall is far better suited to combating DDoS attacks.
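As a rough illustration of the difference, the Go sketch below terminates HTTP in a reverse proxy and inspects full requests before forwarding them to a hypothetical backend at 127.0.0.1:8080; the single query-string check is a toy placeholder for the rule sets a real WAF applies. The point is the level of abstraction: a packet filter never sees a request in this form.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"strings"
)

func main() {
	// Hypothetical origin server sitting behind the proxy.
	backend, err := url.Parse("http://127.0.0.1:8080")
	if err != nil {
		log.Fatal(err)
	}
	proxy := httputil.NewSingleHostReverseProxy(backend)

	inspect := func(w http.ResponseWriter, r *http.Request) {
		// Because the proxy terminates HTTP, it reasons about paths,
		// headers and parameters -- not just packets and ports.
		if strings.Contains(r.URL.RawQuery, "<script") { // toy rule only
			http.Error(w, "blocked", http.StatusForbidden)
			return
		}
		proxy.ServeHTTP(w, r)
	}

	log.Fatal(http.ListenAndServe(":8000", http.HandlerFunc(inspect)))
}
```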
Key to any DDoS protection is the ability to distinguish real users from malicious requests, so that suspicious traffic can be blocked or challenged. But this is not easily done. One effective screening method is integrated IP reputation intelligence that combines real-time insight with historical intelligence. Be warned, though, that this only works if the reputation criteria are updated frequently enough to keep pace with new and emerging threats.
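Here is a minimal sketch of how such screening can be wired in. The fetchFeed function is hypothetical, standing in for a vendor’s reputation feed, and both its placeholder entry and the five-minute refresh interval are illustrative assumptions.

```go
package main

import (
	"log"
	"net"
	"net/http"
	"sync"
	"time"
)

// blocklist is a toy stand-in for an IP reputation feed. Real feeds are
// scored and categorised commercial data, refreshed continuously.
type blocklist struct {
	mu  sync.RWMutex
	bad map[string]bool
}

// fetchFeed is hypothetical; 203.0.113.7 is documentation address
// space used here as placeholder data.
func fetchFeed() map[string]bool {
	return map[string]bool{"203.0.113.7": true}
}

func (b *blocklist) refreshLoop() {
	for {
		fresh := fetchFeed()
		b.mu.Lock()
		b.bad = fresh
		b.mu.Unlock()
		time.Sleep(5 * time.Minute) // stale criteria mean blind spots
	}
}

func (b *blocklist) blocked(ip string) bool {
	b.mu.RLock()
	defer b.mu.RUnlock()
	return b.bad[ip]
}

func main() {
	bl := &blocklist{bad: fetchFeed()}
	go bl.refreshLoop()
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		ip, _, _ := net.SplitHostPort(r.RemoteAddr)
		if bl.blocked(ip) {
			http.Error(w, "forbidden", http.StatusForbidden)
			return
		}
		w.Write([]byte("ok"))
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```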
It’s also worth considering some form of dynamic client fingerprinting as part of any DDoS solution. Mechanisms that detect suspicious clients using script injection, and challenge suspected malicious requests with a CAPTCHA test, can be a lifesaver when a DDoS army is highly distributed, stays below the rate-control radar, and its member systems have not yet been blacklisted.
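The sketch below shows one trivial form of this idea: requests without a marker cookie receive a small script instead of content, so real browsers pass on the retry while dumb flood tools never get through. The cookie scheme is deliberately simplistic; production systems use signed, expiring tokens, richer browser signals, and a CAPTCHA fallback for clients that fail repeatedly.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// challenge gates requests behind a trivial JavaScript test: a real
// browser runs the injected script, sets the cookie and retries; a
// headless flood tool that does not execute JavaScript never does.
func challenge(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if c, err := r.Cookie("js_ok"); err == nil && c.Value == "1" {
			next.ServeHTTP(w, r) // client already passed the challenge
			return
		}
		w.Header().Set("Content-Type", "text/html")
		fmt.Fprint(w, `<script>document.cookie="js_ok=1";location.reload()</script>`)
	})
}

func main() {
	app := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "application content")
	})
	log.Fatal(http.ListenAndServe(":8080", challenge(app)))
}
```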
One final consideration is whether to hold DDoS protection in the cloud or on-premises. Typically, cloud-based services work by redirecting all incoming traffic to the cloud via DNS changes, scrubbing it there, and then relaying the clean traffic to the destination server. Such solutions promise easy setup and low maintenance. However, it is worth bearing in mind that persistent attackers can bypass the cloud layer and target your servers directly, for example once they discover the origin server’s real IP address, so an on-premises layer of defence can be indispensable.
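One common safeguard against that bypass, sketched below on the assumption that the scrubbing provider publishes its egress IP ranges (198.51.100.0/24 here is documentation address space used as a placeholder), is to have the origin refuse any connection that did not arrive via the scrubbing layer.

```go
package main

import (
	"log"
	"net"
	"net/http"
)

// allowedNets would hold the scrubbing provider's published ranges.
var allowedNets []*net.IPNet

func init() {
	for _, cidr := range []string{"198.51.100.0/24"} { // placeholder range
		_, n, err := net.ParseCIDR(cidr)
		if err != nil {
			log.Fatal(err)
		}
		allowedNets = append(allowedNets, n)
	}
}

func allowed(ip net.IP) bool {
	for _, n := range allowedNets {
		if n.Contains(ip) {
			return true
		}
	}
	return false
}

func onlyViaScrubber(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		host, _, err := net.SplitHostPort(r.RemoteAddr)
		ip := net.ParseIP(host)
		if err != nil || ip == nil || !allowed(ip) {
			// Traffic that went straight to the origin, bypassing
			// the cloud scrubbing layer, is dropped here.
			http.Error(w, "direct access refused", http.StatusForbidden)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	h := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("origin content"))
	})
	log.Fatal(http.ListenAndServe(":8080", onlyViaScrubber(h)))
}
```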
While we are going to see more high-profile DDoS casualties over the course of the year, I am predicting a far stronger response from administrators as they fight to build effective security around their web applications. The first and simplest action to take here is to use the right tool for the job. A network firewall will leave web applications exposed, so be sure to opt for a web application firewall to combat DDoS. Also look for a DDoS solution that uses IP reputation intelligence, but make sure its reputation criteria are frequently updated to avoid becoming obsolete. And for that extra layer of confidence and security, consider fingerprinting and robot testing.

Wieland Alge, VP and GM, EMEA at Barracuda Networks
The opinions expressed in this post belong to the individual contributors and do not necessarily reflect the views of Information Security Buzz.