What makes for a successful cyberattack? Technology is one part of the picture, clearly. Cybercriminals use a wide range of tools and techniques to gain access to their targets’ networks, probing for vulnerabilities and infecting systems with malware. Business processes are another key part: attackers look for exploitable elements in organisations’ procedures – loopholes, duplications and neglected areas.
But a third part of the cyberattack picture – and one that is often underestimated by organisations – is psychology. Successful cybercriminals typically have a sophisticated understanding of the psychology and behaviour of the people in the organisations they target – and they know precisely how to exploit them.
Successful cyber defences, then, depend on responding to all three elements of this picture. They depend on having the right technologies in place to combat those harnessed by attackers. They depend on tight processes, regularly reviewed and updated. And they depend on understanding the psychology of how cyberattacks work – training and educating staff accordingly, and recognising the core human behaviours that can be harnessed to embed good security practice.
Let’s explore some of the core aspects of the psychology of cyberattacks which will help you shore up your defences.
Authority bias
Authority bias is a well-established cognitive bias, whereby people have a ‘tendency to attribute greater weight and accuracy to the opinion of an authority figure’. In business settings, this means that junior members of staff are likely to trust the opinions of their managers, and potentially avoid openly contradicting a decision or judgement they may find questionable.
Clearly, there are many contexts in which this deference is appropriate – it is how junior members of staff learn. But it can also be harnessed by cybercriminals. Consider a junior member of a team receiving an email purporting to be from a senior manager, requesting the login details for a key asset. Authority bias may encourage that individual to cast aside concerns about the legitimacy of the email in favour of doing what a senior member of staff has asked of them.
Alternatively, consider a junior member of staff receiving a direct, face-to-face instruction from a senior colleague – but an instruction which they suspect does not follow cybersecurity best practice. Here, authority bias means the junior is likely to avoid speaking up, and will instead carry out the instruction regardless of their misgivings. In both cases, the result could be a damaging cyberattack.
The fear factor
Ransomware attacks have been on a steep upward trajectory in recent years, with one study in 2019 reporting a 363% year-on-year increase in attacks targeting businesses over the first half of the year. Such attacks work on the basis of panic: people who believe they are about to lose access to business-critical applications or data are far less likely to pause and think clearly, and far more likely to make quickfire decisions which go against best security practice – such as agreeing to pay the attackers a ransom.
Availability bias
Another well-established cognitive bias, availability bias describes the human tendency to treat examples which come readily to mind as highly representative – regardless of whether they actually are. A classic illustration: statistically, people are more likely to be killed by a vending machine than by a shark – yet which one frightens them more?
In the realm of cybersecurity, this might mean, for example, that individuals within an organisation tend to think that the typical cybercriminal is a teenager sitting in a darkened bedroom, probing the organisation’s perimeter for vulnerabilities.
The reality, of course, is that the majority of cyberattacks and data breaches are ultimately down to human error and carelessness – whether inadvertent incidents such as the misconfiguration of key security tools, or individuals falling victim to social engineering techniques and accidentally handing over key credentials. In other words, people should look to themselves as the (potential) weakest links in the organisation’s security posture, rather than focusing outwards on malicious cybercriminals.
Cybersecurity best practice: a psychological approach
How, then, can organisations learn from these psychological theories and realities, and harness them to develop truly comprehensive approaches to cybersecurity?
Defending against authority bias requires organisations to cultivate cultures in which junior staff are not merely allowed, but actively encouraged, to question their seniors and raise queries where cybersecurity is concerned. The culture needs to welcome such interventions, rather than making juniors feel foolish for speaking up.
Defending against the fear factor requires careful education and training around keeping calm in the event of a shocking or alarming security incident, as well as clear lines of escalation to more senior members of staff. It also requires clear safety nets – disaster recovery and backup systems which staff are kept well informed about.
Defending against availability bias is another matter of education and training. Staff should not be left to jump to media-inspired conclusions about what the greatest security threats to their organisation look like – they should be empowered with accurate information and a clear understanding of their own role within the organisation. It is also worth ensuring that personnel understand how their colleagues behave and think. All too often, individuals who have made an error – clicking on a dodgy link, or sending their login credentials over email – will keep quiet about it out of fear. If they understand that everyone makes mistakes, and that the crucial thing is to escalate and respond openly, your organisation is far better placed to contain the damage.
Finally, it is worth considering a more positive aspect of the psychology of cybersecurity – social influence. This is the notion that people are encouraged to strengthen and improve their own practices when they see others doing the same – a form of social contagion. Organisations can harness social influence by, say, setting up tools with messages such as ‘change your password – XX% of your colleagues already have’. Reminders of good security practice which draw on colleagues’ behaviours, and on the responsibilities your staff hold towards each other, can be a powerful way of turning the psychology of cyberattacks to your advantage.
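To make the mechanic concrete, here is a minimal sketch in Python of how such a nudge might be generated. Everything here is illustrative – the social_influence_nudge function, the in-memory LAST_PASSWORD_CHANGE records and the 90-day rotation policy are assumptions; in a real deployment the password-age data would come from your identity provider or directory service.

```python
from datetime import datetime, timedelta

# Hypothetical record of each user's last password change. In practice this
# would be queried from your identity provider, not held in a dict.
LAST_PASSWORD_CHANGE = {
    "alice": datetime(2024, 5, 1),
    "bob": datetime(2023, 1, 15),
    "carol": datetime(2024, 4, 20),
}

MAX_PASSWORD_AGE = timedelta(days=90)  # assumed rotation policy


def social_influence_nudge(username: str, now: datetime) -> str | None:
    """Return a peer-comparison reminder if the user's password is stale."""
    if now - LAST_PASSWORD_CHANGE[username] <= MAX_PASSWORD_AGE:
        return None  # password is fresh; no nudge needed

    # Work out what share of the user's colleagues are already compliant.
    peers = [u for u in LAST_PASSWORD_CHANGE if u != username]
    compliant = sum(
        1 for u in peers if now - LAST_PASSWORD_CHANGE[u] <= MAX_PASSWORD_AGE
    )
    pct = round(100 * compliant / len(peers))
    return f"Time to change your password - {pct}% of your colleagues already have."


print(social_influence_nudge("bob", now=datetime(2024, 5, 10)))
# -> Time to change your password - 100% of your colleagues already have.
```

The point is not the plumbing but the message: citing the compliant majority turns an individual chore into a social norm, which is precisely the contagion effect described above.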