According to IBM, 90% of cyber attacks are enabled in some way by human error. It is therefore tempting to call employees the weakest link in an organisation's cybersecurity. We believe this narrative can be changed: if empowered and effectively guided, employees can become an exceptionally powerful line of defence for cyber security incident detection and response.
The fact is that every security incident involves a human decision in some way, such as:
- Whether or not to click on a link
- Whether or not to plug in a USB device
- Whether or not to use a strong password
- How to treat someone who has made a mistake
Each of these decisions can either improve an organisation's security posture or expose it to risk. Unfortunately, a widespread misconception in cybersecurity is that people are objectively rational, and that more training and phishing tests are therefore the magic answer.
1: Training
Training has some value, of course, but it rests on a flawed assumption: that human decisions are based on knowledge alone, and that if you know the right answer, you will behave in the right way. This is demonstrably false. For example, it is almost universally known that obesity and tobacco products shorten life expectancy, and yet, despite this knowledge, many people still make unhealthy choices.
When it comes to cyber security, people know the right answer in most cases, but they still have to overcome other internal and external factors to behave correctly: time constraints, social pressure, complex processes, rapid change, system constraints, and many more.
The mandatory annual training that has become the industry standard for HR or information security professionals to "tick off" the human part of the cybersecurity equation does not account for what motivates human behaviour.
It uniformly ignores the basic human need for autonomy and control, which leads to employees not taking the learning seriously, dismissing it as irrelevant, and not incorporating it into their own mental model for decision-making. This causes frustration, alienation, and a deterioration in sentiment towards cybersecurity in general, widening the perceived gap between IT professionals and the rest of the staff. Even among the few who do engage with these mandatory trainings, what is learned quickly fades away due to the way human memory works.
2: Phishing tests
Another flawed industry standard is the phishing test, in which system users are sent deceptive emails to gauge how likely they are to fall victim to real attacks. These tests are often tied directly to cyber training, and while measuring behaviour rather than underlying knowledge is of great value, the approach is flawed in two ways.
First, phishing tests are more a measure of how good an attacker is rather than how good an employee is at spotting them. Given enough time and resources, an attacker will almost always find a way to trick a person, and we know that even if a defender makes the right decision 99% of the time, the attacker only needs that one mistake to win the game.
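This asymmetry can be made concrete with a short calculation. Assuming, purely for illustration, that a defender spots each malicious email with 99% reliability (the figure from the text) and that attempts are independent, the chance that at least one attempt succeeds grows quickly with the number of attempts:

```python
def attacker_success_probability(p_correct: float, attempts: int) -> float:
    """P(at least one defender mistake) = 1 - p_correct ** attempts,
    assuming independent attempts with per-attempt success rate p_correct."""
    return 1 - p_correct ** attempts

# Illustrative attempt counts; the 0.99 defender accuracy is from the text.
for n in (1, 10, 100):
    print(n, round(attacker_success_probability(0.99, n), 3))
# 1 attempt  -> 0.01
# 10 attempts -> 0.096
# 100 attempts -> 0.634
```

Under these hypothetical numbers, a hundred attempts give the attacker better-than-even odds of a single slip, which is all they need.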
But we also know that dwell time, the number of days an attacker stays in an environment before either being detected or launching an attack, is often weeks or even months, and so what an employee does after clicking on a malicious link or downloading a malicious file matters far more than the act itself.
Secondly, there are unwritten rules between people in society that manifest themselves in a set of expectations we all have of each other:
- that others will look out for our safety in their actions;
- that others will not deceive us;
- that others will not knowingly seek an unjust outcome in their relationship with us, etc.
This psychological contract exists in every relationship, including that between employee and employer.
Phishing tests are inherently deceptive. When an employee is compromised by the test, they often feel that the social contract they have with their employer has been broken. This is a well-researched psychological phenomenon, and it often leads to disengagement and even destructive behaviour. It turns the nature of the psychological contract on its head: if it is acceptable for the employer to deceive the employee, then the same must hold in the other direction. In cases where failed phishing tests lead to training, employees are much less likely to engage meaningfully with the content because of their negative attitude towards the deceiver, in this case their employer.
Another highly problematic effect occurs as well. Phishing tests teach employees that cybersecurity mistakes carry negative consequences, such as unscheduled time spent in training or social embarrassment at being seen as compromised. As a result, staff are likely to take steps to avoid being considered at fault for any behaviour they associate with phishing. This is extremely damaging to incident reporting, which is often the best possible defence against cyber threats.