
Security Think Tank: Hero or villain? Creating a no-blame culture

The traditional picture of a hacker is a script kiddie in a hoodie hunched over a keyboard, but this stereotype is outdated. Is it time to move away from a fear-based approach to security?

We have all seen the pictures of “hackers” hunched over a monitor in a black hoodie, exuding malevolence, but this is not what a cyber criminal, or advanced persistent threat (APT) actor, really looks like.

True, I have known incident responders with goatees and pigtails, but they were the exception. In reality, the people who mount cyber attacks and those who work to stop them are like anybody else. The field may skew male and young, but that is changing.

These days, pretty much anybody can mount a cyber attack and everyone has a role in defending against them.

It is often said that users are the biggest risk because they are likely to click on a malicious link in an email and unwittingly download malware, compromising their PC. An attacker may send your company 100 emails, and only one user needs to click on that link.

But look at this another way. If users have some awareness and the ability to report suspicious emails quickly, it takes only one of those 100 users to report the email and the tables are turned. The other emails can then be identified quickly and the attack contained before any damage is done, even if other users did click.
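A rough back-of-the-envelope calculation shows why this asymmetry favours the defender. The 5% per-user reporting rate below is an illustrative assumption, not a measured figure:

```python
# If each of 100 targeted users independently has even a modest chance of
# reporting a phish, the odds that at least one report reaches the security
# team are overwhelming. The 5% per-user rate is purely illustrative.

users = 100
p_report = 0.05  # assumed probability that any one user reports the email

p_at_least_one = 1 - (1 - p_report) ** users
print(f"Chance at least one user reports: {p_at_least_one:.1%}")  # ~99.4%
```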

But to get to this point, you need to think of users as having a security role and enable them to perform it. That means giving them the tools and the motivation to do the job, and it means no longer blaming the user.

Air travel, for example, has become a lot safer over the past 15 years or so, largely because of a no-blame culture that encourages everyone to report incidents and near-misses in the knowledge that no blame will be attached and their careers will not suffer.

This has given pilots, air traffic controllers and others the confidence to report their own and others’ errors and minor incidents, allowing common issues to be identified and fixed.

In the general business IT world, the end-user is all too often identified as the point of blame for an intrusion. The result is a culture of fear, with people afraid to report anything suspicious, especially if they have clicked on a link they shouldn’t have. If there is one thing we should have learned, it is that nobody is immune to social engineering. There are numerous examples of security experts and senior managers at security companies being duped, so we must accept that it is going to happen.

Just as in the aviation example, this comes down to education and appropriate reporting mechanisms. Reporting must be easy, quick and provide positive feedback. Ideally, for phishing emails, there should be a button that sends the suspicious email for automated analysis, gives the user instant feedback on whether it was safe, and automatically alerts the security operations team to any unsafe email.
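As a minimal sketch of how the back end of such a report button might hang together (every name here, including analyse and alert_soc, is an illustrative assumption rather than any real product’s API):

```python
# Sketch of a "report phishing" button's back end (all names hypothetical).
# The user's click submits the raw email; they get an instant verdict, and
# the security operations team is alerted only when the mail looks unsafe.

from dataclasses import dataclass

@dataclass
class Verdict:
    safe: bool
    reason: str

def analyse(raw_email: str) -> Verdict:
    # Placeholder analysis: a real deployment would call a sandbox or a
    # URL/attachment scanning service here.
    suspicious = "http://" in raw_email.lower() or "urgent" in raw_email.lower()
    return Verdict(safe=not suspicious,
                   reason="suspicious link or wording" if suspicious
                          else "no indicators found")

def alert_soc(reporter: str, verdict: Verdict) -> None:
    # Placeholder: in practice this would raise a ticket or a SIEM alert.
    print(f"[SOC ALERT] report from {reporter}: {verdict.reason}")

def handle_report(reporter: str, raw_email: str) -> str:
    """Called when a user clicks 'report suspicious email'."""
    verdict = analyse(raw_email)
    if not verdict.safe:
        alert_soc(reporter, verdict)  # the security team hunts the campaign
        return "Thank you - this email looks malicious and has been escalated."
    return "Thank you - this email appears safe, but you were right to check."

if __name__ == "__main__":
    print(handle_report("alice", "URGENT: click http://example.test/reset now"))
```

The design point worth noting is the instant feedback to the reporter: even a “this one was safe” response rewards the reporting behaviour you want to encourage.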

For other suspicious activity, reporting could be via a web portal linked to a ticketing system. This should always be simple and quick, with feedback to the user both on submission and, where appropriate, on closure of an incident. Anything complex and time-consuming will put off users, who have their main job to do.

In most cases, users also need to be educated on what to look for, how to react and why it matters. Phishing email training tools can be useful, but they must not be seen as “catching out” users, or as a threat. The test emails must also be varied over time, otherwise you are just teaching users to detect one style of phishing email.

Other training, around passwords and web browsing, is often best done in a home context, where the threat is to the user’s own assets or finances. A message about the consequences of an attack, and how to avoid them, is more likely to stick when it is personal.

In the end, though, security culture will not be changed just by campaigns and training. Generally, people follow the example of their colleagues and if breaking the security rules is the done thing, then the rules will get broken.

This is influenced by many factors, from the working environment to how individuals interact with each other and their seniors. Instilling a no-blame security culture and encouraging the right behaviours will take time and requires building trust.

Usually, the worst thing to do is to lock down systems so tightly that users can’t do their jobs. This encourages rule-breaking. It is much better to let users do what they need to do, warn them of risky behaviour and let them know that their actions are being monitored.

Popping up a warning that copying a confidential file is not a good idea, but allowing the user to proceed if they need to, will make people think twice about what they are doing. This is a much better learning outcome than leaving a user elated and triumphant after extracting the same file, against the rules, from a locked-down system.
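In code terms, a “warn but allow” control might look something like this sketch (a minimal illustration under assumed names; it is not how any particular monitoring product works):

```python
# Minimal sketch of a "warn but allow" control on sensitive file copies.
# is_confidential and the audit logger are illustrative assumptions, not
# any particular DLP product's API.

import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("dlp-audit")

def is_confidential(path: str) -> bool:
    # Placeholder classification; a real system would check labels or metadata.
    return "confidential" in path.lower()

def request_copy(user: str, path: str, destination: str) -> bool:
    """Warn on risky copies and record the decision, but let work proceed."""
    if not is_confidential(path):
        return True
    print(f"Warning: {path} is marked confidential. "
          "Copying it may breach policy - continue only if you need to.")
    proceed = input("Copy anyway? [y/N] ").strip().lower() == "y"
    # Audit either way, so behaviour stays visible without being blocked.
    audit_log.info("user=%s file=%s dest=%s proceeded=%s",
                   user, path, destination, proceed)
    return proceed
```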

If someone breaks the rules to get the job done, this is a failure of the system, not the individual, and they are probably not about to tell you about it. 

As a general rule, don’t blame people who honestly get things wrong. Listen and respond positively to reports of suspicious incidents and provide feedback. Peer pressure should eventually call out those who just won’t behave properly.
