Phishing tests are a useful exercise, but don’t overdo it
The vast majority of cyber attacks start with a phish, so it’s not surprising that phishing tests form part of cyber training plans. But sometimes these tests go too far. Cyberis’ Gemma Moore looks at how to avoid the pitfalls
Last year’s Verizon Data Breach Investigations Report found that 80% of all cyber security attacks started with a phishing email enticing the recipient to click a link, open a document or download a file, so it is no surprise that companies want to raise awareness and test how good their employees are at spotting fakes. And as cyber criminals get more sophisticated at social engineering, simulated attack exercises need to accurately emulate the tactics, techniques and procedures used by real attackers.
The best hooks for a phishing campaign tend to be those with the most emotional impact on the target: ones that play on our fears and anxieties, or that offer hope, reward or an incentive.
During the pandemic, cyber criminals have not held back, taking full advantage of the crisis with offers of fake payments, discounted Covid-19 tests, PPE and vaccinations. At the coronavirus’s peak in 2020, Kaspersky identified more than 5,000 pandemic-related phishing websites.
But when it comes to designing phishing tests to see how good your staff are at spotting fakes, where do you draw the line? West Midlands Trains hit the headlines last year, accused of overstepping the mark. The company emailed about 2,500 employees thanking them for their hard work during the pandemic and offering a one-off payment as a reward.
But those who clicked the link to read a message from the managing director received a reply telling them it was a company-designed “phishing simulation test” and that there was no bonus.
That incident inevitably generated much criticism and debate – not least among the companies and internal security teams that run these simulated exercises. Every hook we use for a social engineering test has the potential to upset the person targeted, so there must be a balance between the realism of a simulation and the obligation to protect employees from stress or harm.
Morally, we have a duty of care to those we target, which means we should minimise distress. But it’s a grey area that cyber security professionals have to navigate with their customers.
Rather than asking whether phishing tests like the one run by West Midlands Trains are morally wrong, it may be better to ask whether they are counterproductive. Cyber security is a whole-business problem, not just an IT problem, and the best security outcomes are achieved with a positive and engaged workforce. If these exercises leave employees feeling upset, aggrieved and less valued, they ultimately have the potential to damage overall cyber security resilience.
Of course, the counter-argument is that cyber criminals have no qualms about which emotions they will play with to achieve their ends – which can be devastating for companies and their employees. Social engineering is all about lying to people, abusing trust or betraying relationships. These are tactics adversaries use without consideration for their victims, and if we are to accurately simulate the attack chain, we need to adopt these techniques as well.
Although this is a valid point, it is not in any of our interests to lose sight of the wellbeing of our human victims when simulating advanced adversaries. A better way to get that message across to the user population is to educate them about the types of psychological pressure that criminals tend to apply.
Even a zero response rate on any type of phishing exercise is no reason to expect that your staff are safe from the risks of social engineering from a determined adversary. It can be a helpful metric to measure, but an attacker only needs a single foothold into an organisation to cause chaos. Real resilience lies in defence in depth and a layered suite of controls.
Breaching trust with employees by neglecting the care of those who have fallen victim to social engineering attacks can damage the very relationships that you rely on to protect your assets and data. While social engineering must be simulated to accurately reproduce the attack chain, we must be mindful of the human cost and take steps to minimise the damage to victims. This is a landscape painted in shades of grey, but we forget the humanity of our targets at our peril.
Gemma Moore is a director at Cyberis