Insider threats have always been a major issue in cyber security. The repetitive nature of modern work, such as receiving and opening E-mails throughout the day, can foster a lax attitude towards secure behavior, as people stop paying close attention to every individual action they perform. The massive number of phishing and malware campaigns is a testament to how well these attacks still work, no matter how much time is spent on security awareness training. Despite the security industry's best efforts, it's hard for employees to stay vigilant all the time and to understand how their actions can expose the organization to risk. The current situation with COVID-19 only amplifies insider threat risks, as masses of employees have moved to working remotely, where the organization and its security teams have even less control over user behavior.
So how do we ensure, especially now, that employees behave in an appropriately secure manner at all times? The solution is not purely technological. There's no way around the fact that, in the end, people interact with a security solution, and if it is not well-designed, people will find creative ways around it, exposing the organization to risk in the process. To improve security, we must design solutions that nudge employees towards secure behavior, not just limit their ability to perform certain actions.
To understand how to influence one's behavior, we must first understand how people behave. Thankfully, there's an entire discipline dedicated to such research outside the realm of cyber: Behavioral Economics. According to Wikipedia, behavioral economics "studies the effects of psychological, cognitive, emotional, cultural and social factors on the economic decisions of individuals and institutions and how those decisions vary from those implied by classical theory". In other words, behavioral economics studies the inherent biases that influence our decisions. People are not computers; decision making isn't purely rational, it is also psychological.
Understanding these biases means that we can design solutions that address them and, as a result, promote the right, secure choices. Thankfully, such design decisions don't necessarily require a lot of resources. In fact, much research in behavioral economics has shown that this can be achieved through what is normally an afterthought in security products: the simple act of changing the wording of certain sentences can be enough.
Take, for example, the issue noted above: employees opening suspicious attachments. This is a huge risk to organizations, as sending malicious attachments is a tactic often used by threat actors of all kinds, from cybercriminals to nation-states. We can turn to behavioral economics to understand why this tactic so often works. Behavioral economics teaches us that people have an inherent optimism bias and overconfidence in their knowledge and abilities ("this will not happen to me"). In our case, this overconfidence is amplified when an employee has been opening legitimate attachments for years without any repercussions, an example of the "status quo bias": people's tendency not to change their behavior unless the incentive to do so is strong.
To prevent employees from opening such attachments, some security solutions insert a warning into all E-mail arriving from external sources. However, against the aforementioned pre-existing biases, which stack up in the attacker's favor, the warning these solutions insert is often generic: "[EXTERNAL EMAIL]" or something of the sort. While it serves the purpose of reminding employees of the nature of the E-mail, it does not address any of these biases, nor does it attempt to leverage other biases to counteract their effects. Thus, the generic warning's message is diluted.
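As a concrete sketch of this generic approach, a mail gateway hook might simply prepend a static tag to the subject of any message from outside the organization. The function name and domain below are illustrative assumptions, not a real product's API:

```python
# Minimal sketch of a generic external-mail tag, as many gateways apply it.
# The internal domain "example.com" is a placeholder.
def tag_external(subject: str, sender_domain: str,
                 internal_domain: str = "example.com") -> str:
    """Prepend a static warning tag when the sender's domain is external."""
    if sender_domain.lower() != internal_domain.lower():
        return "[EXTERNAL EMAIL] " + subject
    return subject
```

Note that the tag is identical for every external message, regardless of content or risk, which is exactly why it fades into the background over time.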
An improved warning, for example, would present the user with statistics, such as the chance that the attachment they are about to open is malicious (addressing the optimism bias and the overconfidence effect), or with information on how much a cyberattack triggered by opening the attachment would cost the organization (addressing the status quo bias by giving a powerful reason to change behavior). A warning whose content changes based on the attachment type can help ensure these warnings are noticed rather than overlooked. It may sound silly, but even requiring employees to customize their warnings with colors of their choice (from a selection of visible colors) can help ensure the warnings are heeded. This taps into the "IKEA effect": people value and respect what they have built themselves. Involving employees in choosing how the warnings are displayed could leverage this bias to some extent.
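The ideas above could be combined into a warning generated from the attachment type and the user's chosen color. The sketch below is illustrative only: the risk percentages, the breach-cost figure, and all names are assumptions for the example, not real statistics; a real deployment would derive such figures from the organization's own telemetry:

```python
# Hypothetical per-extension risk figures and breach cost, for illustration.
RISK_BY_EXTENSION = {
    ".exe": 0.60,   # assumed share of malicious samples for this file type
    ".docm": 0.35,
    ".pdf": 0.05,
}
DEFAULT_RISK = 0.02          # assumed baseline for unlisted extensions
BREACH_COST = 4_000_000      # assumed average breach cost, in dollars

def build_warning(extension: str, user_color: str = "red") -> str:
    """Build a bias-aware warning tailored to the attachment type.

    The statistic targets optimism bias; the cost figure targets status
    quo bias; the user-chosen color taps into the IKEA effect.
    """
    risk = RISK_BY_EXTENSION.get(extension.lower(), DEFAULT_RISK)
    return (
        f"[{user_color.upper()} WARNING] About {risk:.0%} of '{extension}' "
        f"attachments we scan are malicious. A successful attack could cost "
        f"the organization around ${BREACH_COST:,}."
    )
```

Because the message varies with the attachment type and reflects a choice the employee made, it is harder to tune out than a fixed "[EXTERNAL EMAIL]" tag.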
Embedding warnings in E-mails is provided here as only one example of how proper wording can address inherent biases. The concept applies to wording in other security products, as well as to any security-related communication with employees. Nor is wording the only aspect of security product design that can address employee biases; behavioral economics is not restricted to the design of security products at all, and can help shape every aspect of security that employees interact with. Behavioral economics studies what makes people behave the way they do. In a field so heavily affected by user behavior, it would be wise to incorporate it more, to improve overall security and reduce risk.