For our first 2022 interview with a specialist, we speak to Sarah Janes.
Taking a departure from our usual view on Cyber Security leaders, today we look at some of the psychology and training behind cyber security. Sarah is the Owner and Founder of Layer 8, an organisation that strengthens business and societal resilience to cybercrime by empowering people to act as teams of custodians through its (soon to be accredited) Security Champions programmes.
This concept: I told someone about the risks. Therefore they’ll change behaviour.
Just knowing what the correct behaviour is does not automatically translate into doing it. If this were true, we’d all be super fit and healthy!
More needs to happen than just knowing what you should do for a change to occur.
I like to explain this using the BJ Fogg behavioural model. He says behaviour change only occurs when people are motivated, are able to perform the behaviour, and receive a prompt. When all three happen in the right way, and at the right time, we can start creating habits.
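Fogg’s model is often summarised as B = MAP (Behaviour = Motivation × Ability × Prompt). A minimal sketch of that idea, using purely illustrative numeric scores and a hypothetical threshold, not any formal scoring system from Fogg’s work:

```python
# Illustrative sketch of the Fogg behaviour model (B = MAP): a behaviour
# occurs only when motivation and ability are jointly high enough at the
# moment a prompt arrives. The scores and threshold are hypothetical.

def behaviour_occurs(motivation: float, ability: float, prompt: bool,
                     threshold: float = 1.0) -> bool:
    """Return True when a prompted person is sufficiently motivated and able."""
    if not prompt:
        return False  # no prompt, no action, however motivated the person is
    return motivation * ability >= threshold

# A well-timed prompt works when the behaviour is easy and motivation exists...
print(behaviour_occurs(motivation=0.8, ability=2.0, prompt=True))   # True
# ...but fails when the behaviour is too hard, even for a motivated person.
print(behaviour_occurs(motivation=0.9, ability=0.5, prompt=True))   # False
```

The practical takeaway mirrors the interview: if people aren’t acting, you can either raise motivation, make the behaviour easier, or improve the timing of the prompt.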
I don’t think it has to start at the C-suite.
But it is certainly a faster route to change. It also depends on the type of organisational culture you work in. For example, I have seen highly collaborative learning cultures where a group of enthusiastic Champions has driven change. They have started small and driven change outwards and upwards. However, in a hierarchical, autocratic culture, this would not work, or it would take too long.
For those organisations, change must be driven from the top.
A simple definition of culture is what people say and do to demonstrate what’s important to them, and we do look to leaders to influence our behaviours, so yes, if you can, start at the top!
Yes, if they are positioned and measured correctly.
Phishing programmes should be positioned as learning, e.g., ‘we are trying to learn what types of phish we are most susceptible to, so we can help our colleagues understand and protect against them.’
Reprimanding triggers the wrong behaviour; fear can paralyse people into doing nothing.
What we want is speedy reporting.
If we measure the time between someone identifying a potential phish (whether or not they have clicked the link) and the moment it’s reported, we encourage the right behaviour: report quickly. The business can then do something useful (e.g., focus on reducing the impact of the attack, or warn people to watch out for a phish that is doing the rounds).
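A sketch of what that measurement might look like in practice. The event data and field layout here are hypothetical, not taken from any particular phishing-simulation tool:

```python
# Illustrative sketch: measure time-to-report for a phishing exercise.
# Each event pairs the moment a phish was identified with the moment it
# was reported; the timestamps below are made up for the example.
from datetime import datetime
from statistics import median

events = [
    # (time the phish was identified, time it was reported)
    (datetime(2022, 1, 10, 9, 0),  datetime(2022, 1, 10, 9, 4)),
    (datetime(2022, 1, 10, 9, 2),  datetime(2022, 1, 10, 9, 30)),
    (datetime(2022, 1, 11, 14, 5), datetime(2022, 1, 11, 14, 6)),
]

# Minutes from identification to report -- the behaviour we want to speed up.
minutes_to_report = [(reported - identified).total_seconds() / 60
                     for identified, reported in events]

print(f"median time to report: {median(minutes_to_report):.0f} minutes")
```

Tracking a median (rather than a click rate) rewards fast reporting instead of punishing clicks, which is exactly the positioning the interview argues for.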
Obviously, we’ve had to move many things online over the past two years. But it’s important to remember online does not need to mean one way. It is still critical to interact, collaborate and engage people to discuss security.
Conversations are our catalysts for change. Conversation provides context and meaning, and allows us to ask questions.
We’ve adapted to collaborating via technology for other areas of work, and security awareness and training should be no different. We’ve supported many organisations to get their Security Champions programmes working effectively over the last few years, because Champions, who sit within a function, already understand their team and know how to collaborate with them.
Also, I love your question, but can we call them people?
We often refer to people as humans or users, which sounds really unnatural as part of a normal dialogue, e.g., we wouldn’t say ‘I was talking to a group of humans last night.’
Seriously though, it’s our job as security professionals to provide the confidence and motivation to help behaviour change occur. We need to use positive motivational language that helps people believe their contribution will make a difference; otherwise, if they’re just the weakest link, then they could argue, ‘what’s the point!’
You have to get specific.
We often start with identifying the moments that matter. A moment that matters is a point in time where a person interacts with a system or piece of data, and their behaviour choice at that moment matters. Once you have defined the moment and the ideal behaviour, you can start identifying what people need to know, what support/tools they need, prompts they need, etc. From here, it is much easier to spot opportunities to measure. For example, if a moment that matters is choosing the correct information classification, you could measure the adoption of the classification tool.
For most of our Champions clients, though, it is the continued ebb and flow of intelligence from the security team into the business and from the business back into security. This way, security can adapt its tools and processes to better support and protect the organisation.
It is essential, and links to my first point: the misconception that if you tell someone about the risks, they’ll change their behaviour.
We need to have a basic understanding of psychology to:
A – understand the best ways we can interact with our workforce. For example, how do they learn? Why are new approaches adopted or not adopted? How can we nudge people to make the right choices?
B – know what the cybercriminals know. Social engineers are masters at manipulating people’s emotions. Most training and awareness targets people’s knowledge, but using knowledge as part of a rational thinking process happens in a part of our brain called the prefrontal cortex. Most of our actions (those that are not habitual) happen in our limbic system, meaning they’re driven by our emotions. This is why, as security professionals, we can sometimes perceive that people are the problem, because ‘they know what they should do, but just don’t do it.’
This is an interesting question and links to my previous point about most training being knowledge-based. This means people may know what they should do and could probably identify the right behaviour in a quiz.
But performing the right behaviour under pressure may get a different result. By pressure, I mean when being socially engineered; it could also be just having a busy day. Knowledge is foundational. However, there need to be opportunities for people to practise through discussions or simulations. People need to know who to go to for support. Importantly, there must be an open environment to report and discuss when things go wrong, using each incident as an opportunity to learn together.