Session Video
Session Transcript:
Dinis Cruz - 00:00 Hi, welcome to this Open Security Summit session in April 2023, and we’re going to have a very cool panel around behaviour change and awareness. It’s definitely a topic that I’m quite passionate about. I think there are a lot of things that are related to it and a lot of activities that we can do that make a big difference. So I’m Dinis Cruz. I’m the Chief Scientist of Glasswall and I’m also the CISO of Holland & Barrett. I think it’s about creating a system, and then everything tends to fall into place with it, but also the reaction and the response element of it. I’m looking forward to exploring this. Maybe Tim, over to you.
Tim Ward - 00:40 Yeah. My name is Tim Ward. Our whole business is all about trying to take a more behavioral approach to security awareness and move us away from those traditional approaches where really it’s just a bit of a tick-box exercise, but we can come back to talk about that in a second.
Danusia Rolewicz - 01:02 My name is Danusia Rolewicz and I specialize in human factors, risk analysis and management, and particularly security awareness and culture change. I am contracting at the moment around various organizations to improve security posture and security culture within the organization, most recently Apple. I’ve found that there is a shift in the industry at the moment towards more empowering and empathetic approaches towards staff: employees, staff, people, rather than users. I think that’s having a really positive effect. I think the more that we focus on behaviour change and empowerment of staff rather than using that tired "weakest link" expression, the further we can get really.
Dinis Cruz - 02:03 Cool. All right, so who wants to define what we mean by behaviour change and awareness?
Tim Ward - 02:11 Danusia, do you want to go first? That's easier.
Danusia Rolewicz - 02:17 I think we all know what behaviors we want to change, right? There’s a lot of talk about click rates, but really, when you’re looking at behavior change, you are also looking at the behavior and the dynamics within the company or the organization, like trust dynamics. Rather than focusing on things like click rates, you would try to empower your staff and foster a culture of trust within the company, where instead of focusing on the click rate, you focus on the reporting rate. You give the credit and celebrate the wins for those positive outcomes and for those achievements that often fall by the wayside when you look at a traditional security audit and the metrics that go with it.
Tim Ward - 03:08 I agree with that completely. I think also, just going back to your point about there being a shift, we’ve traditionally done security awareness training and it’s just been a thing people did: partly because they got that there’s a human element and we have to help people know how to deal with the threats, and partly just because it became a compliance checkbox. If you step back and think about why you’re actually bothering doing it, well, you go back to that famous stat of 80 to 90% of attacks starting with a human. If you actually want to reduce that risk, then awareness doesn’t reduce the risk. It’s just people are aware; great, well done. You’ve got to be trying to change behavior. You’ve got to be trying to build secure habits. It might be just reinforcing habits that are already taking place.
Tim Ward - 04:03 Because if you really want to reduce risk, then it does come down to different behaviors.
Danusia Rolewicz - 04:10 Yeah, that’s right. I can see Janet’s just successfully come onto the call.
Janet Bonar Law - 04:17 My sincere apologies. I am technologically incompetent and couldn’t find the right setting in Zoom. I live my life in Teams, so Zoom is not familiar to me.
Dinis Cruz - 04:34 Just give me a quick intro, Janet, and then give us your views on behaviour change and awareness.
Janet Bonar Law - 04:41 Okay, so I work as a lead specialist in people’s cyber risk management at the Coventry Building Society. We take a very broad view of what awareness is in the Coventry, and we concentrate more on the end goal, which is that our function is to drive down risk through secure behavior change. In order to do that, we’d like to think of ourselves more as human hackers. We try to get people to do the things that we want them to do by making them want to do them.
Danusia Rolewicz - 05:25 I think that’s a really good point, Janet. I think if you make things easy and accessible for people, then you’re more likely to get them doing what you want rather than taking a top down approach to controls and behaviors.
Dinis Cruz - 05:45 We can use science for this, right? Because there’s already data that shows that, like you said, the more the users engage, the more it makes sense to them, the more their behavior is safe, the better the outcomes and the better the return on investment that we have.
Tim Ward - 06:03 I think there’s a kind of a self efficacy thing, isn’t there? If you kind of help people then get better at things, then you get that kind of positive cycle. I mean, I think one of the interesting because once you’re talking about behavior, it does make you go to the academic stuff and think about behavioral theories. One of the interesting one is protection motivation theory that has you thinking about how do you make a decision when it comes to dealing with risk. Part of that is around understanding the threat. That’s the only bit we’ve tended to focus on as awareness practitioners in the past. The other bit is the coping, like, do we know what to do? Can we cope? I think that’s the self efficacy bit. If we give people more information of how to cope with the problem. We don’t necessarily have to scare them quite so much a bit, maybe, to help them understand the risk, but we should focus more on the coping, like giving them an easy action to do.
Janet Bonar Law - 06:58 Yeah.
Danusia Rolewicz - 06:58 Also on the communication side... I think I’m muted. Am I muted? No. Also on the communication side of things, really communicate what’s in it for them, rather than going in and saying, oh well, we have this objective, we need to improve our posture by this or that statistic. How will this make your life easier? Why should you be doing this? I think a lot of security communication forgets the why, and also just the connection with people.
Dinis Cruz - 07:40 What I like about the word behavior is that ultimately what we want to create is a certain set of behaviors in organizations. I’m also a big fan of taking incidents like P3s and P4s and elevating them to P1s. In fact, the presentation before this one was really cool, because the guy who presented it was showing really good examples of real-world incidents that cost from half a million to multiple millions of pounds to the companies. He was showing all the things that the attackers did. In a way, when you look at the list, you see that every one of them was a failure of behavior in the organization. Something should have detected it; there should have been behaviors in the organization that are looking for this, that are paying attention, that look for this. I like the word behavior because we can measure it, and we can basically say, look, we want the organization to behave in a certain way, and we want the organization to behave in ways that then allow us to do X, Y, Z.
Dinis Cruz - 08:38 Then you can look at the real-world data that either backs up, or doesn’t, the behavior that you want.
Danusia Rolewicz - 08:45 I think it’s interesting that you’re saying a failure of behavior, Dinis, because, like Tim mentioned earlier, the Gartners of the world talk about 90% of incidents being attributable to human failure. I mean, humans are responsible for every point in the chain. Is it human failure to incorrectly configure a system? Is it human failure to misunderstand the culture of the organization and impose a policy or a procedure that’s probably unrealistic? Or is it the "end user", inverted commas, that failed?
Tim Ward - 09:37 Yeah, I think it’s an interesting one, isn’t it, that risk of getting into blame, starting to talk about repeat offenders and things like that when it comes to clicking links. I suppose we should recognize that we are giving people computers with email that has links in, and half the time we expect them to click them and sometimes we don’t. So it needs a bit more thought. I suppose I like Dinis’s point around measurability, because I think that is quite important if you are thinking about behavior. With old traditional awareness, all you can really measure is: have people done it or not? And you get a completion rate. It’s like, well, what does that mean? That doesn’t tell you anything. Obviously, measuring incidents is far too lagging an indicator. With behaviors, you can do more than that. You can get really observable ones.
Tim Ward - 10:32 One of the things we’re trying to do with our piece of software, Redflags, is to try and actually get right there and measure, well, how often are people doing things like plugging in a USB or uploading files? When you intervene, are you seeing that change? There are lots of ways to make this more measurable. As Danusia said earlier, start with reporting rates, not necessarily click rates, because reporting is a really positive behavior. If you’ve got a culture where that’s a high number, then you’re really onto a winner. If not, start building that up as a focus.
Danusia Rolewicz - 11:04 If you have a Champions program, or are thinking of starting one, then there are all sorts of things that you can use to join up the dots to the reporting rate. How has it changed? How many people have become involved in the Champions program? How many meetings have there been? How many times have they discussed security with their team? Join up the dots of all of those metrics and you can get an idea of why there’s been a change in reporting rates, for example, or in other measurable behaviors.
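As a minimal sketch of joining up those dots, assuming a hypothetical per-division event log (the division names, event names and numbers are illustrative, not from the session), the reporting rate and click rate could be derived like this:

```python
from collections import defaultdict

# Hypothetical event log: (division, event_type) pairs. The divisions and
# event names are illustrative only.
events = [
    ("finance", "phish_received"), ("finance", "phish_reported"),
    ("finance", "phish_received"), ("finance", "phish_clicked"),
    ("engineering", "phish_received"), ("engineering", "phish_reported"),
]

counts = defaultdict(lambda: defaultdict(int))
for division, event_type in events:
    counts[division][event_type] += 1

for division, c in counts.items():
    received = c["phish_received"] or 1  # guard against division by zero
    # The reporting rate is the positive behavior to celebrate and trend
    # over time; a click rate on its own says far less about culture.
    print(f"{division}: reporting rate {c['phish_reported'] / received:.0%}, "
          f"click rate {c['phish_clicked'] / received:.0%}")
```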
Janet Bonar Law - 11:44 That’s a really good point. This idea that we have incidents because of some kind of single point of failure and this hideous metaphor, which I loathe about being the weakest link. Humans are the weakest link. Well, that’s not true. Humans exist in a whole ecosystem of vulnerabilities, some of which are behavioral, some of which are technical, some of which are cultural. It just so happens that in a particular incident, one step, because it is never a single point of failure, one set of events happened which highlighted weaknesses in the ecosystem. And we need to recognize that.
Danusia Rolewicz - 12:32 I think that’s a really good point, Janet. Just saying the words weakest link, or failure, or whatever equivalent thereof, doesn’t send the right message to the rest of the organization. Why would anyone want to think of themselves negatively? You would never engage anyone by telling them a negative thing about themselves. If we have that behavior ourselves, of using language like that within our team, then how likely is it that it’s going to be perpetuated? We’re going to spread that message, consciously or unconsciously, through the organization and create apathy. Really, do we want to motivate, or demotivate and discourage? Language is so powerful in that, isn’t it?
Dinis Cruz - 13:27 I could be a little controversial here. I would say that 95%, if not 99%, of what we call human error are system errors, right? We put the human in a place where they can create a huge amount of problems, in a way where we don’t understand the side effects. So that’s a system error, right.
Tim Ward - 13:52 I was going to say, one of the quite important things when you start to take a behavioral approach is to think about what you’re going to do with that data. Phishing simulations arguably are a behavioral approach, but they have a tendency to then result in a bit of a stick, that "you failed, we’re going to tell you off" thing. What you need to be doing is thinking about what that data can tell you. To your point, Dinis, it can tell you that your business process is putting people in a position where the only way they can meet the deadline the boss has set is to, I don’t know, send something to their Gmail account at home. The behavioral data can give you really interesting insights. It does take a bit of a mental shift, I suppose, to think about what you are going to do with that information.
Tim Ward - 14:36 How are you going to take a more behavioral approach so that it doesn’t just become a consequences culture where you’re simply telling people off when they do things wrong?
Dinis Cruz - 14:44 I think it’s about what question you’re asking, right? For me, I always think that the users should have common sense, right? They should really understand how to report things that they observe that are a bit weird and they should be rewarded by positive behavior, right? It means that in a way, you want over reporting, right? Because in an interesting way, over reporting is a side effect of your users actually thinking, oh, this is a bit weird. Oh, this is not very good. Right in the middle of it, there should be the real scenarios, but also it gives you good data. It allows you to understand, are you an organization that has a huge amount of problem with that kind of activities, or which division of your organization has lots of problems with phishing or abuse and what other divisions don’t have, right? You can look it again at the incidents or the practices and see what do we have in place to prevent that?
Dinis Cruz - 15:42 I feel that, again, I like behavior because behavior allows us to look at incidents and ask: why didn’t we detect this here instead of there, right? Who saw this in all the activities and didn’t raise an alarm bell, so that an alarm only got raised because X, Y, Z had happened, right? You can think about what technology and what solutions you put in place so that, in a way, when something happens, you have immediate understanding of and reaction to it. Because a user clicking on a link, a user doing an action, a stressed engineer doing something and making a mistake: they’re not at fault. We put the engineers in that location, right? We put users in that environment. They’re not at fault, right? They’re just behaving. If anything, we should be celebrating the other 99% of the times when they didn’t make that mistake.
Dinis Cruz - 16:38 Right. It’s like we have this thing where we force the users to drive a car down the mountain, right, with no guard rails, at a crazy speed, and then we complain about the ones that fell over, right? Actually, you should be celebrating the ones that made it through, right? Because we gave them a really dangerous environment to operate in.
Tim Ward - 16:59 It does raise some interesting questions about how you think about security. You can have that traditional approach: well, everything’s going to be really secure if you lock everything down and control it. That’s perhaps quite an old-fashioned approach to security, but that was the way things worked. We can lock everything down, it’ll be fine; turn off the Internet, we’ll all be secure. Actually, if you’re starting to understand behaviors at the level you’re talking about, Dinis, then you can start to say, well, actually, these people over here, the way that they work, they could actually have more human-based controls. They don’t need to be locked down. We could just help them behaviorally, and therefore your whole business could work. It could access tools that add to creativity, rather than: we’re a bank, therefore we turn off the internet, we turn off everything.
Tim Ward - 17:45 They can’t do their home stuff, they can’t plug in USBs. Obviously that’s going to have an impact on an organization getting talent to some degree. I think you can create quite an interesting kind of trade off and say, well, look, can we take a more behavioral approach? I think that’s going to be hard for security. Some security professionals will find that kind of, well, hold on with this control. This is really black and white. It’s really binary. They can either do it or they can’t.
Dinis Cruz - 18:08 I think you need to add risk to it first, because behavior and risk are very connected, right? You also need to take into account the threat agents and, in a way, the regulatory compliance that might apply to you, right? But the threats are important. The threat agent is very important because the threat agent determines the level of risk that you have. If you’re in an environment with highly sophisticated, super-efficient attackers who can act crazy fast, then yes, you probably need to have more security controls in place. For the majority of organizations, that’s not where we are, right? For the majority of organizations it’s criminals, vigilantes, disgruntled employees, even the users, right? The biggest attacker the company has, when you track this properly, is the company itself. We just don’t call those cybersecurity incidents, right? But the company itself will cause a lot of damage to itself.
Tim Ward - 19:04 Yes.
Dinis Cruz - 19:06 And that’s all behavior. Right. It’s all behavior because we put users in places that they can make a lot of those mistakes. And it’s not even mistakes. It’s almost a margin of error. Right. If you force a team to operate at a certain level of conditions, you will have, say, they operate well 90% of the time, okay, there’s going to be 1% mistakes. That’s it. You have to assume that is just almost the cost of that environment that you set up in place. Right. That your behavior there determines and your risk appetite determines what that percentage is.
Tim Ward - 19:38 Right, absolutely. I mean, I’d be interested, Janet, in your world at the Coventry, in how you go about taking a behavioral approach. Is it about just walking through the risk register and asking, what’s the behavioral perspective on all of these things? Where do you start?
Janet Bonar Law - 19:58 It’s interesting, isn’t it, that Dennis mentioned how risk and response to threats varies across various pieces of an organization. For that very reason, in the Coventry, we’re moving to a more functionally aligned pattern of delivering security awareness for wanting a better word so that particularly high risk functions such as finance and the people team will have a dedicated awareness specialist who becomes almost part of the tribe. So it’s like being a cultural anthropologist. You go in and you find out how people are working and what it is they need to be doing and what support we can put in to take away as much of the pain points as possible in process and technology and then whatever remains the residual risk we address through dialogue and behavior change.
Tim Ward - 21:06 Yeah, it’s interesting because I think it does make you have to think and take a slightly different approach because for example, if you want good behaviors around passwords, then you could obviously be and obviously we focus quite a lot on nudging, but you could be intervening nudging, saying stuff around passwords. What are you going to tell them to do instead? If you haven’t got a good thing like an action you want them to do, like you haven’t got a password tool or a way of storing them, then what are you nudging towards? It’s much easier to nudge towards a good behavior than just say, don’t do that, without giving any guidance, well, what do you want me to do? It changes the way you have to think about things.
Janet Bonar Law - 21:48 I think it’s interesting you talk about passwords because we have a particularly horrible password policy, which is super complex and no password manager to assist with that at all.
Dinis Cruz - 22:03 That’s a good example of bad security. Right?
Janet Bonar Law - 22:06 Makes no sense. We have nothing to help. Instead of a helpful password manager, we have a super strict password policy where, if your password is found on the quarterly audit not to be secure, you go on a very serious naughty list and risk disciplinary action. That is shocking, and that needs to change.
Dinis Cruz - 22:34 I was in a very interesting meeting where I was saying that we should weaken our password policy if we had MFA. I was saying, if you have MFA, then in the current environment we should have long password expiration dates and we should allow for weaker passwords, compared to the freaking crazy stuff, right? I’m not saying that you allow "Password1", but I’m saying that the single most important thing to do with passwords today is multi-factor authentication, MFA.
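As a minimal sketch of the trade-off described here, with made-up thresholds (none of these numbers come from the session):

```python
def password_policy(mfa_enabled: bool) -> dict:
    """Illustrative only: relax rotation and complexity when MFA is on.

    The idea is that MFA, not password churn, carries most of the
    protection, so the policy can stop punishing users. The thresholds
    below are invented for the example.
    """
    if mfa_enabled:
        return {"min_length": 8, "expiry_days": 365, "require_symbols": False}
    return {"min_length": 14, "expiry_days": 90, "require_symbols": True}

print(password_policy(mfa_enabled=True))
```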
Danusia Rolewicz - 23:07 That’s like, how are you introducing MFA? How are you facilitating them to get on board with MFA? Because I was in a bank environment in another country and the procedure for customers to download the app and go through all of the motions to get that meant that I heard at least two different conversations where a customer had come in. It was quite open concept layout as well, on top of everything in the bank. I could hear two different people getting help from a clerk on how to set up their MFA, but in such a way that they were exposing data to anyone that could hear even a temporary password that the clerk suggested to them was everything was audible. You can have this really amazing secure app, but if people are going to find these workarounds and not be able to get on board and to download it and to get onto it.
Dinis Cruz - 24:19 Then it’s worth not usable. Absolutely right. That scales more, right, in a way which actually takes to a point where you could only have behavior if it’s usable. That’s what I like about, again, the word behavior because you’ve all seen that picture of what’s it called the gate, right? All the cars go around it, right? Because in a way to go around it, again, I like behavior because behavior you can answer why are we trying to do this? Right? Why we’re trying to do this to protect the users. Actually, again, there’s data that shows that sometimes if you make it more secure, it actually becomes less safe because people will you actually push users away. That’s why I always talk about safety versus security. When I presented the boards, I said my job here is not to make to not to have incidents and not to be secure.
Dinis Cruz - 25:18 Our job is to be safe. And that’s very different, right? Going back, Tim, to your point, you can make a great secure system that’s not connected to anything, right? It might be very secure, but it’s not very usable and it’s not very safe. It doesn’t meet the risk profile of the organization.
Tim Ward - 25:35 Your definition of safe is kind of secure and usable.
Dinis Cruz - 25:39 Well, I would say it’s aligned to the business risk appetite. Right. There’s a balance, right.
Tim Ward - 25:44 Appropriate security.
Dinis Cruz - 25:47 Here’s the irony, right? If you live in a place where you have to have a house with 25 locks and a machine gun and a guard outside, I can argue that house has a lot of security. Right? There’s a lot of security features on it. Right. But which one is safer? That one or a house that you can leave your door unlocked? Now remember that you can only leave your door unlocked if you’re in an environment, in a neighborhood that is safe. And why is safe? Because there’s a lot of other things that will do that from the geographical location, from the neighbors. There’s a lot of other things that come into play, right, to allow you to leave your door unlocked and also the kind of attackers you have there. I think you have to balance who is attacking you, what’s your requirements, why are you doing stuff, what assets you have that should determine the security and the implications of a compromise.
Dinis Cruz - 26:39 Clearly, if I have a pile of medical data and a pile of data that is not really relevant, the security requirements should be very different. Even the definition of safety is very different from one to the other. Again, context and behavior matter, right?
Tim Ward - 26:57 Yeah. I think arguably some of that is taking that risk-based approach, isn’t it? Because even ignoring behavior, I’ve been in organizations where they’ve just had these rules: well, if it’s got data, it’s got to be encrypted. You’re like, well, no, what’s the threat? Different data has different kinds of requirements. The same applies where people are involved, I suppose; it’s just thinking about what the risk profile is and which behaviors are the ones that we want to see. Then you get into the whole world of, okay, how do you get those behaviors? How do you help change and steer people to work in those ways?
Dinis Cruz - 27:39 Can we now then move to the second word, which I really like, or third actually, which is awareness? I would just throw this in: I think awareness by the users is also crazy important. Sometimes we try to get awareness by scaring people, which is not always the right way to do it. I think once the users understand what they’re dealing with, they actually tend to be very protective and they actually tend to do the right thing in how they operate.
Tim Ward - 28:09 Yeah, I suppose we quite often go back to BJ Fogg’s model on this, because from a behavior perspective it has in it that element of motivation. BJ Fogg says that any behavior needs a bit of motivation, a bit of ability, but also some prompt or trigger to act. Obviously the tools around you and your knowledge give you the ability bit, but you’ve also got to have that motivation, and some of that, I think, comes from the awareness you’re talking about, doesn’t it? You’ve got to help people understand that yes, this is a threat and it’s relevant, because if they don’t have that motivation they’re not necessarily going to act. Slightly counter to that, you’ve also got to expect that quite often people aren’t going to act securely because they’re busy, they’ve got other things to do. Therefore, when it comes to behaviour change, you want to work as hard as you can to make the easiest behavior the one you want. Passwords are a great example: how hard are you going to make it?
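For reference, Fogg’s model is commonly summarised as a behavior happening when motivation, ability and a prompt converge at the same moment; a minimal rendering of that idea:

```latex
% Fogg Behavior Model: a behavior B occurs when Motivation, Ability and a
% Prompt converge at the same moment, above the "action line".
B = f(M, A, P) \qquad \text{often abbreviated } B = MAP
```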
Danusia Rolewicz - 29:15 Yeah, and equally, going back to what you were saying, Dinis, about scaring them, I don’t think that’s ever really the right way. Fear, uncertainty and doubt, FUD, or EPPM: these are strategies that we can see being exploited historically in marketing or by social engineers. Really, why not focus more on positive empowerment? We’re pre-programmed to react more to positive prompts. Again, create security messaging, create programs where we can talk to and involve people and give them a feeling of shared responsibility, because it’s their choice to get involved. Once they have the buy-in of being involved in, for example, a Champions program, then that buy-in, that investment that they’ve made, they’re not just going to throw it away; it’s going to be used much more powerfully. Something that Tim often talks about is social proof. Once you recruit a few people and they start talking about something and what they’re getting out of it, like how they’re benefiting and growing, then other people will want to join immediately.
Tim Ward - 30:43 Yeah, social proof can really be a great, helpful thing for bringing in any behavior, and you can do things like say, well, so-and-so has reported this; have you reported anything? When it comes to phishing, even with a simple behavior you can frame it like that. We’re all social animals, so we want to be helping socially; we want to be seen to be doing what other people do. There are other cognitive biases, like reciprocity, which is kind of similar, where you can say, well, the security team have done all of this work blocking this stuff, we’ve given you this, we block this much; could you do your bit? People then feel like they ought to give back a bit. It probably depends how you word that slightly, but there are some useful ways that you can encourage those behaviors.
Dinis Cruz - 31:34 That’s a good strategy. I just want to prefix this with a very interesting paradox I’ve also seen: when you have a supply chain that goes from very small individuals to very large companies, you actually tend to have a weird situation where the ones that are most secure are, I would say, the large companies that already have regulation, compliance and bigger teams, who are actually able to take security a lot more seriously. Well, not more seriously; they actually have a lot more resources and they understand it better. The smaller companies also tend to have that, because they understand the problems and the side effects of cybersecurity in their world. Sometimes it’s the ones in the middle that tend to be the worst, who tend to actually fall into... You want to give us your question? Yeah.
Speaker 5 - 32:24 Basically, from what Dinis was saying about the appetite for security: a company might have a situation where they feel that they’re secure enough, but they might have this false sense of security mindset where they believe that they’re too small to be targeted. In essence, cybercriminals prey on exactly those companies, but they just don’t realize that. It’s just hard to really get them into that mindset of, hey, actually you should be taking things more seriously, and here’s why.
Tim Ward - 33:27 It’s a good point, Billy, and it goes back to that awareness point Dennis was making, that there is a bit of need to help people understand the threat and possibly of highlighting the risk. Not too much fear, but because just to help people understand that they might not even being targeted, but they’re going to be collateral damage, potentially. Quite often the way that attacks work is that they’re not targeting anyone, they’re targeting just general weaknesses. They’ll be firing this out across thousands of kind of endpoints or people. If your small business hasn’t got some of the basics right, you’re not the target, but you’ve become the target because you had a week.
Janet Bonar Law - 34:18 Yeah.
Danusia Rolewicz - 34:18 Especially with this ransomware-as-a-service model and the outsourcing involved in that, anyone could be targeted, just because: why not? It doesn’t take very much effort, really, on the part of the attacker, but...
Dinis Cruz - 34:37 I would follow the assets. If you go back again to behaviour change, one of the places where you can see a huge amount of positive behavior is in the third-party supply chain, where you start to see pressure from the more mature companies downwards to do the right thing. To be honest, that is correct, because ultimately it’s about understanding that you might be quite a large company, but you depend on a chain of suppliers, and if they are compromised down there, that is going to affect you. That’s also awareness for the supplier, because the company might think, well, we are a small provider, who would attack us? If they are aware, going back to the second sense of the word, that they can actually cause a huge amount of damage, and that the asset or whatever they’re providing is actually a mission-critical element of a bigger company, then they need to take it seriously.
Dinis Cruz - 35:27 Again, I think that’s a behavior you can also drive by creating economic models or social models. Again, you talk about security champions, but it’s also to do with the companies who hold key elements of the supply chain. You can push security downstream and also enable good investment in security. Going back to Billy’s point of, why should we do it? What’s the value for us? Well, guess what: companies can also get rewarded in the marketplace for having good security practices and good security investment, and there’s actually a direct correlation between security investment and good engineering, right, and good practices. Because most of the stuff that we want to do is common sense, right?
Tim Ward - 36:09 Good governance quite often means good security, yeah.
Dinis Cruz - 36:13 There’s a direct correlation even on potential due diligence. I always tell the companies I work for get us involved in due diligence because we’ll lower the price, right? Because of acquisitions, because security is a great way to do due diligence. Because you ask really good questions about how things work, not that they work, it’s about what actually is happening under the hood. A good company, when you ask how things operate, doesn’t go to the whiteboard and start drawing it, right? That’s crazy. Immaturity right. They should know how things behave. Right. I think that behavior is super critical across organization. I know who mentioned that we hacked the organization, but that’s how I think about this. We’re trying to find the ways to nudge the organization, to behave in the ways that is on their own interest, to be safe and more efficient.
Tim Ward - 37:04 Yeah, I mean, that’s an interesting, much higher, macro-level version of the behavior, isn’t it? It’s the regulatory environment that the market can provide to steer people. Obviously people like the NCSC have been trying to do that with things like Cyber Essentials, haven’t they? They’ve been trying to create a framework where, okay, now, as a supplier, you do get asked, have you got these different frameworks? And that has a positive impact. I suppose there’s then an interesting debate about where, regardless of all these frameworks, behavior becomes that entry point where people can still do something that breaks the whole model. How are you going to help those people? How are you going to help the individuals within those organizations to behave as well?
Dinis Cruz - 37:56 Well, that’s why I think we should be measuring behavior, right? Because behavior, again, driven by incidents, is real. It’s not theoretical. Basically, when something occurs let’s take Phishing as an example, right? If a company is targeting with phishing, what happens? Do people click on things and then hide it? Do people click on things and then report it? Do people don’t click on things? Right? What’s the behavior that you want from a security team? From my point of view, yes, it’s better if they don’t click, but they’re going to click on things, right? I much prefer somebody who’s comfortable enough to go to the security team and give us advanced warning that something just happened than somebody clicking, going, oh, I hope nothing went wrong, and then not give us advanced warning. We actually had an incident that was like that. We actually went back to users and says, look, the problem is not you did XYZ, the problem is you didn’t tell us that you just did that.
Dinis Cruz - 38:55 The fact that you did this, the fact that you went to a website and entered your credentials, for example: we would have done that too. I cannot look a user in the eye and say you were at fault, when members of my team could probably have fallen for the same thing, right? It looked just like the real thing; it was very authentic. You cannot expect users not to fall for that. Once they noticed that something was weird, what they should have done is tell us straight away, because then we can make sure that the user’s password is not problematic. Again, what’s the behavior you want? Do you want the users never to click on things, or do you want the users to tell you, and to be your eyes and ears on the ground when stuff is happening? I prefer the second one.
Tim Ward - 39:42 Yeah, we talked about it earlier, didn’t we? This idea of understanding what behaviors you want to see in your organization, then thinking about how you can measure them. Then there’s just thinking about what you are going to do with that. We try our hardest to intervene with behaviors before, or as, they’re happening, or we use behavioral psychology ideas like priming to steer people before the event. There are obviously tools out there that are looking at post-incident and post-event behavior. There’s just a worry in my head that you then get into this cycle of punishment with training, where you’re trying to change the behavior by going: we spotted that you did this thing, now we’re going to come after you. However hard you try, I think it’s quite hard for that not to be seen as telling someone off and trying to correct their behavior.
Tim Ward - 40:39 Whereas if you can try and do it before, or as it happens, it can be a bit more: it looks like this might be about to happen, are you sure you don’t want to do it this way? And, back to Janet’s point, you can be a bit more gentle, a bit more supportive, empathic, and help people, I suppose.
Danusia Rolewicz - 40:58 Even in the post-click scenario, I think it doesn’t have to be about training. It could just be a discussion, starting off a dialogue that could actually feed into a security champion session, or some situation where you don’t expose that person but use their experience to understand the triggers and the context that are floating around.
Janet Bonar Law - 41:37 I would argue that the post-click situation is possibly the worst time you could choose to try and deliver any kind of training, because they’ve just clicked on the email, they’re annoyed that they’ve been caught out, and they’re quite upset about it. That is not a good receptive base for new information. I would argue it’s next to pointless.
Dinis Cruz - 42:06 I completely agree. We should be thanking them. The other thing is that training should be in context with their day job. Then it makes sense, right? Otherwise, again, it doesn’t make anyone more secure. Billy, you had a follow-up question. Do you want to... yeah.
Speaker 5 - 42:22 Basically, I spent a lot of time as a truck driver. One of the things one of the companies did was, when we went to pick up our load, we would have to do our pre-trip inspection. Sometimes they would hide gift cards in places where we needed to look for our inspection, like $10 at the local coffee shop, kind of a thing, just as a reward for doing your job. Of course, if you don’t do it properly, then, well, you lose out. I’m thinking you could do the same thing in the office space for some things. For instance, set up a policy in the office where, if you have a suspicious email, you send it to suspicious@thecompanyname.com, and then IT will look at it, analyze it, see if it really is suspicious, and then maybe give that person a reward, like: hey, you protected the company by reporting this suspicious email.
Speaker 5 - 43:53 Here’s a gift card for Amazon or something along those lines. Maybe if someone at the end of the year, if someone gets whoever gets the most maybe gets some kind of a reward or something like that, just like a recognition for doing that. I think that could put people in a proactive disposition where they’re more looking for these so they can report them. Otherwise it’s just like they just don’t really care about them. Even if they do see them as suspicious they might just ignore them or whatever. This way they’re being proactive and I think that would inspire others to do the same when they see people actually getting rewarded for that.
Dinis Cruz - 44:42 Absolutely.
Danusia Rolewicz - 44:43 I love the suspicious@companyname.com address. That’s amazing.
Janet Bonar Law - 44:49 I am going to steal that immediately.
Danusia Rolewicz - 44:53 There’s a lot of things that I’ve seen around in different organizations of gamifying security. One thing is like a Pokemon Go kind of approach to reporting or flagging issues, which it depends on the type of organizational culture that you’re looking at because it might not be appropriate for everybody, but I think what you’re saying could fit in quite nicely on that gamification shelf.
Tim Ward - 45:24 Yeah, I’ve seen it done quite well with fishing where it’s less about tricking and catching out and it’s more about saying, right, we’re going to run a phishing competition. It’s going to start really easy and then it’s going to get really hard and the people who get through to the kind of spot, the last most difficult email will win a prize or something. You’ve changed it, you’ve turned it on its head, you’ve made the whole thing quite fun and interesting and people step. The only slight issue is the self selection of people who are already probably the ones who don’t need to know the trading. You have to think about what you’re trying to achieve, but it gets people liking and talking about security.
Dinis Cruz - 46:03 Tim, you stole my question, because the question I was going to ask is: when you do a phishing campaign exercise, do you tell the users and the IT team or not?
Tim Ward - 46:14 So we don’t tend to do them. We think that there are other ways to tackle this, but I think it all comes down to what you’re trying to achieve with it. Because the reason we don’t use I don’t think the click rate is so variable depending on the bait, that you have to think like, why are you doing this? What are you trying to achieve? I don’t think the training afterwards is that effective, as Janet says. It’s kind of, why are you even doing these things? Maybe it’s useful to understand the difference between different sectors of your organization, like who’s most prone to authority based type attacks, in which case, no, you wouldn’t tell them because you are trying to get some useful insightful data which you can then use. I don’t know, janet, do you tell people?
Janet Bonar Law - 47:05 I don’t tell anybody. I just set the campaign up, let it know that the box is going to be busy and hit the Go button. Really? That’s what I do.
Tim Ward - 47:17 Stand back, light the blue touch paper.
Danusia Rolewicz - 47:22 There was one organization I was at where people were actively requesting phishing campaigns, which was an indication in itself, and it was specific teams as well. I think that ties into what Tim was saying about how you might get a group that really goes for that kind of thing and really wants to prove themselves, and then maybe for the other groups or teams within the organization something else could work. Developing a tailored approach, where not everybody gets hit with the same thing or the same strategy, could be useful.
Dinis Cruz - 48:05 Yeah. Let me share a couple of things here, because I think this is on topic. Maybe some of you are already aware of this, but I think it’s quite cool. Can you see my screen? This is about the Cynefin framework. Have you heard of this framework?
Janet Bonar Law - 48:20 No.
Dinis Cruz - 48:22 If you haven’t heard about this is pretty awesome. The other one is Worthy Maps, which we’ve done a lot of sessions, but I think for this one, it’s a really good behavioral framework created by Dave Snowden that breaks the world into this different worlds. It’s very interesting to understand where you are and what behaviour change and to see in each of these. It’s a really good way of thinking about how people will react, how systems will react, how almost like trying to get an understanding of the behavior that you see is almost a consequence of the environment that exists. But the environment evolves, right? The evolvement goes from chaotic to complex to complicated and simple. Also each of these has certain resilience. A simple, for example, you might think is what you want, but actually a simple could be a monoculture. There’s one thing that happens, everybody does it, which then is very successful to massive disruption because if that thing stops working, everything so you actually sometimes fall from simple to chaotic because things can occur.
Dinis Cruz - 49:24 It’s quite interesting, again, how you respond: here you act, sense, respond; here you probe, sense and respond. So it’s quite cool. Again, I’ll put the links in there; there’s some really interesting stuff about the framework, how you behave, et cetera. I definitely recommend you see this presentation. And then the other one is...
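For reference, a minimal sketch of the domain-to-response mapping described here, with the domain labels as Dave Snowden’s framework names them (the code layout itself is just illustrative):

```python
# Cynefin domains and their response patterns, per Dave Snowden's framework.
CYNEFIN = {
    "simple":      "sense -> categorise -> respond",
    "complicated": "sense -> analyse -> respond",
    "complex":     "probe -> sense -> respond",
    "chaotic":     "act -> sense -> respond",
}

for domain, pattern in CYNEFIN.items():
    print(f"{domain:>11}: {pattern}")
```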
Tim Ward - 49:46 Yeah, looks really interesting.
Dinis Cruz - 49:48 Yeah. The other one, I saw this recently: have you heard about this one by this guy Nicholas Means, who talks about the Three Mile Island incident? It’s a great story about a nuclear incident that occurred in the US, and how you can look at it from two completely different angles. One is what happened in hindsight, almost looking for what went wrong. But then you can have another view, which he does in the second part, which is: what was the behavior that you would expect from the individuals at that moment in time, with that background, with that experience, with that information? You see that they actually did what you would expect them to do, right? Of course, in hindsight you go: why did you turn that valve? Why did you do this, why did you do that? But you have to take into account how those individuals were thinking, how they were brought up, their experience, the signals they saw; even the design of the place, where all the alarms were badly designed and had really bad usability, right?
Dinis Cruz - 50:49 It’s a great example of you say why did you miss the alarm? Okay, but if the alarm is on a table with 30 other freaking LEDs and things, and the mission critical flashing light is next to the splashing light that the elevator doesn’t work. Well, clearly that’s a problem. Right. From a Usability point of view, and there are tons of others. Right. I think these are really cool examples on behavior, that it’s about thinking about the environment that you create that then leads to the behavior of the individuals, which is, again, why I think if we can measure the behavior, you can sometimes even predict the behavior because you know that the ecosystem will occur on this in this good kind of environment.
Janet Bonar Law - 51:37 I think you touched on a really powerful thing there: the power of stories and connecting with an audience, instead of just giving the facts. If you can weave the facts into a compelling story, the narrative kind of carries them into the brain. We remember stories better than we remember lists.
Dinis Cruz - 52:01 Absolutely. The story is critical. Yeah, Simon Wardley on Wardley Maps. If you haven’t seen the sessions on Wardley Maps at the Summit, just search the Summit website; there are some amazing sessions there, right? Simon Wardley talks about how you want maps, and how maps give you context, allow you to understand things, allow you to create a story about what happens, a narrative that you then remember. Cool. Well, we’re up to the hour. This was a really good session. We need to do more of these. I love this session; it’s definitely what the Summit is all about. Any final words? Let me just go around the table. Any final words?
Danusia Rolewicz - 52:42 Yeah, I’ve got a few recommendations that I was putting in the chat as you were showing us those great resources; it’s not really working very well, but yes. Just try to understand your main audience, try to key into their needs, take the behaviour change and empathetic route, and try to marry that bottom-up approach with your top-down compliance needs. If you take that route, then you’ll see more positive impact and behaviour change than with the compliance-focused approach.
Dinis Cruz - 53:31 Tim?
Tim Ward - 53:33 Yeah, I suppose I’ve talked about this idea of thinking a bit differently about security awareness: thinking about the behaviors that you want to see, and then working out how you create a choice architecture, I suppose, where you make it easy for people to do the secure behavior. Our approach would tend to be, as you said, Dinis, to measure things, baseline, see what behaviors are taking place. Once you’ve understood that, start thinking about how you’re going to steer people in the right direction. We would talk about trying to do that as close to real time as you possibly can, so you’re steering the right behaviors. I’ll put some links in; we’ve got lots of behavioral science articles and awareness articles on our blog, so I can share that as well.
Dinis Cruz - 54:19 Yeah. Also send it to Alana, because we can add it to the page, right? Once we put the video there, we’ll put the links there too. That would be really cool. Janet, any final words?
Janet Bonar Law - 54:30 Yeah, I think in the past, awareness has sometimes been seen as a broadcast activity: we have the information and we broadcast it to everybody, and because everybody needs to know the same thing, we deliver it in the same way. Clearly we now understand that’s not the case. It’s not a broadcast activity at all. It’s a very interactive, cyber-anthropology exercise in trying to understand your audience and their needs, and fitting your communications to shape their behaviors.
Dinis Cruz - 55:08 Yeah, couldn’t agree more. Good stuff. Oh, sorry. Finish any final? You look like you’re going to chip in with a final comment.
Danusia Rolewicz - 55:17 No, I just managed to send my message, that’s all.
Dinis Cruz - 55:22 Yeah. The other one that’s really powerful is the checklist, if you’ve seen the book. What I really liked about that one was the idea that you don’t create a checklist for everything you want to do. You create a checklist to actually nudge the behaviors and the things that people miss, right? Think about a pilot’s checklist: it isn’t everything they want to do, it’s almost a checklist of the things they might forget or miss, or the things that would trigger other behaviors that then get them to do the right thing, right? Again, the way you design that system makes a big difference. So we can end on that. Thank you very much. This was a really cool session, and I’ll see you around in the other sessions of the Summit. Let’s come up with another session for the next Summit.
Dinis Cruz - 56:04 This is really good.
Tim Ward - 56:06 Thank you very much.
Dinis Cruz - 56:07 Good.
Tim Ward - 56:07 See. Bye bye.