Hacking the Mind: A Psychological Approach to the Human Aspects of Information Security
by Adam Sheehan, Behavioral Science Lead, phishd




Contents

○ Introduction

○ Awareness: an essential but limited tool

○ Attitude Change
   Knowing is not the same as caring
   Compliance vs motivation
   So how do we change attitudes to cybersecurity?

○ Process Change
   Two systems of cognition
   The need for scientific programs that target the unconscious mind
   So what are the behavioral solutions?
   How can level 3 be applied in practice?
   Changing the message
   Changing the context – priming and salience
   Alerting the unconscious

○ Bringing it all together

○ Conclusion

○ References


Introduction

Let’s start with a provocative statement: standard security awareness campaigns do very little to change how people act.

The reason for this is simple – the analytical and scientific rigor that underpins the sector’s approach to computer systems has historically not been extended to the human aspects of information security. Our industry has been enormously successful when it comes to technological security, but the factors that affect how people think and behave have generally been neglected.

The important question of whether risk is truly being reduced, thereby reducing expected organizational cost, has been lost.

Yet recent advances in the science of human behavior and decision-making mean that we can examine this vital area in a systematized manner – treating the mind like any other operating system.

This whitepaper presents our new approach: a synthesis of attitude-change psychology with a cutting-edge behavioral science perspective on the unconscious workings of the human mind. We believe that behavioral science is key to appreciating why there can often be a meaningful gap between security knowledge and security behavior, why this gap has little to do with ability or intelligence, and what we can do about it.

The resulting framework provides us with three factors that affect what an individual is likely to do when it comes to cybersecurity: knowledge, attitude, and process. It is these last two that the industry has tended to neglect, and which we explore here.

This whitepaper aims to:
• Introduce our three-level model of security behavior change;

• Demonstrate how the industry can go beyond the bare bones of security awareness training;

• Serve as a call to action to the security industry to work in a way that fits with the realities of its consumers.


1 (Tsohou, Karyda, & Kokolakis, 2015, p. 1)

Awareness: An essential but limited tool

“Standards and best practices for information security awareness programs focus on the content of the programs, without taking into consideration how individuals internalize security-related information or how individuals make security-related decisions.”1

Many of us, at one point or another, will have taken a few moments from our day to sit through a standard security awareness training program. The ensuing watercooler conversations may have gone something like this:

“Being lectured about changing passwords was something I didn’t need today.”

“Exactly – do they not realize that I have real work to do?”

“That bit about spear phishing was quite cool though, I didn’t realize they could be that hard to spot.”

“And they did say that almost everyone falls for it, so I guess it would just be normal to not spot it.”

“It’s all just compliance, really...”

“And the phishing module was delivered by a talking cartoon fish.”

“Yeah, and it’s really up to IT to keep us safe.”

Here we can see some of the common problems, narratives, and rationalizations that prevent cyber security awareness programs from having the desired effect. It’s clear that it’s nearly impossible to achieve something as deep and human as culture change by merely conveying technical knowledge. Whilst most training is quite good at imparting knowledge, very little thought tends to be given to the psychology of the people actually receiving it.

Campaigns that only aim to raise awareness and provide security advice are psychologically one-sided and incomplete, especially when it comes to changing what people actually do for the better.


[Figure: Three-stage model of security behavior]
Level 1 – Awareness: what you know
Level 2 – Attitude: what you want
Level 3 – Processes: how you think
Deeper levels of processing are responsible for greater amounts of behavior – better solutions go deeper.

Nevertheless, we consider awareness to be a crucial ‘first level’ of engagement, the potential of which can only be unlocked by going deeper. Changing attitudes towards training and the wider cybersecurity space is a crucial next step (as highlighted by our watercooler conversations above). We also need to know, at the most fundamental of levels, how people take decisions about cybersecurity.

Security behavior change can therefore be seen as having three key stages:


Awareness is clearly essential, but the value of this information is only truly realized when employees are motivated and able to apply it.

The rest of this whitepaper is an exploration of levels 2 and 3, and what this all means in practice.


Most services will focus on level one – what your employees know. This can itself be broken down into three key elements:
a) Awareness of the threat – how big is the problem, and what would the impact be?
b) Awareness of vulnerability – how likely is any of this, and how could it happen?
c) Awareness of best practice – what are we meant to do about the problem?


2 (Park & Chai, 2018, p. 4732)
3 (Kelman, 1958)
4 (Albrechtsen, 2007)

Attitude Change

Knowing is not the same as caring

Awareness campaigns typically assume that if we’re told all the risks, stats, and policies, then we’ll surely be highly motivated to do the right things. In practice, of course, it rarely works this way. Indeed, many employees at sophisticated organizations will already know the importance of complex passwords, the need to lock workstations, and the fact that opening links in emails is not always very wise. So the real value of training is not telling employees the same thing over and over again in the hope that it sticks, but actually getting them to want to do something.

Knowledge does not equal attitude – and attackers rely on this.

Compliance vs motivation

To unlock a true change in attitude, we have to get employees to really care.

Yet awareness campaigns often result in a rolling of the eyes, a feeling amongst employees that they represent a needless compliance directive from a faceless entity, and an (often justified) belief that the message isn’t really relevant to the specific organization. At best, the result will be a reluctant acceptance that the bare minimum has to be done to bypass further scrutiny – the so-called ‘compliance mindset’.

Indeed, employees with a compliance mindset, who grudgingly do what they are told because they are aware of a carrot and stick, have been found to be a significantly higher risk to their organizations than those who have ‘internalized’ the need for security.3 When security is internalized, employees feel that the benefits outweigh the costs, that secure behavior is ‘the done thing’ for people in their position, and proud that their job and organization are of sufficient importance to merit protection.

Compliance requires at least a potential for observation and scrutiny, whereas internalization leads to security even when no one is watching.4

Employee Attitudes: Compliance Mindset vs Security Internalization
Motivation: Extrinsic – punishment and reward | Intrinsic – values, esteem, and agency
Monitoring: Requires scrutiny | Little need for external pressure
Personal life: Unlikely to ‘take security home’ | Will likely protect personal data and devices
Consequence: Limited protection, cost for the organization in policing employees | Stronger protection, more mature culture

“Organizational information security is greatly influenced by employees’ willingness and attitudes”2


So how do we change attitudes to cybersecurity?

Getting employees to care about security is clearly not a simple task, and it’s clear by now that this is far more complex than ‘carrot and stick’.

Our approach to the psychology of attitude change is scientific and theoretical, boiled down to a system that specifies exactly what we need to do to change attitudes. We identified three essential theories – those most often used to study attitudes to security.

We’ve brought their distinct contributions together to explain level 2 of our unified model.

The Theories

Protection Motivation Theory5
Brief description: People protect themselves when they perceive the threat to be high, and they feel that they are able to do something about it
One key contribution: Raising threat perceptions is only half of the equation – people must also feel able to take action, which psychologists term ‘self-efficacy’

Theory of Planned Behavior6
Brief description: We plan our behavior by thinking about what we like, what we think is normal, and what we think we can do
One key contribution: Social perceptions are essential: we observe others and actively adapt our behaviors to fit in

Levels of Conformity7
Brief description: Attitudes exist on a continuum from externally imposed to internally driven
One key contribution: Genuine motivation is much more powerful than the fear of punishment, and is achievable

Out of these theories emerge eight core components that need to be addressed in order to change security attitudes. These are our eight ‘Behavioral Determinants’: factors that fundamentally determine attitudes to security.

5 (Boss, Galletta, Lowry, Moody, & Polak, 2015; Hanus, 2017; Herath & Rao, 2009)
6 (Ajzen, 1985; Ifinedo, 2012; Pattinson, Butavicius, Parsons, McCormac, & Jerram, 2015)
7 (Kelman, 1958)


8 (Blythe, Coventry, & Little, 2015, p. 107)
9 (Albrechtsen, 2007, p. 281)

The Behavioral Determinants

Perceived Severity
Description: Our feelings and perceptions about the magnitude of the threat
When it goes wrong: We may understand the threat in distant and abstract terms, but it doesn’t feel close or ‘real’
When it goes right: The threat is well understood, resonates, and means something to us

Perceived Likelihood
Description: Our feelings and perceptions about the likelihood of the problem affecting us
When it goes wrong: We may be told that not only is the threat real but it is also likely to affect us, but this may not resonate emotionally
When it goes right: Our feelings and perceptions are in proportion to the likelihood of the problem

Self-efficacy
Description: The belief that we can do something about the problem
When it goes wrong: We feel unable to do anything about the problem. ‘Avoidance’ occurs when threat perceptions are higher than our ability to act
When it goes right: We feel that we are able to meet the threat and overcome it

Response Efficacy
Description: The belief that the recommended response can do something about the problem
When it goes wrong: We lose faith in the effectiveness of our policies, or see them as overly draconian
When it goes right: We feel that the tools and policies in place to support us are effective, and proportional to the threat

Response Cost
Description: Our beliefs about the extent to which security costs us productivity, time and/or money
When it goes wrong: “Do you want me to do my job?”8 We feel that the cost to productivity provides justification for sidestepping security recommendations
When it goes right: We believe that security protocols save us time overall, allowing us to do our day job more effectively (because they really do)

Personal Responsibility
Description: The extent to which we feel personally responsible for security
When it goes wrong: “Information security is not my job. I have to concentrate on my own working tasks, and trust that the security system is in place”9
When it goes right: The extent of our personal responsibilities is clear and unambiguous

Value Alignment
Description: The extent to which secure behavior is aligned with our personal values
When it goes wrong: We feel alienated or belittled by security policy or training
When it goes right: Security aligns with our values, in particular what we feel it means to do a good job at work

Social Perceptions
Description: The way in which we consciously understand relevant social norms
When it goes wrong: “No one else does the right thing, so why should I?”
When it goes right: We believe that it’s normal for someone in our position to do the right thing
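Treated this way, the eight determinants double as an audit checklist for a campaign. As an illustrative sketch only (the names and function below are our own, not a phishd API), coverage can be checked programmatically:

```python
from enum import Enum, auto

class Determinant(Enum):
    """The eight 'level 2' Behavioral Determinants from the model above."""
    PERCEIVED_SEVERITY = auto()
    PERCEIVED_LIKELIHOOD = auto()
    SELF_EFFICACY = auto()
    RESPONSE_EFFICACY = auto()
    RESPONSE_COST = auto()
    PERSONAL_RESPONSIBILITY = auto()
    VALUE_ALIGNMENT = auto()
    SOCIAL_PERCEPTIONS = auto()

def unaddressed(campaign_covers: set) -> set:
    """Return the determinants a campaign fails to address."""
    return set(Determinant) - campaign_covers

# A fear-based campaign raises threat perceptions but ignores efficacy;
# the audit surfaces the gap (raising fear without efficacy risks avoidance).
fear_campaign = {Determinant.PERCEIVED_SEVERITY, Determinant.PERCEIVED_LIKELIHOOD}
assert Determinant.SELF_EFFICACY in unaddressed(fear_campaign)
```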


10 (Arachchilage, Love, & Beznosov, 2016)

Many security awareness programs intuitively attempt to tap into one or more of the above factors without taking such an explicitly scientific approach. But in the absence of a model like this, and the academic research behind it, we’d be groping in the dark for ways to change attitudes to security – potentially even being inadvertently counterproductive. For example, a common pitfall is to raise employees’ fear levels as high as possible, without paying particular attention to their efficacy beliefs. This leads to employees who are scared, disempowered, and ultimately avoidant of the problem – the so-called ‘Ostrich Effect’.10

A rigorously scientific approach will always beat one based solely on received wisdom.

[Figure: Levels 1 and 2 of the model]
Level 2 – Attitude: perceived severity, perceived likelihood, self-efficacy, response efficacy, response cost, personal responsibility, values, social perceptions
Level 1 – Awareness: threat, vulnerability, best practice


11 (Kahneman, 2011)

Process Change

Let’s say we stopped with changing attitudes. The conclusion would be that if we want to behave securely, and we know what we’re meant to do, then we’ve done all we can to protect ourselves from cybersecurity threats. But it doesn’t take much reflection to realize that there is clearly more to it. Let’s take the example of email phishing. Many of us (the author included) will relate to the feeling of having automatically clicked on a dodgy link in an email or text message, only to realize the mistake shortly after, slap ourselves on the head, and hastily analyze the odds of the message being part of a cyberattack. It is entirely normal to not always deliberate over every security-related decision we take – until it’s too late.

The security industry cannot be satisfied by only improving employees’ ability to take conscious, reflective decisions because the decision to click on any given link is typically taken without any conscious deliberation.

Two Systems of Cognition

This split between automatic and analytical behavior is something that psychologists have long been studying. Daniel Kahneman famously won the Nobel Prize in Economics in 2002 for his work on the fundamental difference between these modes of thinking. He demonstrated that the brain has two distinct decision-making ‘systems’: the automatic, fast and intuitive ‘System 1’, and the conscious, rational and effortful ‘System 2’.11 System 1 relies on heuristics – simple, unconscious rules of thumb that are very hard to consciously modify. We use these mental shortcuts a lot more than we think we do in all walks of life, and nowhere is this truer than in our security behaviors.

[Figure: Two systems of cognition]
System 1 – everyday decisions: automatic, unconscious, fast, error prone. Targeted by phishing; accounts for ‘around 72% of behavior’; change takes the form of ‘nudges’, both social and environmental.
System 2 – complex decisions: conscious, effortful, slow, reliable. Targeted by awareness training; accounts for ‘around 28% of behavior’; change takes the form of better knowledge and changes in motivation. Knowledge ≠ motivation.


12 (Williams, Beardmore, & Joinson, 2017, p. 415)
13 (Finucane, Alhakami, Slovic, & Johnson, 2000)

System 1 thinking is perfect in many ways for speeding up our ability to navigate the complexities of daily life, but it comes with a raft of vulnerabilities that can be systematically exploited by attackers. In this way it is analogous to any other operating system. Campaigns that stop at providing awareness and information do very little to address these unconscious decision-making vulnerabilities. The ‘psychology of a click’ therefore explains why smart and security-aware people can and do fall for a phish, as susceptibility is more than just a reflection of our conscious brainpower.

Key System 1 Security Vulnerabilities

Availability Bias
Description: We judge risk based on what information is easy to bring to mind, as opposed to the quality or relevance of the information
Examples: Thinking that all phishing emails are easy to spot, because only the bad ones are easy to bring to mind; being overly focused on certain attacks, e.g. ransomware, due to their exposure in the news
Security risk: If a risk is hard to visualize, it may not be noticed. Statistics are hard to bring to mind, and so may have little effect on automatic behaviors

Affect Heuristic
Description: Our current mood influences security decisions more than it should, meaning we take different decisions about the same thing at different times
Examples: Falling for voice phishing due to being in ‘holiday mode’ on a beach somewhere; sharing excessive information on social media due to feeling like threats don’t apply at this time
Security risk: If we’re in a great mood, we can be less critical of risk. We can also be pushed into acting without thinking if we are under stress or perceive something to come from an authority figure

Representativeness Heuristic
Description: We judge how likely something is to happen based on the extent to which it has certain noteworthy characteristics
Examples: We may rely on a ‘quick scan’ of emails looking for poor spelling or design, but missing other more important characteristics
Security risk: We can be poor at judging how likely something is to be part of a cyberattack if it doesn’t have representative attributes

“Scammers … exploit the inherent heuristics and biases that govern what has been termed System 1 processing”12


14 (Ambler et al., 2011; Parsons et al., 2017)
15 (Sheeran, 2002)
16 (Webb & Sheeran, 2006, p. 249)
17 (Ferreira, Coventry, & Lenzini, 2015; Zielinska, Welk, Mayhorn, & Murphy-Hill, 2016)
18 https://www.ncsc.gov.uk/blog-post/trouble-phishing

The need for scientific programs that target the unconscious mind

By now it has become clear that improving our analytical capacity to distinguish phishing emails from safe ones is hopeless if the decision never even reaches this level of engagement. And yet awareness campaigns are simply not designed to influence the unconscious and automatic System 1.

Research suggests that users can learn the answers to awareness questions, yet not show subsequent change to their real-world security actions.14 The problem is often not caused by a lack of user intelligence or motivation, but by the realities of the human mind – with research finding that in some cases as little as 28% of variance in behavior can be attributed to changes in intentions.15 Indeed, as one meta-analysis put it, ‘a medium-to-large change in intention leads to a small-to-medium change in behavior’.16 There is clearly then a significant proportion of real activity that has nothing to do with our conscious motivation or ability – as any security-aware user who has ever absently clicked on a malicious payload knows only too well.

All attackers try to make sure that their phishing emails evade the gaze of the conscious mind. But sophisticated phishers go further, ‘priming’ users for System 1 processing, even though they might not necessarily describe it as such. Persuasion techniques such as similarity and authority (as well as many others)17 are carefully deployed to make the emails seem like other day-to-day ‘situations’ where it would be normal and suitable to use heuristic processing without engaging higher cognitive functions.

Standard approaches can imply that the ideal mental state is one of constant vigilance, where conscious deliberation is taken over every email that we get. The problem with this is obvious to anyone who has ever had an active email account. For this reason we agree with the National Cyber Security Centre when they argue that ‘we cannot expect users to remain vigilant all the time’.18 It’s unsurprising then that the majority of awareness campaigns tend to be received with apathy or even contempt, as they tend to make unrealistic demands on the time and headspace of people who have jobs to do.

To put it psychologically, we are not recommending that ‘System 2’ be constantly switched on – this would be extremely draining, not to mention tedious, and a waste of valuable cognitive resources. The solutions can be smarter than that.


19 (Ambler et al., 2011; Dolan et al., 2012)
20 (Service et al., 2014)

So what are the behavioral solutions?

There is a need to meaningfully combine awareness with practical and powerful services that work to directly support our automatic minds. In fact, there is a huge amount that can be done to meaningfully change security behaviors – even ones rooted in the automatic ‘System 1’. This often takes the form of ‘nudging’ users in the right direction, an approach which won Richard Thaler a Nobel Prize in 2017, but sometimes employs more direct means.

We use two taxonomies developed over the last few years by the UK government’s Behavioural Insights Team, the so-called ‘nudge unit’: MINDSPACE19 and EAST.20 Each letter in these mnemonics stands for a different factor that influences unconscious decision-making.

These factors constitute our Behavioral Determinants when it comes to changing the unconscious social and cognitive processes that affect security behavior. A huge amount of research exists for each one of these Determinants – far more than can be covered here – revealing some fascinating synergies, potential contradictions, and clear opportunities. If we’re successful in influencing all of these factors, then we will have been successful in leveraging the System 1 mind in the service of organizational cybersecurity.

[Figure: The full three-level model]
Level 3 – Processes: messenger, incentives, norms/social, defaults, salience, priming, commitments, affect, ego, easy, attractive, timely
Level 2 – Attitude: perceived severity, perceived likelihood, self-efficacy, response efficacy, response cost, personal responsibility, values, social perceptions
Level 1 – Awareness: threat, vulnerability, best practice


Messenger – we are heavily influenced by who communicates information to us
Incentives – our responses to incentives are shaped by predictable mental shortcuts such as strongly avoiding losses
Norms – we are strongly influenced by what others do
Defaults – we ‘go with the flow’ of pre-set options
Salience – our attention is drawn to what is novel and seems relevant to us
Priming – our acts are often influenced by subconscious cues
Affect – our emotional associations can powerfully shape our actions
Commitments – we seek to be consistent with our public promises and reciprocate acts
Ego – we act in ways that make us feel better about ourselves


21 (UK Cabinet Office (Behavioral Insights Team), 2012, p. 3)

How can level 3 be applied in practice?

Changing the message

Message modification is one of the most widely researched applications of theory at the behavioral scientist’s disposal. Subtle modifications to messages and modes of delivery can have a surprisingly large effect on their effectiveness, working at a subliminal level to produce changes in social and cognitive processes.

It is necessary to be aware of and to proactively use these effects in all of our interactions with users, be it through classroom training, web-based training, or simulated phishing attacks themselves.

Here are three examples of how we apply the Behavioral Determinants in practice:

“Even relatively minor changes to processes, forms and language can have a significant, positive impact on behavior”21


22 (Little et al., 2017)
23 (Das, Kramer, Dabbish, & Hong, 2014)
24 (Cialdini, 2003)
25 (Dolan et al., 2012)
26 (Dolan et al., 2012, p. 268)

1. Social norms

Social norms, and the processes that govern how these are understood, are arguably the most impactful of those things which govern how likely behavior is to change.22 It follows that when a norm is desirable, we should let people know about it. One study found that the use of social prompts significantly improved the extent to which 50,000 participants chose to address social media account security settings.23

Without social proof: “Keep Your Account Safe – You can use security settings to protect your account and make sure it can be recovered if you ever lose access. [Improve Account Security]”

With social proof: “Keep Your Account Safe – 108 of your friends use extra security settings. You can also protect your account and make sure it can be recovered if you ever lose access. [Improve Account Security]”

Note how here a norm is merely implied – just because 108 of your friends performed an action doesn’t necessarily mean the action is ‘normal’ in the strictest sense. Nevertheless, the social proof was enough to make a difference, and the sole expenditure for this ‘gain’ was the effort of typing a few additional words.

It is also imperative to avoid reinforcing negative social norms – the ‘big mistake’ made by those trying to change behavior.24 For example, we’ve lost count of the number of times we’ve seen those seeking to change password behaviors start by hammering home how unusual it is for people to set adequate passwords. This almost certainly has a counterproductive effect on the chances that an individual will change password behavior, as at a basic level he or she will feel that it is normal to ignore this advice, and therefore justified in continuing to do so.25

“In recycling, when a hotel room contained a sign that asked people to recycle their towels to save the environment, 35% did so. When the sign used social norms and said that most guests at the hotel recycled their towels at least once during their stay, 44% complied. And when the sign said that most previous occupants of the room had reused towels at some point during their stay, 49% of guests also recycled”26
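As a toy illustration of this kind of message modification, both prompts above can come from one template, with the social-proof sentence included only when a genuine peer count exists (the function and wording below are our own hypothetical sketch, not the study’s implementation):

```python
from typing import Optional

def account_security_prompt(peer_count: Optional[int] = None) -> str:
    """Render the security prompt, optionally leading with social proof.

    peer_count: how many of the user's contacts already use the extra
    settings; None (or 0) means no social-proof data is available.
    """
    base = ("protect your account and make sure it can be "
            "recovered if you ever lose access")
    if peer_count:
        # Lead with the implied norm, then the same call to action.
        return (f"{peer_count} of your friends use extra security "
                f"settings. You can also {base}")
    return f"You can use security settings to {base}"

print(account_security_prompt())     # control wording
print(account_security_prompt(108))  # social-proof wording
```

Only verified counts should ever be inserted: fabricating the norm would undermine the very trust the nudge relies on.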


27 (Webb & Sheeran, 2006)
28 (Little, Sillence, & Joinson, 2017)
29 (Dolan et al., 2012, p. 268)
30 (Cialdini & Goldstein, 2004)

2. Messenger

Who the messenger is has a significant impact on the odds of a message changing what people do. We instinctively judge messengers based on the authority that we think they have, and as a result the perception of expertise has a powerful effect on the extent to which we believe what someone is saying, whether we know it or not.

One study into health behaviors found that trained facilitators were actually outperformed by research assistants in delivering healthcare interventions – surprising given that research assistants are not necessarily experts in training delivery.27 But the message had greater influence as it was seen to come from those with first-hand experience, and here this effect superseded generic techniques designed to elicit that most slippery of concepts, ‘engagement’. We leverage the importance of expertise by letting users know about our frontline experience of cybersecurity, increasing the overall potency of our recommendations.

On the other hand, security experts can sometimes be seen as out of touch with the real pressures of employees, motivated to enforce overly stringent and hassling policies.28 We take care to avoid this pitfall through the application of our ‘level 2’ determinants, especially those to do with response cost and efficacy.

3. Affect

Affect is a term broadly used by psychologists to describe the impact of emotion on cognition. By influencing how a message is likely to make people feel, we can influence the short-term emotional context in which someone takes a decision, and the long-term effectiveness of the message.

“In energy conservation, a US energy company sent statements that provided comparisons between a household’s energy use and that of its neighbors (as well as simple energy consumption information), with smiley faces if consumers were below the average. The scheme was seen to reduce energy consumption by 2% relative to the baseline.” 29

It’s remarkable that something as seemingly irrelevant as a smiley face – something most recipients presumably barely even registered – can have such a subliminal impact on choices. Note how this example seems to be at odds with the conclusions drawn in the ‘social norms’ section, in that behavior changed positively despite people being informed that they were deviating from an average. Here it seems that this effect was outweighed by a combination of affect, ego, and the effect of the messenger. Perhaps the energy company could have found a way to deploy these in a way that would have had synergy with social norms.

Back in cybersecurity, we leverage affect, along with commitment, by recommending that employees are not punished for a first offence. This typically engenders a feeling of reciprocity and quid pro quo.30 The ‘deal’ only needs to be implied, rather than made ham-fistedly explicit, because it acts through unconscious mechanisms.


31 (Bateson, Nettle, & Roberts, 2006)
32 (Little et al., 2017)
33 (Morewedge & Kahneman, 2010)

Changing the context – priming and salience

Intuitive decision-making is highly influenced by environmental and contextual factors. One renowned experiment found that when an image of a pair of eyes was placed above an honesty box collecting payment for coffee, those paying for their drink paid nearly three times more than when an image of a bunch of flowers was used instead.31 This isn’t rational: a printed-out picture of eyes is hardly likely to report any cheapskates or take punitive action. It’s also highly unlikely that those who took part believed that changing these images affected their behavior in any way, let alone caused them to pay on average three times more for their drink. But the image of eyes primed System 1 to associate the environment with memories and images of ‘surveillance’, and perhaps also ‘honesty’, and thus the context meaningfully changed financial behavior.

phishd’s Outlook plugin is one example of how an understanding of contextual effects (‘priming’ and ‘salience’ in our model) can be leveraged to help users. The plugin’s primary function is the reporting and assessment of suspicious emails, and it is an integral part of empowering individual users to improve the security of their organization. However, its mere presence also has a useful secondary function: it changes the context in which users view emails, promoting the idea of ‘security’ and associated images in the intuitive mind – just as the image of eyes did for ‘surveillance’ in the honesty box study. By sitting in the peripheral vision of users as they receive their emails, the plugin works as a powerful but unconscious visual cue. Security is kept in mind, but in a latent way which doesn’t place stressful demands on the user.

Some approaches to System 1 behavior change suggest incorporating ‘minor hassles’ into key decision-making points in order to slow down processing and force the engagement of (probably quite irritated) higher-order cognition.32 This is not a method we favor. It’s important to us that our solutions are hassle-free and seamlessly integrated into the user’s daily workflow – that they ‘make it easy’.

Not all point-in-time training is created equal, of course, and it is important to map out and analyze the relative importance of specific attributes of phishing emails so that we can broaden the scope of ‘red flags’ beyond the obvious things such as spelling and dodgy URLs.

Alerting the unconscious

While most of us will be familiar with the sinking feeling of realizing we’ve downloaded something unpleasant, even more will have experienced the relief of having stopped ourselves just in time. We can increase the odds of stopping ourselves just in time by ‘nudging’ System 1 to bring in our critical mind at just the right moments. Our intuitive processes work via associative memory33 and we can leverage the mind’s unconscious associations to create ‘red flags’ at the appearance of certain cues. These ‘red flags’ roar up through our unconscious and cause us to stop, slow down and think – and once we’re at this point the chances of doing anything unwise are vastly reduced.

This also goes some way to explaining the popularity of ‘point-in-time’ training, where training is delivered when an employee clicks on a simulated phishing link, as most people intuitively understand that this mode of delivery creates a powerful association between clicking on a phishing email and a dramatically unexpected outcome. This is effectively a form of ‘classical conditioning’ in which an association is formed between a stimulus (a link embedded in an email) and an outcome (a surprising and unexpected result), leading to a hypothesized change in instinctive behavior. By doing this we increase the odds of reflection occurring ‘just in time’.
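The timing constraint is the heart of this conditioning mechanism: the surprising outcome must follow the click immediately, in the same interaction. A minimal, entirely hypothetical Python sketch of such a click handler follows – the endpoint path, function names, and data shapes are our own illustrative assumptions, not a description of phishd’s implementation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ClickEvent:
    """A single click on a simulated phishing link."""
    user_id: str
    campaign_id: str
    clicked_at: datetime

def record_for_metrics(event: ClickEvent) -> None:
    # Placeholder: a real platform would persist the event for
    # aggregate risk reporting rather than print it.
    print(f"{event.user_id} clicked campaign {event.campaign_id}")

def handle_simulated_click(user_id: str, campaign_id: str) -> dict:
    """Handle a click on a simulated phishing link.

    Classical conditioning hinges on the outcome (the training page)
    following the stimulus (the click) immediately, so the redirect is
    returned in the same request rather than queued as a later email.
    """
    event = ClickEvent(user_id, campaign_id, datetime.now(timezone.utc))
    record_for_metrics(event)
    return {
        "redirect": f"/training/point-in-time?campaign={campaign_id}",
        "message": "This was a simulated phishing email.",
    }
```

The design choice worth noticing is that measurement and training are coupled in one step: the same click that feeds the metrics also triggers the moment of reflection.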


Bringing it all together

By bringing together three distinct influences on human behavior – knowledge, attitude and process – we can robustly model how to assess and modify organizational security culture. This approach draws on two sides of behavioral psychology which have often been framed as opposing each other,34 as both offer valuable tools with which to drill into the mechanics of security behavior.

This whitepaper has been about the factors affecting individual behavior, and it’s important to emphasize that this is not all there is to changing culture and reducing risk. Organizational factors such as process and technology are equally important, as illustrated in the accompanying diagram. Only by effectively combining all of these elements can real behavioral change be achieved.

34 This reflects the general issue that in academia disagreement typically offers a better path to notability than agreement, which somewhat reduces the odds of any external party wanting to pay attention

[Diagram: the three-level model of behavioral determinants.
Level 1 – Awareness: threat, vulnerability, best practice.
Level 2 – Attitude: values, personal responsibility, perceived likelihood, self-efficacy, social perceptions, response cost, perceived severity, response efficacy.
Level 3 – Process: messenger, incentives, norms/social, defaults, salience, priming, commitments, affect, ego, easy, attractive, timely.]


Conclusion

This paper has aimed to shed light on the scientific underpinnings of the phishd offering, rather than delving into the mechanics of our services. By focusing on the science, we hope to have equipped cybersecurity professionals with a psychological vocabulary for discussing security behavior change, enabling them to express concerns and suggest improvements with even greater authority.

For phishd, the emphasis has always been on delivering real cultural change, and a resulting meaningful reduction in the probability of a breach. The science of behavior is one of the key tools in our arsenal that allows us to measurably improve employee security culture for our clients.

For further information, including about the specifics of the phishd offering, please contact the author, Adam Sheehan, at [email protected].


References

Ajzen, I. (1985). From Intentions to Actions: A Theory of Planned Behavior. Action Control, 11–39. https://doi.org/10.1007/978-3-642-69746-3_2

Albrechtsen, E. (2007). A qualitative study of users’ view on information security. Computers and Security, 26(4), 276–289. https://doi.org/10.1016/j.cose.2006.11.004

Ambler, T., Braeutigam, S., Stins, J., Rose, S., Swithenby, S., Bates, B., … Head, J. (2011). MINDSPACE: Influencing behaviour through public policy. Marketing Letters, 33, 1–50. https://doi.org/10.1111/j.1753-4887.2009.00206.x

Arachchilage, N. A. G., Love, S., & Beznosov, K. (2016). Phishing threat avoidance behaviour: An empirical investigation. Computers in Human Behavior, 60, 185–197. https://doi.org/10.1016/j.chb.2016.02.065

Bada, M., Sasse, A., & Nurse, J. R. C. (2015). Cyber security awareness campaigns: Why do they fail to change behaviour? Proceedings of the International Conference on Cyber Security for Sustainable Society, (July), 118–131.

Bandura, A. (2006). Toward a Psychology of Human Agency. Perspectives on Psychological Science, 1(2), 164–180. https://doi.org/10.1111/j.1745-6916.2006.00011.x

Bateson, M., Nettle, D., & Roberts, G. (2006). Cues of being watched enhance cooperation in a real-world setting. Biology Letters, 2(3), 412–414. https://doi.org/10.1098/rsbl.2006.0509

Blythe, J. M., Coventry, L., & Little, L. (2015). Unpacking security policy compliance: The motivators and barriers of employees’ security behaviors. Eleventh Symposium On Usable Privacy and Security (SOUPS 2015), 103–122.

Boss, S. R., Galletta, D. F., Lowry, P. B., Moody, G. D., & Polak, P. (2015). What Do Systems Users Have to Fear? Using Fear Appeals to Engender Threats and Fear that Motivate Protective Security Behaviors. MIS Quarterly, 39(4), 837–864. https://doi.org/10.25300/MISQ/2015/39.4.5

Cialdini, R. B. (2003). Crafting normative messages to protect the environment. Current Directions in Psychological Science. https://doi.org/10.1111/1467-8721.01242

Cialdini, R. B., & Goldstein, N. J. (2004). Social Influence: Compliance and Conformity. Annual Review of Psychology, 55(1), 591–621. https://doi.org/10.1146/annurev.psych.55.090902.142015

D’Arcy, J., Herath, T., & Shoss, M. K. (2014). Understanding Employee Responses to Stressful Information Security Requirements: A Coping Perspective. Journal of Management Information Systems, 31(2), 285–318. https://doi.org/10.2753/MIS0742-1222310210

Das, S., Kramer, A. D. I., Dabbish, L. A., & Hong, J. I. (2014). Increasing Security Sensitivity with Social Proof. Proceedings of the 2014 ACM SIGSAC Conference on Computer and Communications Security - CCS ’14, 739–749. https://doi.org/10.1145/2660267.2660271

Dolan, P., Hallsworth, M., Halpern, D., King, D., Metcalfe, R., & Vlaev, I. (2012). Influencing behaviour: The mindspace way. Journal of Economic Psychology, 33(1), 264–277. https://doi.org/10.1016/j.joep.2011.10.009

Ferreira, A., Coventry, L., & Lenzini, G. (2015). Principles of Persuasion in Social Engineering and Their Use in Phishing. Human Aspects of Information Security, Privacy, and Trust, 36–47. Retrieved from http://publications.uni.lu/bitstream/10993/20301/1/FerreiraAna-CameraReady.pdf

Finucane, M. L., Alhakami, A., Slovic, P., & Johnson, S. M. (2000). The Affect Heuristic in Judgments of Risks and Benefits, 1–17. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.589.6788&rep=rep1&type=pdf

Hanus, B. (2017). Impact of Users’ Security Awareness on Desktop Security Behavior: A Protection Motivation Theory Perspective. Information Systems Management, 33(1), 2–16. https://doi.org/10.1080/10580530.2015.1117842

Herath, T., & Rao, H. R. (2009). Protection motivation and deterrence: A framework for security policy compliance in organisations. European Journal of Information Systems, 18(2), 106–125. https://doi.org/10.1057/ejis.2009.6


Ifinedo, P. (2012). Understanding information systems security policy compliance: An integration of the theory of planned behavior and the protection motivation theory. Computers and Security, 31(1), 83–95. https://doi.org/10.1016/j.cose.2011.10.007

Kahneman, D. (2011). Introduction. In Thinking, Fast and Slow. London: Penguin Books.

Kelman, H. C. (1958). Compliance, identification, and internalization: Three processes of attitude change. The Journal of Conflict Resolution, 2(1), 51–60.

Kirlappos, I., Parkin, S., & Sasse, M. A. (2014). Learning from “Shadow Security”: Why understanding non-compliant behaviors provides the basis for effective security. USEC ’14, (February), 1–10. https://doi.org/10.14722/usec.2014.23007

Kirlappos, I., Parkin, S., & Sasse, M. A. (2015). “Shadow security” as a tool for the learning organization. ACM SIGCAS Computers and Society, 45(1), 29–37. https://doi.org/10.1145/2738210.2738216

Little, L., Sillence, E., & Joinson, A. (2017). Behaviour Change Research and Theory (1st ed.). Oxford: Elsevier.

Morewedge, C. K., & Kahneman, D. (2010). Associative processes in intuitive judgment. Trends in Cognitive Sciences, 14(10), 435–440. https://doi.org/10.1016/j.tics.2010.07.004

Park, M., & Chai, S. (2018). Internalization of Information Security Policy and Information Security Practice: A Comparison with Compliance. Hawaii International Conference on System Sciences, 9, 4723–4731.

Parsons, K., Calic, D., Pattinson, M., Butavicius, M., McCormac, A., & Zwaans, T. (2017). The Human Aspects of Information Security Questionnaire (HAIS-Q): Two further validation studies. Computers and Security, 66, 40–51. https://doi.org/10.1016/j.cose.2017.01.004

Pattinson, M., Butavicius, M., Parsons, K., Mccormac, A., & Jerram, C. (2015). Examining Attitudes toward Information Security Behaviour using Mixed Methods. Proceedings of the Ninth International Symposium on Human Aspects of Information Security & Assurance (HAISA 2015), (Haisa), 57–70.

Service, O., Hallsworth, M., Halpern, D., Algate, F., Gallagher, R., Nguyen, S., … Kirkman, E. (2014). EAST: Four simple ways to apply behavioural insights. Retrieved from http://behaviouralinsights.co.uk/publications/east-four-simple-ways-apply-behavioural-insights

Sheeran, P. (2002). Intention—Behavior Relations: A Conceptual and Empirical Review. European Review of Social Psychology, 12(1), 1–36. https://doi.org/10.1080/14792772143000003

Tsohou, A., Karyda, M., & Kokolakis, S. (2015). Analyzing the role of Cognitive and Cultural Biases in the Internalization of Information Security Policies: Recommendations for Information Security Awareness. Computers & Security, 52, 1–24. https://doi.org/10.1016/j.cose.2015.04.006

UK Cabinet Office (Behavioral Insights Team). (2012). Applying Behavioral Insights to reduce Fraud, Error and Debt. Cabinet Office, London.

Webb, T. L., & Sheeran, P. (2006). Does changing behavioral intentions engender behavior change? A meta-analysis of the experimental evidence. Psychological Bulletin, 132(2), 249–268. https://doi.org/10.1037/0033-2909.132.2.249

Williams, E. J., Beardmore, A., & Joinson, A. N. (2017). Individual differences in susceptibility to online influence: A theoretical review. Computers in Human Behavior, 72, 412–421. https://doi.org/10.1016/j.chb.2017.03.002

Zielinska, O. A., Welk, A. K., Mayhorn, C. B., & Murphy-Hill, E. (2016). A temporal analysis of persuasion principles in phishing emails. Proceedings of the Human Factors and Ergonomics Society, 764–768. https://doi.org/10.1177/1541931213601175


Measurably improve employee security behavior

[email protected]

www.phishd.com
