Why security is in denial about awareness

Security awareness columnist Geordie Stewart explains why refusal to acknowledge legitimate criticisms of information security awareness puts users at risk

Denial has two meanings. It can refer to the refutation of an allegation or assertion. It can also refer to a psychological defense mechanism in which criticisms are rejected because they are uncomfortable, despite evidence to the contrary. How a professional group responds to criticism tells you a lot about its ability to evolve and improve.

The response of awareness practitioners to criticism of security awareness has been fascinating. In Why you shouldn't train employees for security awareness, Immunity Inc.'s Dave Aitel outlines reasons why he thinks money spent on security awareness is money wasted. That article has drawn rebuttals, such as Ira Winkler's Security awareness can be the most cost-effective security measure, along with attempts to explain that bad security awareness techniques are all in the past. However, key points have been missed in the scramble to pick peripheral holes in the criticisms.

In his blog, Schneier on Security, Bruce Schneier states that security awareness is generally a waste of time. Since there's still a majority who think that awareness campaigns are about locking people in a room for an hour and putting up a few posters, Schneier is probably right.


At the heart of this debate is a fundamental question: while many would agree that information security awareness techniques need to improve, are we talking about a few tweaks or a complete overhaul? Here is the problem: if security awareness is all about changing behavior, why don't security awareness tools and processes look anything like those of other, more mature industries that take behavioral change seriously?

Compared to other industries, the information security awareness approach to behavioral influence is an embarrassingly amateur affair. In fields such as public health and marketing, there are experts who have spent decades studying behavioral influence, testing their assumptions and making systematic improvements to their methods. The approach in these fields has led to a heavy emphasis on audience research. Why did you buy that particular product and not another? What thought processes were you following when you plugged that in? They go beyond the 'what' of behavior and seek to understand the 'why'. In contrast, information security professionals persist with the delusion that they can manage the what without understanding the why.

Many ways exist to systematically understand the why of an audience. Web designers commonly use personas. Safety risk communicators have mental models. Information security folk models have also been proposed. The reality is that people have rules of thumb that they use to make decisions, such as: Is it growling and showing its teeth? Then I'm not going to pat it. Folk models are just a way of encapsulating these decision-making processes.

Generally, people's rules of thumb are adequate. When they go wrong, the information security tendency is to bombard an audience with facts, which is an extraordinarily inefficient approach. Some facts are more important than others, and we need to identify the specific 'fulcrum facts' on which decisions hinge rather than blindly 'teaching the topic.' Often, problem behaviors can be traced to a single mistaken perception. A good example that leads to a whole range of problematic behaviors is the belief that 'hackers don't target small businesses.' Information security professionals have been guilty of 'naïve realism', assuming that our way of looking at problems is the only correct one. Despite our good intentions, our efforts will be hit and miss if we don't understand our audience's view of the world.

The cost of our mistaken approaches to security awareness should not be underestimated. How much has been spent on the password complexity topic alone? This problem could have been solved by system design but instead we've set ourselves the goal of trying to teach every last user. The crazy world of information security is such that Schneier was criticized for pointing this out.


Safety professionals would be shocked at our endemic complacency where high-risk functions with no business benefits exist on our systems with the potential for catastrophic failure. Why do we allow users and administrators to perform unsafe acts such as selecting passwords like 'Password1'? Next time you get on a plane, consider the effort that's been made to systematically design out risk in areas such as pilot training and cockpit ergonomics. If security professionals designed an aircraft cockpit they would include a 'crash plane' button on the dashboard and then spend years training people not to press it.
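The 'design out the risk' argument can be made concrete. Rather than training every user not to choose 'Password1', a system can simply refuse to accept it. The sketch below is illustrative only; the length threshold and the tiny blocklist are hypothetical stand-ins for a real policy backed by a breached-password list.

```python
# Illustrative sketch: designing out the "Password1" class of weak
# passwords at the system level, instead of training users to avoid them.
# The minimum length and blocklist below are hypothetical examples.

COMMON_PASSWORDS = {"password", "password1", "123456", "qwerty", "letmein"}

def is_acceptable(password: str) -> bool:
    """Reject passwords that are too short or on a common-password list."""
    if len(password) < 12:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True

# The system refuses the unsafe act outright -- the user never gets the
# chance to press the 'crash plane' button, and no campaign is needed.
```

A check like this runs once at password creation and removes an entire category of unsafe behavior, which is the kind of systematic risk engineering the aviation comparison describes.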

Is it a good idea to manage human risks? Yes, absolutely. Influencing user security behavior is a very important part of any organization's defense in depth. However, it's about time we dropped the enthusiastic amateur approach. Sure, information security awareness has had its handicaps, not least a mistaken perception that changing behavior is easy. But until we acknowledge that a better understanding of user behavior is needed, and that it's not efficient to use awareness to cover up poor security design, it's the users who will suffer.

It's likely that, given the mix of specialist skills involved, there is an increasing role for information security awareness marketing agencies with experts in communications and behavioral influence. This is very different from the current situation, where security awareness is widely seen as an IT job requiring no particular communication skills.

Is it true that security awareness has allowed inefficiencies by compensating for bad design? Yes. Is there room to improve mainstream awareness techniques? Absolutely. Should security awareness be performed with a much better understanding of the audience? Definitely. Will you hear most awareness professionals admit it? Apparently not.

Geordie Stewart is a regular security awareness columnist for the Information Security Systems Association Journal.
