Generally, people's rules of thumb are adequate. When they go wrong, the information security tendency is to bombard an audience with facts, which is an extraordinarily inefficient approach. Some facts matter more than others, and we need to identify the specific 'fulcrum facts' on which decisions hinge rather than blindly 'teaching the topic.' Often, problem behaviors can be traced to a single mistaken perception. A good example, one that leads to a whole range of problematic behaviors, is the belief that 'hackers don't target small businesses.' Information security professionals have been guilty of 'naïve realism': we've assumed that our way of looking at problems is the only correct one. Despite our good intentions, our efforts will be hit and miss if we don't understand our audience's view of the world.
The cost of our mistaken approaches to security awareness should not be underestimated. How much has been spent on the password complexity topic alone? This problem could have been solved by system design but instead we've set ourselves the goal of trying to teach every last user. The crazy world of information security is such that Schneier was criticized for pointing this out.
Safety professionals would be shocked at our endemic complacency: high-risk functions with no business benefit persist on our systems, with the potential for catastrophic failure. Why do we allow users and administrators to perform unsafe acts such as selecting passwords like 'Password1'? Next time you get on a plane, consider the effort that's gone into systematically designing out risk in areas such as pilot training and cockpit ergonomics. If security professionals designed an aircraft cockpit, they would include a 'crash plane' button on the dashboard and then spend years training people not to press it.
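To make the point concrete, here is a minimal sketch of designing out the unsafe act rather than training against it: the system simply refuses weak passwords at creation time. The denylist below is a hypothetical stand-in for a real compromised-password corpus of the kind NIST SP 800-63B recommends screening against; the function name and threshold are illustrative, not any particular product's API.

```python
# A hypothetical denylist standing in for a real compromised-password corpus.
COMMON_PASSWORDS = {"password1", "123456", "qwerty", "letmein"}

def is_acceptable(password: str, min_length: int = 8) -> bool:
    """Reject passwords that are too short or on the denylist.

    The check is enforced by the system at creation time, so the
    unsafe act ('Password1') is impossible rather than discouraged.
    """
    if len(password) < min_length:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True

print(is_acceptable("Password1"))                    # False
print(is_acceptable("correct horse battery staple")) # True
```

With a control like this in place, there is nothing to teach: the 'crash plane' button has been removed from the dashboard.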
Is it a good idea to manage human risks? Yes, absolutely. Influencing user security behavior is a very important part of any organization's defense in depth. However, it's about time we dropped the enthusiastic amateur approach. Sure, information security awareness has had its handicaps, not least a mistaken perception that changing behavior is easy. But until we acknowledge that a better understanding of user behavior is needed, and that it's not efficient to use awareness to cover up poor security design, it's the users who will suffer.
It's likely that, given the mix of specialist skills involved, there's an increasing role for information security awareness marketing agencies with experts in communications and behavioral influence. This is very different from the current situation, in which security awareness is widely seen as an IT job requiring no particular communication skills.
Is it true that security awareness has allowed inefficiencies by compensating for bad design? Yes. Is there room to improve mainstream awareness techniques? Absolutely. Should security awareness be performed with a much better understanding of the audience? Definitely. Will you hear most awareness professionals admit it? Apparently not.