Why security awareness training is a waste of time

Bruce Schneier presents a very cogent and convincing argument that "security awareness training" is a waste of money -- specifically, because the benefits of "security" are intangible, while the benefits of getting your work done are apparent.

To those who think that training users in security is a good idea, I want to ask: "Have you ever met an actual user?" They're not experts, and we can't expect them to become experts. The threats change constantly, the likelihood of failure is low, and there is enough complexity that it's hard for people to understand how to connect their behavior to eventual outcomes. So they turn to folk remedies that, while simple, don't really address the threats.

Even if we could invent an effective computer security training program, there's one last problem. HIV prevention training works because affecting what the average person does is valuable. Even if only half the population practices safe sex, those actions dramatically reduce the spread of HIV. But computer security is often only as strong as the weakest link. If four-fifths of company employees learn to choose better passwords, or not to click on dodgy links, one-fifth still get it wrong and the bad guys still get in. As long as we build systems that are vulnerable to the worst case, raising the average case won't make them more secure.

The whole concept of security awareness training demonstrates how the computer industry has failed. We should be designing systems that won't let users choose lousy passwords and don't care what links a user clicks on. We should be designing systems that conform to users' folk beliefs about security, rather than forcing users to learn new ones. Microsoft has a great rule about system messages that require the user to make a decision. They should be NEAT: necessary, explained, actionable, and tested. That's how we should be designing security interfaces. And we should be spending money on security training for developers. These are people who can be taught expertise in a fast-changing environment, and this is a situation where raising the average behavior increases the security of the overall system.
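The "won't let users choose lousy passwords" idea amounts to a policy check at password-creation time. Here is a minimal sketch of that idea; the length thresholds and the tiny common-password list are illustrative assumptions, not a vetted policy.

```python
# Sketch: reject weak passwords at creation time instead of training
# users not to choose them. Thresholds and the common-password list are
# illustrative assumptions, not a recommended production policy.

COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def is_acceptable(password: str) -> bool:
    """Return True if the password clears some minimal strength rules."""
    if password.lower() in COMMON_PASSWORDS:
        return False   # on every cracker's first-guess list
    if len(password) >= 20:
        return True    # long passphrases pass on length alone
    if len(password) < 12:
        return False   # too short
    # shorter passwords must mix at least two character classes
    classes = sum([
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
    ])
    return classes >= 2

print(is_acceptable("password"))                      # False
print(is_acceptable("correct horse battery staple"))  # True
```

The point of a gate like this is that the user's judgement never enters into it: the bad choice is simply not available.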

Security Awareness Training


  1. Eh, I don’t disagree that 90% of security is not letting the dumb bastards do dumb stuff, but there is reality and there is practicality.  If you take the opinion that users are hopelessly incompetent and seek to lock them out of anything that might cause a little harm, you are going to cripple a bunch of power users.  

    I have worked inside of companies that won’t let you install anything because they took the approach that all users are worthless pieces of shit.  If you want something installed, you call IT and they do it for you after ramming your face through a bureaucratic procedure.  It sucked.  It kept my boss from downloading the latest porn spyware, but it also made me give up on improving my own productivity.  The next company I worked for had the opposite policy: you could do whatever you wanted, within reason.  Did they have more internal security issues?  Probably, but I was also able to set up a small army of scripts and programs running across the network performing useful tasks to aid me in my engineering work that saved literally millions of dollars.  Lots of engineers were able to cobble together makeshift solutions that our IT department didn’t have the resources to support, because our internal networks were light on security.

    Unless you are willing to cripple the users in the name of mindless security at all costs, you have to accept that some security is going to be reactive to stupid shit that users do.  You plan accordingly.  You can help reduce how much you have to react to by trying to train users.  “Awareness” training might not be horribly effective, but it can be incredibly cheap.  A few PowerPoint slides explaining what malware is and how to avoid it cost almost nothing.  If it keeps an extra 20% of the employees from doing something stupid, you can call it a victory.

    Where I think training is most worthwhile is when you target specific users with special needs.  If you have a VP traveling across the border into China, it is worth an hour of everyone’s time to make sure that his hard drive is encrypted, show him exactly how to use the encryption, and explain how to react if someone tries to get him to power up the computer and let them copy the contents.

    1. He’s not saying the system should lock you out. That’s not security. He’s saying the system should be designed to be smart enough to filter out your dumb behaviour while letting you do whatever you want. Click all the links you want; the computer catches the bad ones. Invent your own passwords all day long; the computer will stop you from making one that is too easy to crack. Install whatever you need; the system removes the malware.

      A robust system, not a locked-down one.

      Yes, it’s really difficult, if not impossible, to build something like that, but it should be the end goal. Something to strive for.
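      As a toy illustration of “the computer catches the bad ones”: a click could be routed through a check like the one below. The blocked hosts and the raw-IP heuristic are invented for the example; a real filter would rely on a maintained reputation feed, not a hand-written list.

```python
# Sketch: vet a URL before honoring a user's click. The blocked hosts and
# the raw-IP heuristic are made-up illustrations of the idea, not a real
# malware filter (which would use a maintained reputation feed).

from urllib.parse import urlparse

BLOCKED_HOSTS = {"malware.example", "phish.example"}  # stand-in blocklist

def allow_click(url: str) -> bool:
    """Return True if the system should let the click go through."""
    parts = urlparse(url)
    host = parts.hostname or ""
    if host in BLOCKED_HOSTS:
        return False  # known-bad destination
    if host and host.replace(".", "").isdigit():
        return False  # crude heuristic: raw-IP links are suspicious
    # refuse javascript:, file:, and other non-web schemes outright
    return parts.scheme in ("http", "https")

print(allow_click("https://example.com/report.pdf"))  # True
print(allow_click("http://phish.example/login"))      # False
```

      The user still clicks whatever they like; the system, not the user, is the one expected to exercise judgement.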

    2. Unfortunately, if the report from Mandiant is to be believed, all the engineering work produced at your second company is most likely in the hands of APT1, which saved them millions of dollars, too. Further, when a Chinese-based company enters into negotiations to partner with that company, they have an advantage of knowing exactly what your company has to offer, the company’s financial health, and which of the employees the boss is sleeping with, which will certainly save them millions on the deal.

      I think it’s better to work from the idea that IT knows there are smart users and stupid users, but they need one policy. Better to get to know those IT folks, demonstrate your ability and willingness to work with them to achieve your goals while supporting their security objectives, and see what happens. In this way, I have had success in getting custom, one-off scripts and software installed with approval of IT, maintaining an environment where other users never see the emails enticing them to click here for free porn, or, more likely, ‘click here for the latest on the project update’.

      The report:

  2. Interestingly, this is a valid argument for organizations, but SAT would nonetheless benefit many individual users who are able to follow the advice.

  3. Why are we making everything more difficult (if not impossible) to use, and blaming users, when manufacturers make deeply flawed products, continue to promote said products (e.g. DNSSEC), and even go as far as shutting down anyone who suggests that they should be repaired!

    Are we blaming users for Sony getting hacked? What about Canon producing hijackable wireless cameras? What about ADSL modems with remote admin enabled by default and standard passwords?

    Online security is an illusion, we can blame the users once we can provide them with a secure base to screw up from.

  4. Schneier’s great about insisting on realism, and he’s made similar arguments before. As have others — I’d remembered a Microsoft research article on how it’s rational for users to disregard security guidelines. https://www.schneier.com/blog/archives/2009/11/users_rationall.html

    A common security guideline is not to share or re-use passwords. But in nearly every place I’ve worked, there were significant shared resources with shared, easy-to-remember passwords. Nothing bad would come of the shared passwords — but occasionally, there would be a crisis when there was some needed resource secured by a password only known by someone on vacation.

    1. “Nothing bad would come of the shared passwords”

      …That you knew of.  That’s the bummer of it – lots of the bad stuff your security measures are there to prevent, is hard to detect if it does happen.  You just, mysteriously, get underbid on a lot of contracts, or your customers, mysteriously, get a lot of fraudulent charges on their credit cards from businesses totally unrelated to yours.

  5. one of the catch phrases that is often bandied about by security awareness detractors is that there’s no patch for stupid. if that were true, though, then the people saying it would still be stupid. nobody popped out of the womb knowing how to make a secure password or how to avoid getting scammed.

    security awareness training is actually just security training with the acknowledgement that people need to be more security aware in order to put the training to good use, but without actually addressing the awareness component of the problem.

    the big problem with security awareness training is that you can’t train awareness in others, you can only inspire it. people look at security awareness training and see it not working but assume that security awareness itself is a lost cause instead of considering whether there might be a different approach towards the same end.

    there are other approaches to getting security closer to the forefront of people’s minds. some try to use fear, but that seems problematic (the heightened alertness fear produces can only be maintained for so long before people become numb to it). using humour is something that i’m fond of (part of why i made http://www.secmeme.com).

  6. The average computer user does not understand security, does not care about security, and does not really think that anything is going to happen that will affect them directly. You cannot plan on users being security conscious; they either are or aren’t. I doubt mandatory training is going to stick with people who are more focused on doing the thing they want to do than on worrying about whether they are acting in an unsafe manner.
    Developers need to worry about security. People programming applications should be starting with a security focused framework.
    Operating system companies need to worry about security. Microsoft should have been producing an operating system whose number one priority was being as secure as possible from day one. Instead, security was probably the last of their concerns, generally speaking, and has only been bolted on to some degree in the last few versions of Windows. It should be as close to impossible to do something hazardous as they can make it; only after that should they provide the functionality that users will want. The Unixes, Linux, and OS X at least start with a filesystem that has some secure elements in its design. Most of the exploits that grant access to those OSes occur within software running on them, although even that argues they don’t go far enough.

  7. Guys, Bruce Schneier is wrong: security awareness training teaches employees to have judgement. My company KnowBe4, a security awareness training firm, disputes Bruce Schneier’s stance; the premise that people cannot be trained to exercise judgement is foolhardy and a liability to small and medium-sized enterprises. Here is our position:  http://www.prnewswire.com/news-releases/knowbe4-states-bruce-schneier-is-wrong-internet-security-trains-employees-to-have-judgement-199231341.html

    1. Well, Stu, you clearly have a financial motive to say this, and, at least at the other end of this link, all the evidence you provide is self-generated. Don’t get me wrong, I have all the respect in the world for Kevin Mitnick, but this post just looks like self-promotion to me.

Comments are closed.