Why you shouldn't train employees for security awareness
Dave Aitel argues that money spent on awareness training is money wasted
By Dave Aitel, Immunity Inc.
July 18, 2012 — CSO —
If there's one myth in the information security field that just won't die, it's that an organization's security posture can be substantially improved by regularly training employees in how not to infect the company. [Editor's note: See Joe Ferrara's recent article 10 commandments for effective security training.]
You can see the reasoning behind it, of course. RSA got hacked via a Word document with an embedded Flash exploit. Within days, the company's entire SecurID franchise was at risk of irrelevance once the attackers had made off with the seed records that underpinned the system.
But do phishing attacks like RSA prove that employee training is a must, or just the opposite? If employees and/or executives at RSA, Google, eBay, Adobe, Facebook, Oak Ridge National Laboratory and other technologically sophisticated organizations can be phished, doesn't that suggest that even knowledgeable and trained people still fall victim to attacks?
One of the best examples of the limitations of training is West Point's 2004 phishing experiment, "Carronade." Cadets were sent phishing emails to test their security awareness. Even after undergoing four hours of computer security training, 90 percent of cadets still clicked on the embedded link.
Fundamentally, what IT professionals are saying when they ask for a training program for their users is, "It's not our fault." But this is false: a user has no responsibility for the network, and no more ability to recognize or defend against modern information security threats than a bank teller has to stop a robbery. After all, is an employee really any match for an Operation Shady RAT, Operation Aurora or Night Dragon? Blaming a high infection rate on users is misguided, particularly given the advanced level of many attacks.
I'll admit, it's hard to find broad statistical evidence that supports this point of view. Not surprisingly, security firms don't typically share data on how effective training is across an organization, the way West Point did. But I can share a few anecdotes from my company's own consulting work that should shed some light on this problem.
The clients we typically consult with are large enterprises in financial services or manufacturing. All of them have sophisticated employee awareness and security training programs in place—and yet even with these programs, they still have an average click-through rate on client-side attacks of at least 5 to 10 percent.
We also frequently conduct social engineering attacks against help desks and other corporate phone banks for customers. While the personnel in these security-sensitive roles have extensive training and are warned about social engineering attacks, the only things that stop our testers are technical measures. In other words, if a help desk employee can technically change your password without getting a valid answer from you about your mother's maiden name, then a company like Immunity will find a way to convince them to do so.
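To make the distinction concrete, here is a minimal sketch (in Python, with hypothetical names and storage) of the kind of technical measure described above: the reset tool itself refuses to change a password until the identity check passes, so the outcome no longer depends on the operator's judgment.

```python
# Hypothetical sketch: a help-desk reset tool that enforces verification in
# code rather than trusting the operator to remember to ask. All names and
# the in-memory "database" are illustrative, not any real product's API.

import hashlib
import hmac

# Hypothetical stored record: a salted hash of the user's security answer.
USERS = {
    "alice": {
        "answer_salt": b"s3lt",
        "answer_hash": hashlib.sha256(b"s3lt" + b"smith").hexdigest(),
    }
}

def verify_answer(username: str, answer: str) -> bool:
    """Check the caller's security answer against the stored hash."""
    rec = USERS.get(username)
    if rec is None:
        return False
    candidate = hashlib.sha256(
        rec["answer_salt"] + answer.lower().encode()
    ).hexdigest()
    # Constant-time comparison to avoid leaking information via timing.
    return hmac.compare_digest(candidate, rec["answer_hash"])

def reset_password(username: str, answer: str, new_password: str) -> bool:
    # The gate is enforced by the software: no verified answer, no reset,
    # no matter how persuasive the caller is on the phone.
    if not verify_answer(username, answer):
        return False
    USERS[username]["password"] = new_password  # stand-in for real storage
    return True
```

The design point is that the check cannot be skipped: a trained-but-persuadable employee has no code path that changes a password without a valid answer.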
We've also found glaring flaws in the training software used by many clients, including SQL injection, cross-site scripting and broken authentication. This is more humorous than dangerous, but it adds irony to the otherwise large waste of time these applications represent.
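For readers unfamiliar with the flaw class mentioned above, here is a hedged illustration (in Python with an in-memory SQLite database; the schema and function names are invented for this sketch) of what a SQL-injection hole in a login form looks like, next to the parameterized version that prevents it.

```python
# Illustrative only: a login check built by string concatenation (vulnerable)
# versus one using parameterized queries (safe). The table and credentials
# are made up for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('admin', 'secret')")

def login_vulnerable(name: str, pw: str) -> bool:
    # BAD: user input is spliced directly into the SQL text, so input
    # containing quotes can rewrite the query's logic.
    q = f"SELECT 1 FROM users WHERE name = '{name}' AND pw = '{pw}'"
    return conn.execute(q).fetchone() is not None

def login_safe(name: str, pw: str) -> bool:
    # GOOD: placeholders keep user input as data, never as SQL syntax.
    q = "SELECT 1 FROM users WHERE name = ? AND pw = ?"
    return conn.execute(q, (name, pw)).fetchone() is not None
```

The classic payload `' OR '1'='1` turns the vulnerable query's WHERE clause into a tautology and logs in without a password; the parameterized version treats the same string as a literal (and wrong) password.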