5 ways compliance hurts security

May 30, 2019

The tasks of meeting regulatory requirements and providing true security that actually mitigates risk do not align. Here's how focusing exclusively on compliance can undermine security.


Most of us in the IT security business know that compliance isn’t the same as security. Compliance is an auditing, paperwork, checklist mentality. Security is a tactical, real-world cybersecurity, risk-reduction mentality. Compliance is “Do you have a patch management program that applies critical patches in a timely manner — yes or no?” Security is figuring out which patches to apply and when, applying those critical patches, and then re-verifying those patches are applied. One helps you pass an audit. The other actually secures you.

Both are supposedly working toward the same goal: reducing cybersecurity risk. But compliance is so much worse at it that I'm not sure it does much to reduce real risk at all. Here are five reasons why:

1. Compliance is binary

Security risk is not binary, but compliance is about binary questions and answers, yes or no. Do you or don’t you do such and such? It doesn’t allow a lot of room for outside-the-box thinking or even stronger, better security. For example, most compliance regulations require complex passwords that are eight characters or longer. Even though a 20-character, non-complex password is inarguably harder to crack and easier to use, you wouldn’t be able to use it in most organizations.
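The math behind that claim can be sketched with a quick entropy calculation. The character-set sizes below (95 printable ASCII characters for a "complex" password, 26 lowercase letters for a "non-complex" one) are illustrative assumptions:

```python
import math

def entropy_bits(charset_size: int, length: int) -> float:
    """Bits of entropy for a password chosen uniformly at random."""
    return length * math.log2(charset_size)

# 8-character "complex" password drawn from ~95 printable ASCII characters
complex_short = entropy_bits(95, 8)    # roughly 52.6 bits

# 20-character lowercase-only passphrase (26 letters)
simple_long = entropy_bits(26, 20)     # roughly 94.0 bits

print(f"8-char complex:    {complex_short:.1f} bits")
print(f"20-char lowercase: {simple_long:.1f} bits")
```

Against offline guessing, each extra bit doubles the attacker's work, so the longer, simpler password is stronger by a wide margin even though it fails a typical complexity checkbox.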

As another example, most regulations require a policy that locks out a user's account after the password has been entered incorrectly a set number of times. If you require sufficiently long passwords, you don't need the account lockout policy, and you would have lower risk without it.

Increasing the default strength of your passwords won't get you out of the requirement to enable account lockouts. In certain scenarios, account lockout policies increase the risk of a denial-of-service event. Password-guessing worms often try 100 random passwords, locking out the users whose accounts they target. Or hackers will guess passwords against an online portal, again locking those users out.

2. Compliance is not relevant

Social engineering and phishing are responsible for 70% to 90% of all malicious compromises, yet you would be hard pressed to find more than a sentence about security awareness training or social engineering in any of the regulatory guides. Unpatched software is the number-two problem, currently causing between 20% and 40% of compromises, and it usually gets multiple paragraphs. Storage encryption stops very few attacks, and yet it usually has paragraphs of recommendations around it.

If you were to read any regulation and defend yourself based on the number of inches of text devoted to a particular subject, you would think encryption was your biggest worry. Regulations aren’t meant to be measures of risk, but it seems like a problem of imbalance if I have to comply with 200 things that do very little to actually decrease risk versus the one thing that would have the biggest effect.

3. Regulations are slow to change

All regulations are slow to change when new security advice comes out. In compliance documents you’ll find lots of mention of three-legged firewalls, DMZs and floppy disks. You won’t find a lot of information regarding how to better secure your cloud interactions, multi-factor authentication, ransomware, quantum computing, password re-use, third-party vendor risks, nation-state attacks and supply chain management. The world is moving and changing. IT security is moving and changing, but not so much for regulations.

4. Compliance always wins

The problem is that when security and compliance conflict, compliance always wins. CEOs and bosses are personally accountable for making sure the organization meets all compliance objectives. They don't want to hear you explain why you had to file an audit exception because your passwords are stronger than what the compliance guide requires, since doing so makes you non-compliant with most existing regulations. Every second spent earning a checkmark on a compliance checklist is a second not spent on real computer security.

5. It’s all a lie

Here’s the biggest kicker. Everyone knows that compliance is a big sham. Everyone. Let me give you some examples. Every regulation requires that users make backups of critical systems and periodically test them. Somehow with every compliance audit I’ve seen, the audited entity says it does that. The reality is that almost no one does it, as all the successful ransomware attacks are revealing.

Yes, most entities back up most of their critical systems, but almost no one tests that a restore from those backups can successfully recover a system. Who has the time? Who has the staff to actually do this? Management is not giving IT the resources to do it. They don’t ask about it. They don’t care about it…until it’s too late. I bet 99% of the IT world has never tested more than a few backups in the history of the company, and yet almost every compliance audit has both sides agreeing that it is done. Same thing with “regular testing of controls.” Everyone claims they do it, but few do.
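As a minimal sketch of what file-level restore testing could look like, the hypothetical `verify_restore` helper below compares checksums of an original tree against a restored copy. (A real restore test should also boot and exercise the recovered system, not just compare files.)

```python
import hashlib
import pathlib

def checksum(path: pathlib.Path) -> str:
    """SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(original_dir: str, restored_dir: str) -> list:
    """Return relative paths that are missing or differ after a restore."""
    original = pathlib.Path(original_dir)
    restored = pathlib.Path(restored_dir)
    mismatches = []
    for src in original.rglob("*"):
        if src.is_file():
            rel = src.relative_to(original)
            dst = restored / rel
            # A file that is absent or whose contents changed is a failed restore
            if not dst.is_file() or checksum(src) != checksum(dst):
                mismatches.append(str(rel))
    return mismatches
```

Even a script this small, run after every restore drill, would catch the silent backup failures that ransomware victims discover only when it's too late.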

Let me give you another glaring example. Every regulation says that all critical patches should be applied in a timely manner (whatever timely means). In my 32-year career I have never seen a single entity I’ve checked be fully patched. I have never seen a single Cisco router appropriately patched in a timely manner.

I have never seen a single server fully patched. Everyone thinks they are fully patched. What they really mean is that all Microsoft patches are applied, and even that is rarely true. Even if the OS patches are applied, the server management software is out of date. Some underlying video encoder is out of date. Some of the server management tools that they have installed are out of date. By out of date, I mean they contain a publicly known vulnerability that can be used remotely to take over the server. 

Or they tell the auditor that they are 99% patched, and they can show them the report to prove it. What they don't tell the auditor is that the 1% they haven't patched are the ones most likely to be exploited by a malicious actor. Let me say it again: out of hundreds of companies and thousands of computers I've checked, none have been fully patched. Yet everyone says it's done on the compliance report.
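That "99% patched" arithmetic is easy to demonstrate. The patch inventory below is entirely hypothetical; the point is that a coverage percentage says nothing about whether the remaining gap is the one attackers actually use:

```python
# Hypothetical patch inventory: names and flags are illustrative only
patches = [
    {"name": f"patch-{i}", "applied": True, "known_exploited": False}
    for i in range(99)
] + [
    {"name": "patch-99", "applied": False, "known_exploited": True},
]

applied = sum(p["applied"] for p in patches)
coverage = 100 * applied / len(patches)

# The headline number the auditor sees
print(f"coverage: {coverage:.0f}%")

# The one unpatched, actively exploited hole the report hides
risky = [p["name"] for p in patches
         if not p["applied"] and p["known_exploited"]]
print(f"unpatched and exploited: {risky}")
```

One exploitable unpatched system is a compromise; a 99% coverage metric is a compliance artifact, not a risk measure.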

Let me give you another common compliance lie. Every regulation says that all logs have to be regularly reviewed. Some regulations say daily. Most IT shops are not even sure where all their logs are, much less able to collect and regularly review them. The average computer has dozens of logs, most of which contain information relevant to security or applications. I've never seen more than a handful of logs collected from most computers. Most of the log files go unfound and unreviewed.

No one regularly reviews every log. Can you imagine? People poring, line by line, through the firewall logs every day? Hilarious to even think about. No one has time. What most people who claim they review logs daily really mean is that they collect some of the logs on some of the computers, let an automated system scan those files for predefined critical events, and wait for it to alert them that something needs attention. Again, that's only some of the logs on some of the devices. The idea that anyone regularly reviews all log files is a pipe dream. But we all sign off on it.
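What that automated review actually amounts to can be sketched in a few lines. The pattern names and regexes here are illustrative assumptions, not rules from any real SIEM product:

```python
import re
from collections import Counter

# Hypothetical critical-event patterns; real deployments use SIEM rule sets
CRITICAL_PATTERNS = {
    "failed_login": re.compile(r"authentication failure|Failed password"),
    "privilege_escalation": re.compile(r"sudo: .* COMMAND="),
}

def scan_log(lines):
    """Count lines matching each predefined critical pattern."""
    hits = Counter()
    for line in lines:
        for name, pattern in CRITICAL_PATTERNS.items():
            if pattern.search(line):
                hits[name] += 1
    return hits

sample = [
    "sshd[123]: Failed password for root from 10.0.0.5",
    "sudo: alice : TTY=pts/0 ; COMMAND=/bin/cat /etc/shadow",
    "cron[456]: job completed",
]
print(scan_log(sample))
```

Note what this doesn't do: it only flags events someone already thought to define, on logs someone already thought to collect. Everything else passes through unread, which is exactly the gap the compliance checkbox papers over.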

In every compliance audit I've seen, the audited team knows their network is full of security holes. They believe their environment is a house of cards, and if the auditor (or attacker) pulls the right card, the whole thing will come tumbling down. The group being audited tries to steer the auditor around those holes. They pray that the audit wraps up before the auditor asks the right question or, god forbid, actually tests the control.

The auditor knows this is going on. They are just trying to perform their job and not make the customer hate them. They feel like they have won a few battles and earned their money if they come up with a few things to throw into a report. No one will feel like the audit money was well spent if they don’t at least find a few things.

But overall, it’s a sham.

I can think of more reasons why compliance hurts security, such as the effort everyone wastes reconciling multiple compliance regimes, each slightly different, or how some guidelines are overly detailed while others barely have any detail at all. The biggest problem with compliance is that it doesn't track closely enough to the underlying cybersecurity risk it is supposed to reduce. It's a shame we don't get more credit for implementing true security. It would be better if security got to win every now and then when compliance and security conflicted.


Roger A. Grimes is a contributing editor. Roger holds more than 40 computer certifications and has authored ten books on computer security. He has been fighting malware and malicious hackers since 1987, beginning with disassembling early DOS viruses. He specializes in protecting host computers from hackers and malware, and consults for companies from the Fortune 100 to small businesses. A frequent industry speaker and educator, Roger currently works for KnowBe4 as the Data-Driven Defense Evangelist and is the author of Cryptography Apocalypse.
