2 critical ways regulations and frameworks weaken cybersecurity

Security regulations and frameworks are good and necessary, but they can be inflexible and draw focus away from the most significant security risks.

I’m a big believer in regulations and frameworks. Early on I wasn’t. When you’re young, just starting to cybersleuth, you feel like you can take on the world. You can hack anything. You can prevent anyone from hacking you. Policies and frameworks were for the losers who couldn’t secure their way out of a paper bag.

Then you learn that, yeah, you might be able to secure your computers, but it doesn’t scale once you get past five devices. You certainly can’t manually secure 100 computers perfectly over the long term. A thousand computers? Fuhgeddaboudit. You learn that all the smarts and talent in the world don’t mean a thing if you can’t put your ideas and practices into a document that people and devices then follow. You can’t get true long-term security without written policies and procedures.

That concept continues as you scale past a single company. You can secure a single organization with written policies and procedures, but it takes industry or government regulations and frameworks to secure everyone. Good, long-term security for the entire macrocosm will not happen without regulations and frameworks that companies are forced to follow. Voluntary participation does not work for computer security.

As flawed as some regulations and frameworks are, they can only help give us better computer security. As I’ve matured, I’ve come to love NIST, ISO, PCI-DSS, HIPAA, NERC, SOX and all the other legal requirements and frameworks I used to complain about. Sure, I still have big issues with them, especially when they become rudimentary checkoff documents instead of real security. Flaws and all, they are a way toward better computer security.

Two things still bug me about regulations and frameworks: lack of agility and lack of focus.

Cybersecurity regulations and frameworks restrict agility

By their very nature, regulations and frameworks are slow and inflexible. When better ideas come out or circumstances change to reveal a better solution, they aren’t quickly updated to follow that better advice. For example, NIST has been saying for years (in Special Publication 800-63-3, Digital Identity Guidelines) that organizations should not require passwords to be overly complex or frequently changed. Despite that strong federal guidance, every single regulation and framework currently in place requires complex and frequently changing passwords. After talking with several regulatory bodies, I don’t see any evidence that the old, weaker password advice they require will change anytime soon. It’s clear in this case that regulatory requirements are actually weakening our overall computer security.

Granted, many computer security professionals don’t believe that NIST’s new password policy advice is actually better. Most would rather you use a password manager, which you then use to generate long, random passwords for every website you use. Yet most regulations and frameworks don’t mention password managers. So you’ve got two camps of password advice givers: those who believe in and want to follow NIST, and those who say you should have even longer and more complex passwords generated by a password manager. None of the current regulations and frameworks actively support either approach.
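To make that concrete, here is a minimal sketch in Python (using the standard library’s secrets module) of the kind of long, random, per-site password a password manager generates. The 20-character length and the character set are arbitrary choices for illustration, not something any regulation or specific product prescribes:

    import secrets
    import string

    def generate_password(length: int = 20) -> str:
        """Return a random password, roughly what a password manager would produce."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    # One unique, random password per site: never reused, never memorized.
    print(generate_password())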

That is the very nature of written policies, regulations, and frameworks. They are slow. Maybe that is a good thing. If regulations and frameworks changed on a cultural whim, perhaps we’d all be whiplashed and beaten down by the constant switching back and forth.

Another part of the same problem is the lack of flexibility, even when the security policy an organization wants to follow is better and stronger than the requirement. For example, a 20-character, non-complex password is demonstrably harder to crack than a six- to eight-character complex password. Because every regulatory framework requires complexity, you cannot use the longer, better password (unless you also institute complexity).
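To put rough numbers behind that claim, here is a back-of-the-envelope comparison in Python. The character-set sizes are illustrative assumptions (about 95 printable ASCII characters for a “complex” password, 26 lowercase letters for a long but non-complex one):

    import math

    complex_8 = 95 ** 8     # 8-character password drawn from ~95 printable characters
    simple_20 = 26 ** 20    # 20-character password drawn from lowercase letters only

    print(f"8-char complex:  {complex_8:.2e} possibilities (~{math.log2(complex_8):.0f} bits)")
    print(f"20-char simple:  {simple_20:.2e} possibilities (~{math.log2(simple_20):.0f} bits)")
    # Roughly 6.6e15 (~53 bits) versus 2.0e28 (~94 bits): the longer,
    # non-complex password has a vastly larger search space.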

Or another example: most regulations require account lockout, so that someone who guesses a password incorrectly too many times is locked out of the account, either until someone resets it or until a set time has passed. If you require sufficiently long and complex passwords, they will essentially never be successfully guessed through online attempts.

You’d be far better off requiring very long and complex passwords and forgetting account lockout, because when account lockout is enabled, you are always at risk of a denial-of-service (DoS) attack. All an intruder needs to do is keep making random guesses against every possible logon account name until all the accounts are locked out. This has happened in real life to many organizations. Account lockout is a double-edged sword.
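Here is a rough sketch of that trade-off. The numbers (guessing rate, lockout threshold, account count, password length) are made-up illustrative assumptions, not figures from any regulation:

    guesses_per_second = 1_000      # aggressive online guessing rate against one account
    keyspace = 26 ** 20             # 20-character lowercase-only random password
    lockout_threshold = 5           # failed attempts before an account locks
    accounts = 10_000               # accounts in the organization

    years_to_exhaust = keyspace / guesses_per_second / (60 * 60 * 24 * 365)
    attempts_to_lock_all = lockout_threshold * accounts

    print(f"Years to try every possible password online: {years_to_exhaust:.2e}")
    print(f"Failed attempts needed to lock out all accounts: {attempts_to_lock_all}")
    # The first number is astronomically large; the second is something an
    # attacker with a list of usernames can generate in minutes.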

Most regulations don’t care. They require account lockout to be enabled, and that is that. They do not care about your possible DoS attack. Even if they did, as I’ve already said, they are slow to change, and they probably wouldn’t add the flexibility for another 10 years.

Regulations and frameworks lack focus on the real security problems

Here’s my even bigger problem with most regulations and frameworks: Not enough focus. For example, we all know that 90 percent or more of all successful malicious data breaches happen because of two issues: unpatched software and social engineering. That means if you add up every other problem and its mitigations, you would only account for 10 percent or less of the cybersecurity risk facing your organization.

Yet, in every regulation and framework document I’ve ever read, those two huge cyber risks account for only a few sentences in documents often stretching over 100 pages. Your organization is supposed to follow every sentence of the document, especially if you’re being audited against that document, and yet most of your cybersecurity risk is addressed in a few sentences.

Perhaps it is not the goal of regulatory documents and frameworks to tell the organizations that follow them which things to focus on most. By their very nature, they force followers to address all requirements more or less equally, when that clearly should not be the case. In my imaginary world, a regulation or framework document would devote significant space, detail and recommendations to the two things that cause most of the problem. The sections would be bolded, shadowed and have flashing arrows pointing to the most important parts.

But they don’t.

Instead, the best, most significant advice is literally buried in voluminous documents destined to be treated with less focus than they deserve. It’s a crying shame. An auditor can be forgiven for asking “Do you patch your software?” and “Do you do security awareness training?” and filling out the checkmarks as everyone nods in agreement that these two things are done, without an iota of understanding of how important it is to get those two things done right, and done “more right” than all the other stuff.

I still believe in the power of regulations and frameworks. We can’t get better, long-term security on a massive scale without them. I just wish they were more agile and focused more on the important stuff. Because creating a document and auditing checklist that treats all requirements as more equal than they really are is part of the problem, not the solution.


