It hasn't been long since my book People-Centric Security hit the shelves, but I'm already hearing "the question" pop up in my conversations.
"What's the best security culture?"
There's no one answer. "Good" culture depends on what an organization hopes to achieve. But since most security programs follow a first principle of preventing breaches, I can offer some example cultures that are more or less suited ("good" or "bad" approaches) to meeting that goal.
These lists are not ranked, nor are cultures mutually exclusive. No organization has a single culture, and good ones may coexist with bad ones. What is clear is that some cultures are going to make security program success easier, and some not so much.
First, the good ones...
Culture of Reporting
My last post was on cybersecurity whistleblowers. The best practice for managing whistleblower risk is implementing a culture of reporting. Such cultures are more common around issues like harassment, safety, and fraud. They encourage people to report problems internally rather than going outside the organization. Of course, this only works when people know that reporting a problem results in swift corrective action, not apathy or retaliation. It's more than just "see something, say something" - the organization must actually investigate and fix problems.
Cultures of reporting could be a security silver bullet. If everyone who identified a security problem reported it, and if the organization investigated and addressed every reported problem, security could change overnight. Unfortunately, this is expensive. Such cultures tend to exist only in places where lawsuits and losses from whistleblowing have shown that fixing problems, even when costly, is inevitably cheaper than ignoring them.
Culture of Awareness
Informed, engaged people are always valuable, in security or anywhere else. Cultures that educate and inform their members steadily make securing information assets easier. My friends in security training and awareness are dedicated to fostering this environment in their organizations.
They are the "tip of the spear" for security culture. But awareness culture means more than making users acknowledge a policy, or endlessly phishing them. Awareness culture means people understand why security is important across the board. You won't get it with a shoestring budget and lackluster commitment, any more than you will change an iceberg's course with a canoe paddle.
Evidence-based (Security) Management
Engineering chops aside, security remains a relatively unscientific community. We build amazing technology, but we have a harder time proving efficacy ("well, we didn't get hacked, did we?"). In medicine, if you want to know whether a treatment is truly effective, you run a randomized controlled trial. If a CISO experimented this way, selectively providing security solutions to a "test" group while the "control" group got compromised, that CISO might end up fired. So in security, no one gets the placebo, and we can never prove scientifically that the treatment actually worked.
Evidence-based cultures collect empirical and historical data, analyze them, and make decisions based on the results, even if the results are unexpected or undesirable. Being forced to justify activities with evidence is inconvenient, but it does have the upside effect of promoting fact-based decisions over guesses and anecdotes.
Now some bad ones...
Culture of FUD
FUD-driven cultures are the opposite of evidence-based cultures. They prioritize emotion over rationality, and the new and immediate over the old and distant. FUDdish cultures are like 24-hour news cycles: if it bleeds, it leads. People stress out over the latest frightening report while neglecting dangers that have ceased to be novel. One CISO I knew admitted his staff regularly spent days researching whatever news article had scared him last. Add in the expediency of fear, which brings money and resources faster than rational argument, and it's no wonder FUD becomes a crutch for organizations.
Cult(ure) of Technology
Technology is great. It's important. Security couldn't have gotten this far without it. But when organizations worship it as the single best security strategy, things go awry. Current "automate away the people" security narratives are examples of something both creepy and historically prone to failure. We talk about successful security requiring people, process and technology in a balanced portfolio. But many organizations have massively overweighted their portfolios with gear. And that lack of diversification is risky, as we are seeing today.
Culture of Compliance
Compliance is not security. Checkbox cultures are taking heat in the wake of big breaches where the victims looked good on paper but not on the ground. Yet these cultures remain prevalent enough in industry to make this list. Checklists are wonderful tools that bring improvement to unstructured, undisciplined processes. But when people confuse the checklist with the process, you get problems. Like FUD, checklists can become perceived shortcuts to our goals. In overworked, underfunded environments, that's an overwhelming temptation. But it's not security.
And finally, an ugly one...
Culture of Arrogance
Over my career, I've seen many Pogo-channelling security programs:
"We have met the enemy and he is us..."
If a culture of reporting could dramatically improve security, there's nothing like arrogance to ensure that every objective is twice as far off, every success twice as difficult, every failure twice as painful. Security practitioners who see their users as idiots (or worse), their business leaders as incompetent, and the world as foolishly failing to grasp why security matters shouldn't be surprised when they meet resistance. For every security team complaining about stupid users, I can guarantee those same users have their own names for the security staff.
When one or more groups of people become incapable of respecting the points of view of others, much less working with them or compromising, progress can grind to a halt. Just ask Congress.
If you see your organization in any of these seven types, consider what it means for your security strategy over the coming year. Will your culture help you? Or does it presage another 12 months of struggle, frustration, and maybe even an incident putting the organization in an increasingly common and unwelcome spotlight?
This article is published as part of the IDG Contributor Network.