Seven security cultures that can help or hurt your organization

Opinion
Feb 01, 2016 | 6 mins
IT Leadership, IT Skills, IT Strategy

It's 2016 - do you know where your security culture is? Because some cultures make the job easier than others.

It hasn’t been that long since my book People-Centric Security hit the shelves, but I’m already hearing “the question” pop up in my conversations.

“What’s the best security culture?”

There’s no one answer. “Good” culture depends on what an organization hopes to achieve. But since most security programs follow a first principle of preventing breaches, I can offer some example cultures that are more or less suited (“good” or “bad” approaches) to meeting that goal.

These lists are not ranked, nor are cultures mutually exclusive. No organization has a single culture, and good ones may coexist with bad ones. What is clear is that some cultures are going to make security program success easier, and some not so much.

First, the good ones…

Culture of Reporting

My last post was on cybersecurity whistleblowers. The best practice for managing whistleblower risk is implementing a culture of reporting. These cultures are more common around harassment, safety, and fraud, and they encourage people to report problems internally rather than going outside the organization. Of course, this only works when people know that reporting a problem results in swift corrective action, not apathy or retaliation. It’s more than just “see something, say something” – the organization must investigate and fix problems.

Cultures of reporting could be a security silver bullet. If everyone who identified a security problem reported it, and if the organization investigated and addressed every reported problem, security could change overnight. Unfortunately, this is expensive. Such cultures tend to exist only where lawsuits and losses from whistleblowing have shown that fixing problems, even when costly, is cheaper than ignoring them.

Awareness Culture

Informed, engaged people are always valuable, in security or anywhere else. Cultures that educate and inform members steadily make securing information assets easier. My friends in security training and awareness are dedicated to fostering this environment in their organizations.

They are the “tip of the spear” for security culture. But awareness culture means more than making users acknowledge a policy, or endlessly phishing them. Awareness culture means people understand why security is important across the board. You won’t get it with a shoestring budget and lackluster commitment, any more than you will change an iceberg’s course with a canoe paddle.

Evidence-based (Security) Management

Engineering chops aside, security remains a relatively unscientific community. We build amazing technology, but have a harder time proving efficacy (“well, we didn’t get hacked, did we?”). In medicine, if you want to know whether a treatment is truly effective, you run a randomized controlled trial. If a CISO experimented this way, selectively providing security solutions to a “test” group, and the “control” group ended up compromised, that CISO might end up fired. So, in security, no one gets the placebo, and we’re never able to prove scientifically that the treatment actually worked.

Evidence-based cultures collect empirical and historical data, analyze them, and make decisions based on the results, even when those results are unexpected or undesirable. Being forced to justify activities with evidence is inconvenient, but it has the upside of promoting fact-based decisions over guesses and anecdotes.

Now some bad ones…

FUD-Driven

FUD-driven cultures are the opposite of evidence-based cultures. They prioritize emotion over rationality, and the new and close over the old and distant. FUDdish cultures are like 24-hour news cycles: if it bleeds, it leads. People stress out over the latest frightening report while neglecting dangers that have ceased to be novel. One CISO I knew admitted his staff regularly spent days researching whatever news article had scared him last. Add in the expediency of fear, which brings money and resources faster than rational argument, and it’s no wonder FUD becomes a crutch for organizations.

Cult(ure) of Technology

Technology is great. It’s important. Security couldn’t have gotten this far without it. But when organizations worship it as the single best security strategy, things go awry. Current “automate away the people” security narratives are examples of something both creepy and historically prone to failure. We talk about successful security requiring people, process and technology in a balanced portfolio. But many organizations have massively overweighted their portfolios with gear. And that lack of diversification is risky, as we are seeing today.

Checkbox Culture

Compliance is not security. Checkbox cultures are taking heat in the wake of big breaches, where the victims looked good on paper but not on the ground. However, these cultures remain sufficiently prevalent in industry to make this list. Checklists are wonderful tools that bring improvement to unstructured, undisciplined processes. But when people confuse the checklist with the process, you get problems. Like FUD, checklists can become perceived shortcuts to our goals. In overworked, underfunded environments that’s an overwhelming temptation. But it’s not security.

And finally, an ugly one…

Culture of Arrogance

Over my career, I’ve seen many Pogo-channelling security programs: 

“We have met the enemy and he is us…”

If a culture of reporting could dramatically improve security, there’s nothing like arrogance to ensure that every objective is twice as far off, every success twice as difficult, and every failure twice as painful. Security practitioners who see their users as idiots (or worse), their business leaders as incompetent, and the world as foolishly failing to grasp why security is our biggest problem shouldn’t be surprised when they meet resistance. For every security team complaining about stupid users, I can guarantee those same users have their own names for the security staff.

When one or more groups of people become incapable of respecting the points of view of others, much less working with them or compromising, progress can grind to a halt. Just ask Congress.

If you see your organization in any of these seven types, consider what it means for your security strategy over the coming year. Will your culture help you? Or does it presage another 12 months of struggle, frustration, and maybe even an incident putting the organization in an increasingly common and unwelcome spotlight?

Contributor

Dr. Lance Hayden, the Chief Privacy and Security Officer for ePatientFinder, is also an author, speaker, and researcher with over 25 years’ experience in the field of information security. A leading expert on security behavior and culture, Dr. Hayden is the author of People-Centric Security: Transforming Your Enterprise Security Culture and IT Security Metrics: A Practical Framework for Measuring Security and Protecting Data.

Dr. Hayden began his career as a human intelligence (HUMINT) officer with the CIA, which contributed to a philosophy emphasizing human behavior, organizational psychology, and strategic leadership as central to a successful InfoSec program. Dr. Hayden's career includes security roles at KPMG, FedEx, Cisco, and the Berkeley Research Group before joining ePatientFinder, where he has executive responsibility for all enterprise data protection and security-related regulatory compliance.

Dr. Hayden received his Ph.D. in Information Science from the University of Texas at Austin. As a professor at the UT iSchool, Dr. Hayden develops and teaches graduate and undergraduate courses on subjects including information security, privacy, surveillance and the intelligence community. His industry credentials include CISSP, CISM, CRISC and ISO 27001 Certified Lead Auditor certifications.

The opinions expressed in this blog are those of Lance Hayden and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.