Cargo cults, certitude, the Byzantine Empire and avoiding the traps of the past

We can learn a lot from our history. Fairly recent events such as World War II, and events from the Middle Ages such as the fall of Constantinople, both show how we react in times of crisis. We can apply the lessons of these events to our current situation in information security and use them to move forward.

History

The John Frum movement in Vanuatu is based on the belief that following the rituals and actions of the United States Navy, which occupied the islands during the crisis of World War II, will bring material wealth and prosperity.  This movement, which recently celebrated its 50th anniversary, expresses itself in reconstructions of military bases and runways, and its adherents dress and drill in costumes reminiscent of the Navy.

One of the most powerful devices that has permeated the media, and taken hold of our consciousness, is certitude, or the need to be right.  Social media has taken this to an extreme, where algorithms once meant to recommend content can now trap people in a single point of view.

The singular point of view of the John Frum movement and social media brings to mind the Eastern Orthodox Church after the fall of the Byzantine Empire in 1453.  The church considered the fall a failure of faith and redoubled its devotional efforts.  Long services were not uncommon, and anything less than total faith was seen as a weakness.  In reality, the fall to the Ottoman Empire took place over centuries: a gradual decay that culminated in a final battle whose meaning was largely symbolic.

Where are we today?

In security today, many parts of the market operate like a cargo cult, with current security events hitting us like the fall of the Byzantine Empire.  Many of us follow the rituals and actions prescribed by people we believe to have ultimate skill or knowledge, certain that they will deliver us to the land of security nirvana.  However, when the inevitable breach or attack happens, five types of thinking usually occur.

The 5 types of thinking

The first is thinking that we're not doing enough to protect ourselves, and that we need to redouble our efforts by performing more rituals and actions: buying more security goods and services to fix the holes that others tell us we have, trusting that this time they will work.  We also perform the actions that others tell us to.  We build our own John Frum movements to solve the security problem.  As the residents of Vanuatu can attest, building airstrips doesn't guarantee that a Navy plane full of supplies will land.  Buying something big and shiny because someone else tells you that you need it doesn't guarantee protection.

Second, we tend to look inward rather than seeing the big picture.  We see our peers as the cause of the problem because they don't follow the rituals to the letter.  Many organizations discipline or terminate employees for clicking on phishing links, or for other specious violations of acceptable use policies.  While there are cases where this is warranted, specifically with insider threats or outright negligence, there are many more where well-meaning people have been disciplined or let go under policies that focus on punishing deviations rather than on continual learning and experience.  We need to improve how we train and educate based on the big picture.  Purging people who are not 100% in lockstep with you is a sure-fire way to cause disengagement.  We need to focus on the real root causes and address those instead.

Third, people want to be valued.  When you purge them, or refer to them as "stupid users" or something similar, you create disengagement.  People want to be valued, and they don't want negative labels attached to them; nearly everyone hates being labeled this way or told that they are wrong.  When you do this, they will rebel against established norms, ignore you, and push even harder on rituals and actions of their own to address the issue.  Examples include purchasing products with no intrinsic value from vendors who empathize and communicate well.

You need to show empathy, listen, and understand that everyone has a story.  We need to understand that story and demonstrate that we want to work with people and address their issues.  This is the part of the job that is difficult for many, and it is normally learned only through experience.  If we don't do this, someone else will, and it may not be the right person.

Fourth, people will come to conclusions and then find evidence to support them, rather than using the scientific method to search for the real answers.  The narrowing focus of social media doesn't help here.  Crisis thinking, where people grasp at whatever method will get them out of the crisis as soon as possible, contributes as well.  It's this type of thinking that makes ransomware successful.  It leads to decisions based on gut feelings rather than analysis, and back to redoubling rituals and actions when those decisions don't work.

Fifth and final is the fallacy of executive thinking.  When I started as a CISO in 2008, one of the lessons I learned from certain executives was that they had to be right 100% of the time.  I was also told that the average tenure of an IT executive was less than that of an NFL running back.

You are only as accurate as the data you have to make decisions, and there are many times when executives will be wrong.  An executive who promulgates the fallacy that executives are always right leads people to question their own judgement and decision-making whenever that executive makes an incorrect decision.  When that happens, the people who feel they are not being listened to, and who have the option to leave, will.

Well-meaning executives who project an illusion of perfection don't engage their teams, and they set expectations they will never meet.  Thinking you're perfect means that wrong decisions will be made without being communicated, and discovered long after the fact.  It also means holding yourself to unreachable standards, and failing to motivate the team to exhibit critical thinking or try harder.  Why try if the culture has a perfect leader who will shoot you down or give you the red pen every time you do?  You end up with leaders who have nothing to say, and a demotivated, disengaged organization that even the best executive would have trouble fixing.  You also end up falling back on rituals and actions to please the executive, rather than addressing the root causes.

Responsibilities of a Modern Information Security Executive

The responsibilities of a modern information security executive include excellent communication with all levels of the organization and soliciting input and questions from everyone.  Our engagements have a dual purpose: to educate the team, and to build enough rapport that people can report issues to us knowing we will address them at the highest level possible.  We need to do more outreach than ever before.  Most importantly, we need to understand the value that everyone brings, and accept that we will not always be right.  We need to listen to everyone's story and speak with them to be successful.

How do we avoid creating more cargo cults in security?

We need to be genuine, honest, and open about our processes and expectations.

We need to continually educate and build rapport with our customers.

We need to communicate the big picture regularly and show how we build our decisions based upon the scientific method.

We need to educate.

We need to address the root causes of issues, rather than sacrifice others to meet policy-defined norms.

Most importantly, we need to address the executive fallacy and build a team that listens to people's stories and builds the narrative around them, rather than relying on top-down management.  Top-down management doesn't work when your entire organization is built around helping everyone mitigate risk.

When you do this, you can help avoid the cycles that lead back to a John Frum mentality.

This article is published as part of the IDG Contributor Network.

Copyright © 2018 IDG Communications, Inc.