Go for the gold!

Using continuous improvement and maturity models to build effective security programs.

Claressa Shields wins Olympic gold. Credit: REUTERS/Peter Cziborra

This was the most exciting Olympics I have seen in a long time. Two weeks ago, we pulled over at a pizza place on a drive to Purdue University to watch Usain Bolt win the 100 meters. Watching these remarkable athletes, I couldn’t help wondering how they reached such levels of achievement. So I researched it. Almost every career I looked at showed the importance of continuous improvement. The same method can help you achieve excellence in your security program, even if not on an Olympic stage.

In Simone Biles’ first competition in 2011, she placed third all-around. In her next, she placed 20th all-around. The next year she increased her training regimen and started on the path to gold. How about the aforementioned Bolt? His early sprint career was marked by both spectacular wins and spectacular losses. His coach Glen Mills advised him “…to learn to lose, because by doing so you could figure out what you needed to do to win” (Usain Bolt: 9.58, by Usain Bolt). Nastia Liukin, a 2008 gold-medal gymnast, recommends that you “…strive to achieve something on a day-to-day basis.”

These stories and many others have inspired me this summer. So where does continuous improvement fit into today’s security program? In the beginning, we had SOX compliance (2002), PCI compliance (2004) and HIPAA compliance (2005). There was no continuous improvement because you were expected to meet all requirements in year one and every year thereafter. The one exception was ISO 27001, which has always required continuous improvement of the security management system. However, ISO 27001 has never been widely adopted in the US.

Academic institutions continued to pursue ideas of continuous improvement and maturity levels. These have included Carnegie Mellon with CMMI for software systems development and Maynooth University with its IT Capability Maturity Framework (IT-CMF). Some security professionals like Jeff Bardin had previously highlighted the importance of maturity models for information security.

Things have changed: today several security frameworks include continuous improvement and maturity models as a core part of their structure. These include the FFIEC CAT (Cybersecurity Assessment Tool); the DOE C2M2 (Cybersecurity Capability Maturity Model); the Educause Information Security Program Assessment Tool; and the NIST CSF (Cybersecurity Framework). All of them use continuous improvement metrics, and each has unique features as well. The NIST CSF has 98 control objectives organized into five functions and includes four maturity levels, called “tiers.”

The Educause assessment tool has 101 control objectives organized into 15 domains and incorporates six levels of maturity. It is based on ISO 27002. The FFIEC CAT has 30 security “components” organized into five domains, with five maturity levels. Uniquely, the CAT also includes a risk assessment tool that provides a formal, semi-quantitative template for analyzing risk; since the FFIEC regulates banks, the tool is designed for banking institutions. Finally, the DOE C2M2 provides an assessment tool for all types of organizations, not just energy-sector firms. It is organized into 10 domains, with 37 high-level control objectives and four maturity levels.
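For a rough side-by-side view, the figures cited above can be captured in a small data structure. This is only a sketch using the numbers quoted here; check each framework’s current documentation before relying on them:

```python
# Framework comparison using the figures cited in the article.
# These are illustrative; frameworks revise their counts over time.
frameworks = {
    "NIST CSF":      {"domains": 5,  "objectives": 98,  "maturity_levels": 4},
    "Educause ISPA": {"domains": 15, "objectives": 101, "maturity_levels": 6},
    "FFIEC CAT":     {"domains": 5,  "objectives": 30,  "maturity_levels": 5},
    "DOE C2M2":      {"domains": 10, "objectives": 37,  "maturity_levels": 4},
}

# Which framework offers the finest-grained maturity scale?
finest = max(frameworks, key=lambda name: frameworks[name]["maturity_levels"])
print(finest)  # Educause ISPA
```

A structure like this makes it easy to compare candidate frameworks on the dimensions that matter to your program before committing to one.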

An example of the maturity-level approach is shown in the figure. This radar chart shows an assessment based on ISO 27002 (2005 version). The assessment reveals poor management system (ISO 27001 ISMS) maturity, excellent physical security, and a good business continuity plan. Beyond measuring and tracking each domain’s maturity and the overall average level (2.3 for this organization), the diagram shows the importance of tracking the lowest maturity level. Security is only as good as its weakest link, and the radar chart shows that, for this firm, the weakest link is a lack of up-to-date security policies.
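The kind of analysis the radar chart supports takes only a few lines to compute. In this sketch, the domain names and scores are invented for illustration (they are not the assessed firm’s data), but the logic mirrors the approach above: track the average maturity and, just as importantly, the weakest domain.

```python
# Hypothetical per-domain maturity scores on a 0-5 scale, loosely modeled
# on ISO 27002 domains. Values are illustrative only.
scores = {
    "Security policy": 1.0,           # the weakest link in this example
    "ISMS / management system": 1.5,
    "Physical security": 4.5,
    "Business continuity": 3.5,
    "Access control": 2.5,
    "Incident management": 2.0,
}

average = sum(scores.values()) / len(scores)
weakest = min(scores, key=scores.get)  # security is only as good as this

print(f"Average maturity: {average:.1f}")          # Average maturity: 2.5
print(f"Weakest domain: {weakest} ({scores[weakest]})")
```

Tracking both numbers quarter over quarter shows whether remediation effort is going where the chart says it is needed most.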


Many professionals think of security metrics as things like the percent of systems patched, the percent of new projects passing a risk-analysis gate, or the percent of employees who took awareness training in the last year. These are actually KRIs (key result indicators), whereas maturity levels are KPIs, or key performance indicators. KPIs tell you how your security program will do in the future; KRIs tell you how it is doing now. So track and improve your KPIs to have the best chance of getting to gold… and of avoiding a damaging breach. It’s only four years to Tokyo 2020!
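The KRI-versus-KPI distinction can be made concrete in a few lines. All numbers below are invented for illustration: a KRI is a point-in-time result, while the KPI here is the trend in overall maturity level.

```python
# KRI: a point-in-time result (how the program is doing now).
patched, total = 940, 1000
kri_patch_rate = patched / total

# KPI: the maturity trend (where the program is heading).
maturity_by_quarter = [2.1, 2.3, 2.4, 2.6]
kpi_trend = maturity_by_quarter[-1] - maturity_by_quarter[0]

print(f"Patch coverage (KRI): {kri_patch_rate:.0%}")
print(f"Maturity gain over four quarters (KPI): +{kpi_trend:.1f}")
```

A flat or falling KPI trend is an early warning even when this quarter’s KRIs look healthy, which is exactly why maturity levels are worth tracking alongside operational metrics.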

This article is published as part of the IDG Contributor Network.
