Andrew Jaquith is a Yankee Group analyst and founder of the discussion site Securitymetrics.org. The following excerpt is taken from his book, Security Metrics: Replacing Fear, Uncertainty, and Doubt.

A few years ago my former employer was called in by the CTO of a large, well-known maker of high-end consumer electronics. This company, which prides itself on its progressive approach to IT management, operates a large, reasonably up-to-date network and a full suite of enterprise applications. The CTO, Barry Eiger (a pseudonym), is an extremely smart man, fully conversant in the prevailing technology trends of the day. In manner and in practice, he tends to be a conservative technology deployer. Unimpressed with fads and trends, he prefers to hydrofoil above the choppy technological seas with a slightly bemused sense of detachment. Facts, rather than the ebbs and flows of technology, weigh heavily in his decision-making. In our initial conversations, he displayed an acute awareness of industry IT spending benchmarks. We discovered later that he had spent significant sums over the years on advisory services from Gartner Group, Meta Group, and others. If he was so well informed, I wondered, why had he called us in?

Barry's problem was simple. His firm had historically been an engineering-driven company with limited need for Internet applications. More recently, his senior management team had asked him to deploy a series of transactional financial systems that would offer customers order management, loan financing, and customer support services. These public-facing systems, in turn, connected back to several internal manufacturing applications as well as to the usual suspects: PeopleSoft, SAP, Siebel, and Oracle.

A prudent man, Barry wanted to make sure his perimeter and application defenses were sufficient before beginning significant deployments. He wanted to know how difficult it might be for an outsider to penetrate his security perimeter and access sensitive customer data, product development plans, or financial systems. Barry asserted that his team had done a good job with security in the past. "What if you can't get in?" he asked rhetorically. Despite his confidence, a dull ache persisted: the nagging feeling compelled him to find out how good his defenses really were. He also wanted benchmarks to see how his company compared with similar firms.

Barry wanted a McKinsey-style "diagnostic." This kind of diagnostic first states an overall hypothesis related to the business problem at hand and then marshals evidence (metrics) that supports or undermines the theory. The essence of the McKinsey diagnostic method is quite simple:

1. The analysis team identifies an overall hypothesis to be supported. Example: "The firm is secure from wireless threats by outsiders."
2. The team brainstorms additional subhypotheses that must hold for the overall hypothesis to be true. For example, to support the wireless hypothesis we just identified, we might pose these subhypotheses: "Open wireless access points are not accessible from outside the building" and "Wireless access points on the corporate LAN require session encryption and reliable user authentication."
3. The team examines each subhypothesis to determine whether it can be supported or disproved by measuring something. If it cannot, the hypothesis is either discarded or decomposed into lower-level hypotheses.
4. For each lowest-level hypothesis, the team identifies specific diagnostic questions. The answers to the questions provide evidence for or against the hypothesis.

Diagnostic questions generally take the form of "The number of X is greater (or less) than Y" or "The percentage of X is greater (or less) than Y." For example, "There are no open wireless access points that can be accessed from the building's parking lot or surrounding areas" or "100% of the wireless access points on the corporate LAN require 128-bit WPA security." The diagnostic questions dictate our metrics.
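The mechanics of the decomposition are easy to sketch in code. The following is a minimal, hypothetical illustration in Python, not anything from the book or from the engagement described here: it represents a hypothesis tree whose leaves carry diagnostic questions, each a measurement paired with a threshold test, and reports whether the overall hypothesis is supported. All class names, measurements, and thresholds are invented for illustration.

```python
# Hypothetical sketch of a hypothesis tree with diagnostic questions.
# Every name, measurement, and threshold below is an invented example.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Diagnostic:
    """A question of the form 'the number/percentage of X is greater (or less) than Y'."""
    question: str
    measure: Callable[[], float]      # returns the measured value (the metric)
    passes: Callable[[float], bool]   # threshold test applied to that value

    def evaluate(self) -> bool:
        return self.passes(self.measure())


@dataclass
class Hypothesis:
    """A hypothesis supported by subhypotheses and/or diagnostic questions."""
    statement: str
    subhypotheses: List["Hypothesis"] = field(default_factory=list)
    diagnostics: List[Diagnostic] = field(default_factory=list)

    def supported(self) -> bool:
        # The hypothesis holds only if every subhypothesis and every
        # diagnostic question at this level holds.
        return (all(h.supported() for h in self.subhypotheses) and
                all(d.evaluate() for d in self.diagnostics))


# Stand-ins for real assessment data (war-drive results, configuration audits).
def open_aps_reachable_from_parking_lot() -> float:
    return 0          # hypothetical count from a wireless sweep


def pct_lan_aps_requiring_wpa() -> float:
    return 100.0      # hypothetical percentage from a configuration audit


wireless = Hypothesis(
    statement="The firm is secure from wireless threats by outsiders.",
    subhypotheses=[
        Hypothesis(
            statement="Open wireless access points are not accessible from outside the building.",
            diagnostics=[Diagnostic(
                question="Number of open APs reachable from the parking lot is 0",
                measure=open_aps_reachable_from_parking_lot,
                passes=lambda n: n == 0,
            )],
        ),
        Hypothesis(
            statement="Wireless APs on the corporate LAN require encryption and authentication.",
            diagnostics=[Diagnostic(
                question="Percentage of LAN APs requiring 128-bit WPA is 100%",
                measure=pct_lan_aps_requiring_wpa,
                passes=lambda p: p >= 100.0,
            )],
        ),
    ],
)

print(f"'{wireless.statement}' supported: {wireless.supported()}")
```

Phrasing each question as "the number (or percentage) of X is greater (or less) than Y" is what ties a diagnostic to a metric: X is the thing to measure, and Y is the pass/fail criterion.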
The primary benefit of the diagnostic method is that hypotheses are proven or disproven based on empirical evidence rather than intuition. Because each hypothesis supports the others, the cumulative weight of cold, hard facts builds a supporting case that cannot be disputed. A secondary benefit of the diagnostic method is that it forces the analysis team to focus only on measurements that directly support or disprove the overall hypothesis. Extraneous "fishing expeditions" into theoretical issues that cannot be measured automatically filter themselves out.

So far, the sample hypotheses and diagnostic questions I have given are rather simplistic. Why don't we return to our friend Barry's company for a real-world example? Recall that Barry's original question was "Is my company's customer data secure from outside attack?" Our overall hypothesis held that, indeed, the company was highly vulnerable to attack from outsiders. To show that this statement was true (or untrue), we constructed subhypotheses that could be supported or disproven by asking specific questions whose answers could be measured precisely and empirically. The table above shows a subset of the diagnostics we employed to test the hypothesis. Note that these diagnostics do not exhaust the potential problem space; time and budget impose natural limits on the diagnostics that can be employed.

To answer the diagnostic questions we posed, we devised a four-month program for Barry's company. We assessed the company's network perimeter defenses, internal networks, ten most significant application systems, and related infrastructure. When we finished the engagement and prepared our final presentation for Barry, his team, and the company's management, the metrics we calculated played a key role in proving our hypothesis. The evidence was so compelling, in fact, that the initial engagement was extended into a much longer corrective program with a contract value of several million dollars.