Cognitive bias: The risk from everyone in your organization, including you

The only way to manage it is to acknowledge it, and take steps to minimize it.

BOSTON - Risks to enterprises do not come only in the form of security breaches by outside attackers, malicious insiders or careless employees. Another comes from everybody in an organization – even its most loyal, careful, capable members.

“Everybody has biases – no one is immune,” said Benjamin Brown, information security program manager at Akamai Technologies, in a presentation Tuesday at SOURCE Boston titled, “Cognitive Bias and Critical Thinking in Open Source Intelligence (OSINT).” OSINT, he said, is “intelligence produced by publicly available information.” It is useful for numerous things, including analysis of attack vectors and safety concerns, recognizing leaks and breaches, conducting privacy audits, understanding threat actors, gathering competitive intelligence and conducting due diligence when evaluating prospective hires, clients and partners.

But that also means any bias in analyzing it matters a lot, he added, because it can lead to false conclusions that in some cases could unwittingly damage an organization and in others lead to unwarranted FUD (Fear, Uncertainty and Doubt).

Fortunately, there are ways to manage it, but that takes a willingness to admit that one is vulnerable to it – which can sometimes be difficult. Brown said part of the problem is that people are naturally inclined to think others are more biased than they are.

It is also easy – especially when using “open-source” resources such as search engines, social networks, e-commerce sites and mainstream media – to get sucked into seeking out evidence that supports an initial hypothesis while ignoring or discounting evidence that might challenge or undermine it. This is “confirmation bias,” he said, likening it to the “Texas sharpshooter fallacy,” in which a shooter fires a hail of bullets at the side of a barn, then walks up to the wall, finds the tightest grouping of bullet holes and draws a bull's-eye over it. That kind of bias avoids, or refuses to accept, information that would support competing hypotheses.

Another he called the “echo effect,” in which a hypothesis is picked up and repeated until its original source is obscured and it becomes an exercise in groupthink rather than rigorous, objective analysis.

A further problem is that not all open sources are equally credible. “The quality of open-source intelligence is highly variable,” he said.

What to do? Be skeptical, especially of yourself, Brown said, adding that vendors in particular need to be aware of possible conflicts of interest in their assumptions. “Beware the belief that your bias is actually insight,” he said. “Ask yourself: ‘What do I think I know, how do I think I know it, and when would it be false?’” When you don't know what you don't know, he added, you have a bias problem.
Brown said the way to eliminate as much bias as possible from an evaluation is to “define the problem, identify all possible hypotheses, collect information and evaluate the hypotheses.” In the evaluation process, he said, it is important to take a contrarian, devil's-advocate view of each hypothesis – even those seen as most likely to be correct – to check all assumptions and their origins, and to consult peers and outside experts, who might notice things you, or your team, are missing.

Finally, he recommended choosing the hypothesis that has “the least evidence against it,” keeping conclusions tentative while continuing to collect data and form alternate hypotheses, and considering your organization's goals and customers along with the evaluation's costs in time and personnel. And if you want your conclusions to have some credibility, “show your methodology,” he said, noting that too many reports, on everything from APTs to DDoS attacks, issue sensational conclusions “with no methodology shown on how they reached them.”
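The selection step Brown describes – preferring the hypothesis with the least evidence against it, rather than the one with the most evidence for it – can be sketched in a few lines of Python. This is an illustrative sketch only; the hypotheses and evidence items below are invented, and the talk did not present any code.

```python
def least_contradicted(matrix):
    """Rank hypotheses by how many evidence items contradict each.

    matrix maps each hypothesis name to a dict of
    {evidence item: True if consistent, False if inconsistent}.
    Returns (hypothesis, count-of-contradictions) pairs,
    fewest contradictions first.
    """
    scores = {h: sum(1 for ok in ev.values() if not ok)
              for h, ev in matrix.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

# Invented example: who is behind a data leak?
matrix = {
    "external attacker": {"VPN logs clean": False,
                          "data staged internally": False,
                          "credentials reused": True},
    "careless employee": {"VPN logs clean": True,
                          "data staged internally": True,
                          "credentials reused": False},
    "malicious insider": {"VPN logs clean": True,
                          "data staged internally": True,
                          "credentials reused": True},
}

ranked = least_contradicted(matrix)
print(ranked[0])  # hypothesis with the least evidence against it
```

Note that the winning hypothesis is still tentative, as Brown stresses: new evidence items should keep being added to the matrix, and a hypothesis that merely fits the evidence best today may be contradicted tomorrow.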

Copyright © 2014 IDG Communications, Inc.
