Executives know they face risks, but they often don't know which risks are real or what that exposure means to their business. The aim of security risk management is to remove the guesswork and help the business make smarter decisions. As Jay Jacobs, vice president of the Society of Information Risk Analysts (SIRA), says, "Security risk management is simply a decision-support system for the business. It should exist to inform the decisions of the business."

Unfortunately, many experts believe that most companies aren't quite there yet and that their efforts, while well-meaning, fall short and may even incorporate bad habits that can increase an organization's risk.

Jeff Lowder, president of SIRA, says, "There is a mistaken perception that expertise in security equals expertise in risk management. In fact, we see many experts in security who also claim to be experts in risk management. They're often not. These are two separate disciplines and, ideally, someone is knowledgeable about both if they're performing security risk management."

To get a better understanding of where many enterprises go wrong, CSO asked a handful of experts what they commonly see enterprises do wrong in security risk management. "In many organizations, based on what we've seen, it could actually be better if the organization chose to make decisions based on coin flips rather than their internal security risk management frameworks. At least when you flip a coin, you have a 50 percent chance of getting it right," says Lowder.

Here are the most common mistakes and misconceptions made in well-intentioned risk management efforts:

1. Starting from scratch.

Many security professionals will attempt to reinvent the discipline of security risk management. Fortunately, there are well-established methods for risk-analysis tasks, such as how to solicit an expert opinion and how to represent uncertainty in risk models.
However, as Jacobs and Lowder explain, most people remain unaware of the research about how to do this correctly, and end up re-creating not only the same models but also the same shortcomings that basic approaches suffer from.

"The most prominent model is to pick some 'risk' factors that seem important, assign some ordinal score, and then perform basic arithmetic on these or place them on a matrix, an approach that has been shown to perform poorly," says Jacobs. The only saving grace for organizations that rely on these homegrown frameworks is that experienced decision makers often distrust the results these basic approaches produce, Jacobs says.

2. Replicating the audit department.

One way security risk management programs set themselves up for failure, says Alex Hutton, director of operations risk and governance at a large financial services firm and faculty member at IANS, is to copy the functions of the audit department.

"While there are similarities between the two, the roles are dramatically different," says Hutton. The audit team should be concerned about where failures can occur through breakdowns in security controls, whereas risk management should be concerned with the potential frequency and impact of IT risks. And where audit's role is to help the company understand how to implement controls, risk management's role is to determine how to get the most out of investments in security controls and related processes.

"Most organizations whose risk management programs end up failing do so because they end up merely enforcing policy rather than consulting to the organization about what controls do and don't make sense," says Hutton.

"Audit doesn't necessarily concern itself with threat, and audit doesn't necessarily care about reporting an aggregate picture of risk, based on the entire outlook of threats, assets, controls and impact," says Hutton. "Security risk management does."

3. Conflating precision with accuracy.

Many security professionals are uncomfortable reducing IT security risks and vulnerabilities to simple numbers.

"You'll hear people say that there aren't any relevant actuarial tables, or there aren't enough data regarding events to create a number that will provide value," says Lowder. "They've confused being able to give a precise numerical estimate with being able to give a highly accurate numerical range."

To provide actionable information, security risk management practitioners don't have to give numbers that predict the exact likelihood of a good or bad outcome. "Numbers only need to be as precise as is necessary to make an educated decision," says Lowder. "You can create a solid argument when you show that the probability of something happening is between 60 and 90 percent."

4. Overemphasizing the risk register.

Hutton says many organizations, when evaluating the risks they face, focus too heavily on listing and ranking all the things that could go wrong in what is called a risk register.

"The problem with creating a risk register is that people never know quite when to stop. They'll keep piling on risks, even the most obscure, from cyber attackers with every conceivable motivation to the possibility of a jet engine falling through the roof of the data center," he says.

"Very esoteric risks are things that make it into risk registers. But they're often very low-probability events that could cost a bajillion dollars to mitigate," he says. Hutton advises organizations to create an exposure register instead, which is more likely to reflect real-world risks and helps organizations mitigate the most probable and threatening risks first.

5. Using undefined risk concepts.

One of the most common ways that practitioners rank threats and vulnerabilities is on a simple scale: low, medium or high. Unfortunately, that may be asking for trouble. After all, what do low, medium and high actually mean?
"They actually are quantitative, without the appearance of being quantitative," says Lowder. "When you ask people to define high, medium and low when applied to probability or frequency of events, nobody seems to be able to agree on what the terms actually mean. The result is you have this illusion of communication. That's more dangerous than trying to add some precision to an argument," Lowder says.

For instance, when told an event's likelihood is low, some executives will think that means there's a 10 percent chance of it happening, while others will think it's 33 percent. "You don't want to use numbers for the sake of numbers, but whenever possible, you want to define things numerically so that you know you are clearly communicating," says Lowder.

6. Not having a risk intelligence program.

"This mistake is a big one," says Hutton. "If IT security risk can be broken down into four sets of information (threats, controls, assets and impact), then any change to any one of those conditions would have an impact on the risk posture of an organization," he says. Unfortunately, current risk management standards spend little time describing how to put a risk intelligence program in place or why that function matters. Nor do they explain what makes a valid source of intelligence or how to deal with new information that changes an organization's risk posture.

Implementing an intelligence function is more straightforward than companies might think, says Hutton. They just need to monitor for changes that could affect their risk.

"For instance, you may want to monitor the organization for changes, such as if the intrusion detection/prevention expert leaves the company and there's no one to fill the knowledge gap. That would increase risk, just as would the discovery of new OSX malware if an organization has a decent population of systems running that OS," says Hutton.
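Hutton's examples suggest a simple shape for such a function: keep a baseline of the four sets of information he names and flag any change against it. The sketch below follows that idea; every field name and value is hypothetical, chosen only to mirror his two scenarios.

```python
# A minimal risk-intelligence check: diff a current snapshot of threats,
# controls, assets and impact against a baseline, and surface every changed
# condition so it can trigger a reassessment. All fields are illustrative.

BASELINE = {
    "threats":  {"known_osx_malware_families": 12},
    "controls": {"ids_ips_experts_on_staff": 1},
    "assets":   {"osx_hosts": 340},
    "impact":   {"est_breach_cost_usd": 2_000_000},
}

def risk_posture_changes(current, baseline=BASELINE):
    """Return (category, field, old, new) for every condition that changed."""
    changes = []
    for category, fields in current.items():
        for field, new in fields.items():
            old = baseline.get(category, {}).get(field)
            if old != new:
                changes.append((category, field, old, new))
    return changes

# Hutton's two scenarios at once: the IDS/IPS expert leaves, and new
# OSX malware families are discovered.
snapshot = {
    "threats":  {"known_osx_malware_families": 14},
    "controls": {"ids_ips_experts_on_staff": 0},
    "assets":   {"osx_hosts": 340},
    "impact":   {"est_breach_cost_usd": 2_000_000},
}

for change in risk_posture_changes(snapshot):
    print(change)
```

A real program would pull these snapshots from HR, threat feeds and asset inventories on a schedule; the point is only that "intelligence" here reduces to watching four categories of inputs for change.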
And if you're not looking for these types of changes, you're not managing risk properly, many experts say.

7. Multiplying ordinals.

"This is a key mistake to avoid," says Lowder. For instance, imagine a regatta in which boat A came in first, boat B second, and boat C third. Using only this information, it's impossible to calculate the average time for the three boats to finish the race: all we know is that boat A was faster than both boats B and C. "You can now see the fatal flaw in multiplying ordinal values, or in trying to calculate the average of a set of ordinal values on an ordinal scale such as first, second, third, or high, medium, low," says Lowder.

Ordinal scales simply define the rank or order of the values; they say nothing about the quantities represented by those values. "This is why the mean of a set of ordinal values is undefined. For the same reason, it is meaningless to calculate the mean for risk management factors defined as high, medium, low," Lowder says.

Risk management is difficult, but doing it wrong may be worse than doing nothing at all. "You will be making decisions on bad inputs, bad processes and bad calculations. That's a formula for making a bad situation worse," says Jacobs.
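Lowder's regatta example can be made concrete. In the sketch below, the finish times are invented for illustration: two races produce identical ordinal rankings, yet the underlying quantities differ so much that any arithmetic on the ranks alone is meaningless.

```python
# Why arithmetic on ordinals misleads: two regattas with the same rankings
# (first, second, third) can hide completely different finish times, so the
# ranks carry no quantitative information. Times (seconds) are made up.

race_close  = {"A": 61.0, "B": 62.0, "C": 63.0}   # a tight finish
race_spread = {"A": 61.0, "B": 95.0, "C": 240.0}  # same order, huge gaps

def ranks(times):
    """Ordinal positions by finish time: 1 = first across the line."""
    order = sorted(times, key=times.get)
    return {boat: position + 1 for position, boat in enumerate(order)}

# Identical ordinal data in both races...
assert ranks(race_close) == ranks(race_spread) == {"A": 1, "B": 2, "C": 3}

# ...so the "mean rank" is 2.0 either way, while the mean time it pretends
# to summarize differs wildly between the two races.
mean_time_close  = sum(race_close.values()) / 3
mean_time_spread = sum(race_spread.values()) / 3
```

The same argument applies to risk scores: multiplying or averaging "high = 3, medium = 2, low = 1" treats rank labels as if they measured quantities they were never defined to carry.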