Linux kernel creator Linus Torvalds' frustration over the "security circus" surrounding software vulnerabilities is understandable, but not entirely on the mark.
In an online rant last month, Torvalds wrote that "one reason I refuse to bother with the whole security circus is that I think it glorifies -- and thus encourages -- the wrong behavior. It makes 'heroes' out of security people, as if the people who don't just fix normal bugs aren't as important. In fact, all the boring normal bugs are way more important, just because there's a lot more of them."
I've long believed a lot of useless noise surrounds the flaw disclosure culture and that the findings very rarely meet doomsday expectations. In fact, the hype often distracts people from much bigger security problems. And there's no doubt the security research community has become something of a club, especially since the explosion of online social networking.
Go to a conference like Black Hat and the atmosphere resembles a club reunion. A lot of researchers are like rock stars. Many of them blog and can be found all over LinkedIn. Reporters, myself included, love to be around them. I missed Black Hat this year and admittedly felt a little left out.
Sometimes there's infighting over whether somebody is too slow or too eager to make a discovery public. When one researcher finds a big flaw, everyone wants to play with it and cook up their own exploit code, as the recent DNS saga clearly demonstrates.
Meanwhile, I've chatted with many a security administrator who failed to understand the media hype that often swirls around the latest big flaw. As one trusted source told me, such hoopla can blind people to a much bigger problem -- company networks that are so carelessly configured and maintained that attackers can drive a virtual truck through them without anyone noticing.
But there's a middle ground to be had here.
I talk to researchers on a regular basis and they are, for the most part, good people who want to handle their findings responsibly and be part of the solution rather than the problem. Their findings usually force the affected vendors to develop a patch and write more secure code in future versions of their product. It doesn't always work out that way, but it's usually the greater goal. The disclosure process is also a lot more civil than it once was, largely because vulnerability research has become a booming industry in itself.
In the final analysis, security professionals should be able to pay attention to flaw reports, separate the hype from the issues worth addressing, and act accordingly. Dismissing everything that comes from the research community runs the risk of missing important information that directly affects your company and customers. Paying too much attention to the rock star imagery and eccentricities of this community can be a distraction from more important things.
The lesson is the same as it ever was: Security pros should keep an eye out for important flaw findings for the sake of due diligence and to make sure they're deploying their defenses properly. But they should also remember that most flaws will pose little danger to organizations that rely on a layered defense.
That's what IT admins and hackers alike have told me time and again.
About FUD Watch: Senior Editor Bill Brenner scours the Internet in search of FUD -- overhyped security threats that ultimately have little impact on a CSO's daily routine. The goal: help security decision makers separate the hot air from genuine action items. To point us toward the industry's most egregious FUD, send an e-mail to bbrenner@cxo.com.