Biased software vulnerability stats praising Microsoft were 101% misleading

"Don't believe the hogwash" sums up the Open Source Vulnerability Database's review of the entire Secunia Vulnerability Report, which adds that the Microsoft-centric report was flawed, inaccurate and convoluted.

If you nearly fell out of your chair when you saw the annual Secunia Vulnerability Review, which blamed third-party software, not Microsoft's, for 76% of the vulnerabilities on the average PC, then you were not the only one.

The Open Source Vulnerability Database (OSVDB) has a motto of "Everything is Vulnerable"; the more you know about security, reverse engineering and hacking, the more you realize how true that is. Last week, OSVDB reviewed the Secunia 2013 Vulnerability Review and then contacted me to say Secunia's 2013 report included statistics that were questionable or downright wrong. The problem, according to OSVDB, starts with Secunia's methodology for counting vulnerabilities, which produces counts that "cannot be realistically duplicated." With the counts wrong, everything built on them is thrown off as well.

It's always a bit baffling how some organizations come up with their stats; cybercrime losses, for example, are generally bloated and a bunch of bunk. If a vulnerability report is misleading, then I can only imagine the amount of aggravation it causes some people, such as the gentlemen who presented "Buying Into the Bias: Why Vulnerability Statistics Suck" at Black Hat 2013.

At that time, Jericho, the content manager of the Open Source Vulnerability Database (OSVDB), and Steve Christey, the editor of the Common Vulnerabilities and Exposures (CVE) list, announced, "Most of these statistical analyses are faulty or just pure hogwash. They use the easily-available, but drastically misunderstood data to craft irrelevant questions based on wild assumptions, while never figuring out (or even asking us about) the limitations of the data. This leads to a wide variety of bias that typically goes unchallenged, that ultimately forms statistics that make headlines and, far worse, are used for budget and spending."

During their presentation, they added, "As maintainers of two well-known vulnerability information repositories, we're sick of hearing about sloppy research after it's been released, and we're not going to take it any more."

With that in mind, I'd really like to share some of what OSVDB's Jericho said about the Secunia vulnerability report. While I can't give you stats for stats, because it's like trying to compare grapes to peanut butter, let's look at the differing definitions of "third-party" software. Secunia claimed "the findings in the Secunia Vulnerability Review 2014 support that, once again, the biggest vulnerability threat to corporate and private security comes from third-party - i.e. non-Microsoft - programs."

OSVDB wrote:

The notion that "non-Microsoft" software is "third-party" is very weird for lack of better words, and shows the mindset and perspective of Secunia. This completely discounts users of Apple, Linux, VMs (e.g. Oracle, VMware, Citrix), and mobile devices among others. Such a Microsoft-centric report should clearly be labeled as such, not as a general vulnerability report.

Clearly a different definition for third-party software will throw everything off when trying to make a comparison, but the methodology for counting vulnerabilities will too. OSVDB explained that Secunia missed some vulnerabilities altogether and duplicated others, making the numbers "incorrect and entirely misleading." Take, for example, this case where "a protocol is found to have a vulnerability, such as the 'TLS / DTLS Protocol CBC-mode Ciphersuite Timing Analysis Plaintext Recovery Cryptanalysis Attack' (OSVDB 89848). This one vulnerability impacts any product that implements that protocol, so it is expected to be widespread. As such, that one vulnerability tracks to 175 different Secunia advisories."

If you'd still like to compare real numbers, then check this out: Secunia said, "The actual vulnerability count in Microsoft programs was 192 in 2013; 128.6% higher than in 2012." However, OSVDB said, "Based on our data, there were 363 vulnerabilities in Microsoft software in 2013, not 192. This is up from 207 in 2012, giving us a 175.3% increase."
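Part of why such headline percentages are hard to compare is that "X% higher" and "X% of" are different calculations that are often conflated. A minimal sketch (my own arithmetic, not from either report) using OSVDB's counts of 207 and 363 shows how far apart the two conventions land:

```python
def pct_increase(old, new):
    """Percentage increase from old to new: (new - old) / old * 100."""
    return (new - old) / old * 100

def pct_of(old, new):
    """New count expressed as a percentage of the old count."""
    return new / old * 100

# OSVDB's figures: 207 Microsoft vulnerabilities in 2012, 363 in 2013.
print(round(pct_increase(207, 363), 1))  # 75.4  -> a 75.4% increase
print(round(pct_of(207, 363), 1))        # 175.4 -> 2013 is ~175% of 2012
```

The same 207-to-363 jump can honestly be reported as "up 75%" or "at 175% of last year," which is one more reason vulnerability statistics deserve a clearly stated methodology.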

Here are a few more key points from OSVDB's review:

  • There is a high rate of duplicates and a lack of unique identifiers in the Secunia report, which makes the data set they used too convoluted for meaningful statistics.
  • The Secunia Vulnerability Criticality Classification system does not help organizations truly understand their risk.
  • The report makes statements about "third party programs" in broad terms that are inaccurate and focus heavily, if not solely, on Windows.
  • The flawed methodology used to generate the statistics cascades into a wide variety of other incorrect conclusions.

OSVDB has a detailed post disputing Secunia's report. I highly recommend reading "Reviewing the Secunia 2013 Vulnerability Review" in full, but OSVDB ends with:

In conclusion, while we appreciate companies sharing vulnerability intelligence, the Secunia 2013 vulnerability report is ultimately fluff that provides no benefit to organizations. The flawed methodology and inability for them to parse their own data means that the conclusions cannot be relied upon for making business decisions. When generating vulnerability statistics, a wide variety of bias will always be present. It is absolutely critical that your vulnerability aggregation methodology be clearly explained so that results are qualified and have more meaning.


Follow me on Twitter @PrivacyFanatic
