Days-of-risk in 2006: Linux, Mac OS X, Solaris and Windows

With my Basic Guide to Days of Risk providing an introduction, I wanted to do a check-up on days-of-risk for 2006 for Windows and the main Enterprise OS distributions.  Also, I thought it would be interesting to see whether the vendors really do prioritize High severity issues, and if so, by how much over the average for all issues.

Average OS Days-of-Risk

The first comparison I wanted to look at was how the vendors did in general for security response across their Operating System (OS) products.  Because many customers who have selected a vendor's OS have deployments that span multiple versions, this view looks at the average security response time, in terms of DoR, across the supported product set.  By vendor, here are the products included:

  • Apple:  Mac OS X, any version patched in 2006
  • Microsoft:  Windows 2000 (Professional and Server), Windows XP, and Windows Server 2003.  Windows Vista is not included since it was only available for one month in 2006 and had no fixes.
  • Red Hat:  Red Hat Enterprise Linux 2.1, Red Hat Enterprise Linux 3, and Red Hat Enterprise Linux 4
  • Novell:  SUSE Linux Enterprise Server 8, SUSE Linux Enterprise Server 9, SUSE Linux Enterprise Server 10, Novell Linux Desktop 9, and SUSE Linux Enterprise Desktop 10
  • Sun:  Any Solaris version patched in 2006

Note that for the process, I took a union of the vulnerabilities and combined any vulnerability into one instance if it was fixed on multiple OS versions on the same day.  For example, if a public vuln was fixed in SLES9 and SLES10 on the same day, it counted as only one instance.  If the same vuln was fixed in SLES9 and SLES10 on different days, say a week apart, then it counted as two different instances in calculating the averages.

If one vuln was fixed in multiple components of a single product on different days, then the vuln was only considered fixed once the final component was fixed.  For example, if a vulnerability was public on January 1st that affected both Firefox and Thunderbird in RHEL3 and a patch was released for Firefox on January 10th and Thunderbird on January 15th, this would count as one instance for RHEL3 with 15 DoR (and not two instances with lengths of 10 and 15, respectively).
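
To make those two rules concrete, here is a minimal sketch of how the instance counting could be done.  The record layout and data are hypothetical (this is not my actual tooling), and the day count includes both endpoints so that it matches the 10- and 15-day figures in the example above:

```python
from collections import defaultdict
from datetime import date

# Illustrative records only: (cve_id, os_version, component, fix_date)
records = [
    ("CVE-2006-1234", "SLES9",  "firefox",     date(2006, 1, 10)),
    ("CVE-2006-1234", "SLES10", "firefox",     date(2006, 1, 10)),  # same day as SLES9
    ("CVE-2006-1234", "RHEL3",  "firefox",     date(2006, 1, 10)),
    ("CVE-2006-1234", "RHEL3",  "thunderbird", date(2006, 1, 15)),  # last component fixed
]
disclosed = {"CVE-2006-1234": date(2006, 1, 1)}

# Rule 2: a product is only "fixed" once its final affected component is fixed.
final_fix = defaultdict(lambda: date.min)
for cve, version, component, fixed in records:
    final_fix[(cve, version)] = max(final_fix[(cve, version)], fixed)

# Rule 1: fixes for the same vuln released on the same day merge into one instance.
instances = {(cve, fixed) for (cve, _version), fixed in final_fix.items()}

# Days-of-risk per instance, counting both endpoints (Jan 1 -> Jan 15 is 15 days).
dor = [(fixed - disclosed[cve]).days + 1 for cve, fixed in instances]
print(sum(dor) / len(dor))  # average DoR across instances
```

Here the SLES9 and SLES10 fixes collapse into one 10-day instance, and RHEL3 contributes a second, 15-day instance once Thunderbird is patched.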

[Chart: 2006 Days of Risk]

We see in this first chart of the average Days-of-Risk that during 2006, Microsoft provided fixes for publicly disclosed vulnerabilities the quickest on average, at about 29 days, while Sun came in at the far end with the highest average DoR.

Average OS DoR - High Severity

As the Linux vendors commented in a common statement regarding the Forrester DoR study: "Each vulnerability gets individually investigated and evaluated; the severity of the vulnerability is then determined ... This severity is then used to determine the priority at which a fix for a vulnerability is being worked on ... This prioritization means that lower severity issues will often be delayed to let the more important issues get resolved first."

I know that Microsoft also gives priority to more severe issues, and I imagine that most vendors do, so if we look at the average DoR for only High severity issues, we should see a decrease.  And indeed we do, with the exception of Sun.

[Chart: 2006 DoR High Severity Only]
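
Restricting the average to High severity issues is just a filter on the instance data before averaging.  A standalone sketch with made-up CVE numbers and dates, assuming severity ratings keyed by CVE id as described in the methodology below:

```python
from datetime import date

# Hypothetical instances: (cve_id, disclosure_date, final_fix_date)
instances = [
    ("CVE-2006-1234", date(2006, 1, 1), date(2006, 1, 15)),
    ("CVE-2006-5678", date(2006, 2, 1), date(2006, 5, 1)),
]
# Assumed NVD-style severity rating for each CVE.
severity = {"CVE-2006-1234": "High", "CVE-2006-5678": "Low"}

high_dor = [(fix - pub).days + 1 for cve, pub, fix in instances
            if severity[cve] == "High"]
print(sum(high_dor) / len(high_dor))  # average DoR, High severity only
```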

Year-over-Year DoR Trend

Next, I thought it would be interesting to see if the 2006 DoR values were increasing, decreasing or about the same as the previous year.  I didn't have any idea how this one would turn out, but it is interesting to note that while the average DoR got slightly longer for all of the vendors except Apple, they also kept within a relatively narrow band.

[Chart: 2006 over 2005 DoR trend]

I'll stop there for now, but I plan to follow up next week with a look at whether vendors give priority to newer products over older products, or whether they get treated more uniformly.

A Very Brief Methodology Description

I did the vulnerability compilation and analysis myself, utilizing only vulnerabilities that were confirmed in the vendors' own security advisories.  The sources are each vendor's published security advisories (Apple, Microsoft, Red Hat, Novell, and Sun).

If a vendor addressed the same issue multiple times on the same product, I used the last fix date as the final date when the issue was addressed.  If a vulnerability was fixed in different products on different days, I tracked them as separate instances.

For severity information, I used the US Department of Homeland Security-sponsored National Vulnerability Database (NVD, http://nvd.nist.gov) as a source for independent severity ratings that were defined consistently across all of the products.
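
NVD derives its Low/Medium/High rating from each entry's CVSS base score.  As a sketch, assuming NVD's standard CVSS banding applies to every 2006 entry used here, the mapping looks like this:

```python
def nvd_severity(cvss_base: float) -> str:
    """Map a CVSS base score to NVD's Low/Medium/High severity bands."""
    if cvss_base >= 7.0:
        return "High"
    if cvss_base >= 4.0:
        return "Medium"
    return "Low"

assert nvd_severity(9.3) == "High"
assert nvd_severity(5.0) == "Medium"
```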

For the dates of public disclosure, I used my own disclosure database, which I have compiled over the past several years.  In general, the process for each vulnerability is as follows (see the sketch after this list):

  1. Look at the CVE entry on http://cve.mitre.org or http://nvd.nist.gov (which includes the former).
  2. For each reference on that site, look at the reference page and see what date it was published or released.  Record that date and the reference.
  3. If any of the references acknowledge a researcher for the disclosure, try to find a reference to the original disclosure and use that.
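
Assuming the original disclosure is the earliest dated reference found, the net effect of those three steps is to take the minimum of the recorded dates.  A minimal sketch with hypothetical references:

```python
from datetime import date

# Hypothetical (reference_url, date_published) records from steps 1-3.
references = [
    ("http://example.com/vendor-advisory", date(2006, 3, 2)),
    ("http://example.com/researcher-post", date(2006, 2, 27)),  # original disclosure
]

# Public disclosure date = earliest dated reference on record.
disclosure_date = min(d for _url, d in references)
print(disclosure_date)  # 2006-02-27
```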
