Days of Risk in 2006: Client OS Products

Opinion
Jun 18, 2007 | 5 mins
Data and Information Security

This is the third days-of-risk (DoR) post I've done recently; you may want to read the other two first.

In this post, I want to take a look similar to the previous one, but instead of averaging across all of a vendor's OS products, look only at the most modern client OS from each vendor that was available for the full year of 2006: Microsoft Windows XP, Red Hat Enterprise Linux 4 WS, Novell Linux Desktop 9, and Apple Mac OS X 10.4.

Because I'm looking at individual products, and because several people have indicated they thought I should also identify how many vulnerabilities contribute to the average DoR, I will also break out the vulnerability counts, though that is primarily just to provide assurance that the averages aren't based upon a statistically insignificant set of vulnerabilities.

Average DoR

During 2006, the four client OS products had the following number of fixes:

  • Windows XP SP2 had 90 fixes, of which 44 were High severity
  • Red Hat RHEL4WS had 301 fixes, of which 91 were High severity
  • Novell NLD9 had 232 fixes, of which 74 were High severity
  • Mac OS X 10.4 had 129 fixes, of which 35 were High severity

A couple of other important caveats to keep in mind:

  • I am calculating the average DoR for those vulnerabilities fixed during 2006, not just the days of risk that occurred *in* 2006. To illustrate, imagine a vulnerability that was made public on January 1, 2005 and fixed on December 31, 2006. That vulnerability contributes 730 days of risk, even though only 365 of those days fall in 2006; I count it as 730 when computing the average.
  • Also, a product can't have days of risk before it ships. So, if a product shipped on 11/8/2004 but a vulnerability had been public since 2002, I don't count any of the days before 11/8/2004. (Both rules are sketched in the example just after this list.)
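
Putting those two caveats together, here is a minimal sketch of the per-vulnerability calculation in Python. The dates are the ones from the examples above, and the +1 reflects the post's endpoint-inclusive counting (January 1, 2005 through December 31, 2006 is counted as 730 days):

```python
from datetime import date

def days_of_risk(disclosed: date, fixed: date, shipped: date) -> int:
    """Days a vulnerability was publicly known but unfixed for one product.

    Exposure can't begin before the product shipped, and a patch issued
    before public disclosure counts as zero days of risk.
    """
    start = max(disclosed, shipped)
    return max((fixed - start).days + 1, 0)

# The example from the caveats: disclosed January 1, 2005, fixed
# December 31, 2006, on a product that shipped November 8, 2004.
print(days_of_risk(date(2005, 1, 1), date(2006, 12, 31), date(2004, 11, 8)))  # 730
```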

With that in mind, this first chart shows the average DoR for the client OSes for 2006. It is somewhat surprising to me that three of the products are so close in average DoR across vulnerabilities of all severities.

Additionally, since I am charting individual products this time, it may be interesting to also look at the median DoR value for the set of products:

  • Windows XP – the median was 0 (zero), indicating that over half of the issues had a patch provided before the vulnerability was publicly disclosed. 
  • Novell NLD9 – the median was 18. 
  • Red Hat RHEL4WS – the median was 30. 
  • Mac OS X 10.4 – the median was 0 (zero), indicating that over half the issues had a patch provided before the vulnerability was publicly disclosed.

Given the relatively low medians, the implication is that each OS also had some set of vulnerabilities that took quite a while to fix, driving the average DoR up to where it is. So we'll look at High severity issues next and see whether those long-fix-time issues fall out as lower severity ones.
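
To make the mean-versus-median point concrete, here is a tiny illustration with made-up DoR values, showing how a few long-tail fixes pull the mean well above the median:

```python
from statistics import mean, median

# Hypothetical per-vulnerability DoR values: most issues patched at or
# shortly after disclosure, plus two long-tail fixes.
dor_values = [0, 0, 0, 5, 18, 30, 240, 410]

print(median(dor_values))  # 11.5, barely moved by the outliers
print(mean(dor_values))    # 87.875, pulled up by the two long-tail fixes
```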

Average DoR – High Severity

Examining just the High severity issues has a few benefits. First, it eliminates lower severity vulnerabilities that many customers may ignore, by policy or practice. Second, if some vendors seriously de-prioritize lower severity issues, this view removes those lesser priority issues so that their long fix times do not inflate the average. Finally, some folks have raised the issue that some companies may choose not to fix certain very low severity issues at all; that is hard to capture in a DoR metric, but it is largely eliminated from consideration when looking at just the High severity vulnerabilities.
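
As a sketch, restricting the metric to High severity issues is just a filter over the per-vulnerability records before averaging; the record layout and values below are illustrative, not drawn from the actual data set:

```python
from statistics import mean

# Hypothetical records: (days of risk, NVD severity rating).
vulns = [(0, "High"), (12, "Medium"), (45, "High"), (210, "Low")]

high_dor = [dor for dor, severity in vulns if severity == "High"]
print(mean(high_dor))  # 22.5, averaged over the High severity issues only
```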

Looking at the next chart, we see quite a bit more differentiation, indicating that vendors did in fact move to fix High severity issues more quickly than the average vulnerability.

I will stop there, since I've covered the main differences I wanted to cover, but I will finish by repeating the general sourcing and methodology used in both previous DoR blog posts. The only real difference is that here I limited the analysis to vulnerabilities affecting the most recent vendor client products.

A Very Brief Methodology Description

I did the vulnerability compilation and analysis myself, using only vulnerabilities that were confirmed by the vendors' own security advisories. The sources are the security advisories published by the four vendors in question: Microsoft, Red Hat, Novell, and Apple.

If a vendor addressed the same issue multiple times on the same product, I used the last fix date as the final date when the issue was addressed.
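
Here is a minimal sketch of that de-duplication rule, assuming hypothetical fix records keyed by CVE ID and product (all identifiers and dates are invented):

```python
from datetime import date

# Hypothetical fix records: (cve_id, product, fix_date).
fixes = [
    ("CVE-2006-1111", "rhel4ws", date(2006, 3, 1)),
    ("CVE-2006-1111", "rhel4ws", date(2006, 7, 15)),  # same issue fixed again
    ("CVE-2006-2222", "nld9", date(2006, 4, 20)),
]

# Keep the last fix date per (CVE, product) pair.
final_fix = {}
for cve, product, fixed in fixes:
    key = (cve, product)
    if key not in final_fix or fixed > final_fix[key]:
        final_fix[key] = fixed

print(final_fix[("CVE-2006-1111", "rhel4ws")])  # 2006-07-15
```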

For severity information, I used the National Vulnerability Database (NVD, http://nvd.nist.gov), sponsored by the US Department of Homeland Security, as a source for independent severity ratings defined consistently across all of the products.

For the dates of public disclosure, I used my own disclosure database, which I have compiled over the past several years. In general, the process for each vulnerability is as follows:

  1. Look at the CVE entry on http://cve.mitre.org or http://nvd.nist.gov (which includes the former).
  2. For each reference on that site, look at the reference page and see what date it was published or released. Record that date and the reference.
  3. If any of the references acknowledge a researcher for the disclosure, try to find a reference to the original disclosure and use that.
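
As a sketch of steps 2 and 3, one way to model the result is to record a publication date for each reference and treat the earliest as the public disclosure date (the references and dates below are invented):

```python
from datetime import date

# Hypothetical (reference, publication date) pairs gathered for one CVE.
references = [
    ("vendor advisory", date(2006, 5, 9)),
    ("mailing-list post", date(2006, 5, 2)),
    ("researcher write-up", date(2006, 5, 2)),  # the original disclosure
]

# The earliest published reference approximates the public disclosure date.
disclosed = min(pub_date for _, pub_date in references)
print(disclosed)  # 2006-05-02
```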

Jeff Jones is a 24-year security industry professional who has spent the last several years at Microsoft helping drive security and privacy progress as part of the Trustworthy Computing group. In this role, Jeff draws upon his security experience to work with enterprise CSOs and Microsoft's internal security teams to drive practical and measurable security improvements into Microsoft processes and products. Prior to Microsoft, Jeff was the vice president of product management for security products at Network Associates, where his responsibilities included the PGP, Gauntlet and CyberCop products, and several improvements in the McAfee product line. These latest positions cap a career focused on security, managing risk, building custom firewalls and being involved in DARPA security research projects while part of Trusted Information Systems. Jeff is a frequent global speaker and writer on security topics ranging from the very technical to more high-level, CxO-focused topics such as security TCO and metrics. Jeff is also a contributor to the Microsoft Security Blog (http://blogs.technet.com/security) and writes on a wide range of personal interests (e.g. books, poker, gaming) at http://securityjones.com.