In 1980, the World Health Organization declared smallpox eradicated. Yet by the end of this year, millions of health-care personnel and other first responders will have to be immunized against smallpox. How does an allegedly extinct disease become a national risk 20 years later? Because, in the absence of vaccination, a large percentage of the population now shares the same vulnerability. And once a virus gets started, it's hard to stop.
This idea is just as relevant to communities of computers as it is to people, and it illustrates an unappreciated principle of systems in general and networks in particular: a diverse population is far more resistant to attack than a uniform one.
But, in truth, a large percentage of the population does use the same computer platform. The antitrust case against Microsoft was meant to protect free trade, but an argument could be made that the government should also take steps to protect technodiversity for security's sake. Even a benevolent monopoly is dangerous because it becomes indispensable. If a virus or worm targets those ubiquitous systems, we are all affected because there is no vaccinated population able to withstand the attack.
Standardization, for all its benefits, is insidious because it enables virulent attacks to spread everywhere through common communications protocols, faster than an open-mouthed sneeze in Grand Central Station at rush hour.
Exacerbating this problem are convenience features built on top of a homogenized computer environment, such as automatic patching. Patching software used to be a low-priority task for administrators; it was common to see different releases of programs running side by side. That might have been a bit of an administrative headache, but it actually worked as a benefit to a network's immune system.
Unfortunately, today's applications upgrade themselves automatically. Bugs, glitches and holes that would have affected only early adopters or a few computers on a network can now become an epidemic before they're even spotted. The convenience of automation has led to uniformity, and uniformity in turn has enabled mass exposure to viral threats.
Diversity creates a natural firebreak for computers. I have never seen a virus that can infect both Linux and Windows boxes, and only a few can cross between Macs and PCs. In fact, the earliest warning of a network attack is often a log entry caused by one such system rejecting a virus even as the other system is infected.
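That early-warning effect is easy to picture. During the Code Red outbreak, for example, Linux servers running Apache logged the worm's "default.ida" probes as failed requests while vulnerable IIS machines were being infected. The sketch below, with illustrative signatures and log lines (not a real detection product), shows how an administrator might mine an immune system's logs for probes aimed at its more vulnerable neighbors:

```python
# Sketch: treating a Linux web server's access log as an early-warning
# sensor for Windows-targeted worms. The signature list and log lines
# here are illustrative; real Code Red probes did contain "default.ida".

WORM_SIGNATURES = {
    "default.ida": "Code Red (IIS worm probe)",
    "cmd.exe": "Nimda-style traversal probe",
}

def scan_log(lines):
    """Return (line_number, worm_name) for each rejected probe found."""
    hits = []
    for i, line in enumerate(lines, start=1):
        for sig, name in WORM_SIGNATURES.items():
            if sig in line:
                hits.append((i, name))
    return hits

# Hypothetical log excerpt: the second entry is a worm probe the
# Linux box rejected with a 404 -- harmless to it, but a warning sign.
sample = [
    '10.0.0.5 - - "GET /index.html HTTP/1.0" 200',
    '10.0.0.9 - - "GET /default.ida?NNNN HTTP/1.0" 404',
]

for lineno, worm in scan_log(sample):
    print(f"early warning: line {lineno} looks like {worm}")
```

The immune machine pays nothing for this service; simply by being a different platform, it converts an attack into a log entry.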
I'm not advocating that companies create fully redundant hardware and software environments. That, of course, is not financially feasible. On the other hand, it's good practice to be wary, in general, of single points of failure, whether hardware, software or human. Single-vendor solutions will always create such a weakness. What's more, homogeneity encourages sloppy internal practices by "certified" security experts who have been trained to use a specific application and who don't have the foundational expertise to adapt to new situations, to diversify.
But introducing even a token number of Unix workstations or servers forces the Windows administrative staff to learn the basics of other systems and reduces the corporate dependence on a single line of technologies.
What does it all mean? The conveniences that homogeneity and features such as patching provide might not be so great after all. It would be telling, for instance, to measure the benefits of homogeneity against one major virus attack like Slammer. Without doing calculations, it's not hard to imagine that one bout with Slammer costs more than the convenience that standard features give you over the course of a year.
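The back-of-envelope version of that comparison takes only a few lines. Every number below is a placeholder assumption chosen for illustration, not measured data, but the shape of the result is the point: a single worm incident can dwarf a year of convenience savings.

```python
# Back-of-envelope sketch of convenience savings vs. one worm incident.
# All figures are placeholder assumptions, not measured data.

admins = 10
hours_saved_per_admin_per_week = 2   # assumed time saved by standardization
hourly_cost = 75                     # assumed loaded labor rate, dollars

annual_convenience = admins * hours_saved_per_admin_per_week * 52 * hourly_cost

outage_hours = 24                    # assumed downtime from a Slammer-style worm
employees_idle = 500                 # assumed staff affected by the outage
cleanup_cost = 100_000               # assumed recovery and remediation cost

incident_cost = outage_hours * employees_idle * hourly_cost + cleanup_cost

print(f"convenience savings per year: ${annual_convenience:,}")
print(f"cost of one incident:         ${incident_cost:,}")
```

Under these assumptions the single incident costs more than a decade of convenience; adjust the numbers however you like, and the incident side tends to win.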
All of which leads to some counterintuitive advice for the security conscious: CSOs should make an effort to slow down the rate of standardization in the enterprise. Use a combination of Linux and Windows, and don't be too quick to apply a patch unless something is already broken. Turn off automatic updates. Buy equipment from more than one manufacturer; a good mix is 70:30. There's also a cost advantage, because competition drives better deals.
Hedge your bets