Patching Software: The Big Fix


General Magic officials say they weren't surprised by the language in the contract, but many experts say the company has to be pretty confident in its products to sign off. The effect of the contract, though, is to improve software in general. The vendor must make secure applications, or fix them so they're secure, to conform to its contract with a customer, and that makes the software better for everyone.

Clout is not limited to the Fortune 500. Sure, it's easy for GE to write such a contract, given that GE is part of the Fortune 2. And there's nothing wrong with CSOs benefiting from GE's clout, the corporate equivalent of drafting in auto racing.

But there are other ways for CSOs at companies smaller than GE (which is everyone but Wal-Mart) to force the issue with vendors. One can join the Sustainable Computing Consortium at Carnegie Mellon University, or the Internet Security Alliance, formed under the Electronic Industries Alliance. These interest groups help companies of all sizes band together on standardizing contract language and best practices for software development.

Some are taking satisfaction in a good old-fashioned boycott, even if they are so small as to escape the vendor's notice. Newnham College at the University of Cambridge in England, with 700 users, recently banned Microsoft's Outlook from use on campus because of the virus problem.

Much of the clout CSOs gain will come from the market evolving. In a sense, the software makers create clout for the CSO by asking her to deploy the product for ever more critical business tasks. At some point, the potential damage an insecure product could inflict will dictate whether it will be purchased.

"Two years ago, the marketing strategy was to just get it out there. And some of the stuff that went out was really insecure," says the anonymous ISO at the large financial institution. "But now, we just say, applications don't go live without security. It's a sledgehammer."

And it's not a randomly wielded one either. His company has created a formal process to assess vendors' applications, as well as his own company's software development. It includes auditing and penetration testing, and it requires vendors to conform to overarching security criteria, such as eliminating buffer overflows. It's not unusual, the security officer says, for his group to spend $40,000 per quarter testing and breaking a single application.

"Customers are vetting us," says Davidson. "Not just kicking the tires, but they're asking how we handle vulnerabilities. Where is our code stored? Do we do regression testing? What are our secure coding standards? It's impressive, but it's also just plain necessary.

"They have to be demanding. If customers don't make security a basic criteria, they lose their right to complain in a lot of ways when things go bad," she says.

At the bank, the security officer says, is a running list of vendors that are "certified," that is, they've successfully met the application security criteria by going through the formal process. The list is incentive for vendors to clean up their code, because if they're certified, they have an advantage over those that aren't the next time they want to sell software. Vendors, he says, "have either gone broke trying to satisfy our criteria, or they run through the operation pretty well. A few see what we demand and just run away. But there doesn't seem to be any middle ground."

The government is taking an active role. The image of the government in security is that of a clumsy organization tripping over its own red tape. But right now, at least in terms of application security, the government is a driving force, and its efforts to improve software are putting the private sector to shame.

In fact, no industry has been more effective in the past year at pushing vendors into security or using its clout (often, that comes in the form of regulation) to effect change.

At the state level, legislatures have collectively ignored the Uniform Computer Information Transactions Act (UCITA), a complex law that would in part reduce liability for software vendors (most major vendors have backed UCITA).

Federally, money has poured into the complex skein of agencies dealing with critical infrastructure protection, which has taken on a life of its own since 9/11. Equally important but not as well publicized, the feds fully implemented in July the National Security Telecommunications and Information Systems Security Policy No. 11, called NSTISSP (pronounced nissTISSip), after a two-year phase-in. The policy dictates that all software that's in some way used in a national security setting must pass independent security audits before the government will purchase it.

The government has for more than a decade tried to implement such a policy, but it has been put off. Vendors have routinely been able to receive waivers through loopholes in order to avoid the process. The July move is considered a line in the sand. With national security on everyone's mind, experts believe waivers will be harder to come by. The Navy is telling kvetching vendors to use NSTISSP no. 11 as a way to gain a competitive advantage. At any rate, products will have to be secured, or the government won't buy them. Like GE's contract, this makes software better for everyone.

The ability of the public sector to whip vendors into shape on application security is best represented, though, by John Gilligan, CIO of the Air Force, who in March told Microsoft to make better products or he'll take his $6 billion budget elsewhere. It was a challenge by proxy to all software vendors. At the time, Gilligan said he was "approaching the point where we're spending more money to find patches and fix vulnerabilities than we paid for the software." And he wasn't shy about labeling software security a "national security issue."

Microsoft Chief Security Strategist Charney called himself a "nudge and a pest by nature," and he may have found his counterpart in Gilligan, who in addition to mobilizing the Air Force is encouraging other federal agencies to use similar tactics. Gilligan says he was encouraged by Bill Gates's notorious "Trustworthy Computing" memo, his mea culpa proclamation in January that Microsoft software must get more secure, but that "the key will be, what's the follow-through?"

Nudging Vendors

Gilligan is right, and clever, to invoke patches as a major part of his problem. If a vendor isn't convinced by proof of the ROI in securing applications early, or by the favor large customers grant those who submit to a certification process or a contract with strong language, then patches might do the trick.

Patches are like ridiculously complex tourniquets. They are the terrible price everyone, vendors and CSOs alike, pays for 30 years of insecure application development. And they are expensive. Davidson at Oracle estimates that one patch the company released cost Oracle $1 million. Charney won't estimate. But what's clear is that the economics of patching is quickly getting out of hand, and the vendors appear to be motivated to ameliorate the problem.

At Microsoft, it starts with security training, required for all Microsoft programmers as a result of Gates's memo. Michael Howard, coauthor of Writing Secure Code, and Steve Lipner, manager of Microsoft's security center (Patch Central), are running the effort to make Microsoft software more secure.

The training establishes new processes (coding through defense in depth, that is, writing your piece of code as if everything around your code will fail). It sets new rules (security goals now go in requirements documents at Microsoft; insecure drivers are summarily removed from programs, a practice that Richardson says would have been heresy not long ago). And it creates a framework for introducing Microsoft teams to the concept of managed code (essentially, reusable code that comes with guarantees about its integrity).

A year and several hundred million dollars later, it's still not clear if the two-day security training for Microsoft's developers is giving them a fish, or teaching them to fish. Richardson seems to believe the latter. She says the training starts with "religion, apple pie and how-we-have-to-save-America speeches." And, she says, it includes at least one tough lesson: "You can't design secure code by accident. You can't just start designing and think, Oh, I'll make this secure now. You have to change the ethos of your design and development process. To me, the change has been dramatic and instant."

To Microsoft customers, it's a more muted reaction. Since Gates's proclamation, gaping security holes have been found in Internet Information Server 5.0, reminding the world that legacy code will live on. Even the company's gaming console, Xbox, was cracked, indicating the pervasiveness of the insecure development ethos and how hard it will be to change.

Microsoft also faces an extremely skeptical community of CSOs and other security watchdogs. Don O'Neill, executive vice president for the Center for National Software Studies, says, "When it comes to trustworthy software products, Microsoft has forfeited the right to look us in the face."

So let's end where conversations about application security usually begin: Microsoft.

Richardson's reaction to Gates's memo was not much different than anyone else's. "I wondered how much of this was a marketing issue compared with a real consumer issue," she says.

The memo has become a reference point in the evolution of application security, the event cited as the start of the current sea change. In truth, the tides were turning for a year or more, and if a date must be given, it would be Sept. 18, 2001, one week after 9/11 and the day that the Nimda worm hit. Microsoft's entering the fray, as it did with the Internet in 1995, also via a memo, is more an indication that the latecomers have arrived, a sort of cultural quorum call.

It was, "We're all here so let's get started," the beginning of the era of application security as a real discipline, and not an oxymoron.

Copyright © 2002 IDG Communications, Inc.
