Let's start where conversations about software usually end: Basically, software sucks.
In fact, if software were an office building, it would be built by a thousand carpenters, electricians and plumbers. Without architects. Or blueprints. It would look spectacular, but inside, the elevators would fail regularly. Thieves would have unfettered access through open vents at street level. Tenants would need consultants to move in. They would discover that the doors unlock whenever someone brews a pot of coffee. The builders would provide a repair kit and promise that such idiosyncrasies would not exist in the next skyscraper they build (which, by the way, tenants will be forced to move into).
Strangely, the tenants would be OK with all this. They'd tolerate the costs and the oddly comforting rhythm of failure and repair that came to dominate their lives. If someone asked, "Why do we put up with this building?" shoulders would be shrugged, hands tossed and sighs heaved. "That's just how it is. Basically, buildings suck."
The absurdity of this is the point, and it's universal, because the software industry is strangely irrational and antithetical to common sense. It is perhaps the first industry ever in which shoddiness is not anathema but expected. Guy Kawasaki even codified the attitude in his book Rules for Revolutionaries, with a rule that became a mantra: Don't worry, be crappy.
The only thing more shocking than the fact that Kawasaki's iconoclasm passes as wisdom is that executives have spent billions of dollars endorsing it. They've invested in release after release of software that ships broken and gets patched later, and billions more in security products to guard it.
"We've developed a culture in which we don't expect software to work well, where it's OK for the marketplace to pay to serve as beta testers for software," says Steve Cross, director and CEO of the Software Engineering Institute (SEI) at Carnege Mellon University. "We just don't apply the same demands that we do from other engineered artifacts. We pay for Windows the same as we would a toaster, and we expect the toaster to work every time. But if Windows crashes, well, that's just how it is."
A complex set of factors is conspiring to create a cultural shift away from the defeatist tolerance of "that's just how it is" toward a new era of empowerment. Not only can software get better, it must get better, say executives. They wonder, Why is software so insecure? and then, What are we doing about it?
In fact, there's good news when it comes to application security, but it's not the good news you might expect. Application security is changing for the better in a far more fundamental and profound way. Observers invoke the automotive industry's quality wake-up call in the '70s. One security expert summed up the quiet revolution with a giddy, "It's happening. It's finally happening."
Even Kawasaki seems to be changing his rules. He says security is a migraine headache that has to be solved. "Don't tell me how to make my website cooler," he says. "Tell me how I can make it secure."
"Don't worry, be crappy" has evolved into "Don't be crappy." Software that doesn't suck. What a revolutionary concept.
Why Is Software So Insecure?
Software applications lack viable security because, at first, they didn't need it. "I graduated in computer science and learned nothing about security," says Chris Wysopal, technical director at security consultancy @Stake. "Program isolation was your security."
The code-writing trade grew up during an era when only two things mattered: features and deadlines. Get the software to do something, and do it as fast as possible. Cyra Richardson, a developer at Microsoft for 12 years, has written code for most of the company's major pieces of software, including Windows 3.1. "The measure of a great app then was that you did the most with the fewest resources," she says.
Networking changed all that. It allowed someone to hack away at your software from somewhere else, largely undetected. But it also meant that more people were using computers, so there was more demand for software. That led to more competition. Software vendors coded frantically, racing one another to market with ever more features.
Now, features make software do something, but they don't stop it from unwittingly doing something else at the same time. E-mail attachments, for example, are a feature. But e-mail attachments also help spread viruses. That is an unintended consequence, and unintended consequences are where security holes live.
As networking spread and featureitis took hold, some systems were compromised. The worst case was in 1988 when a graduate student at Cornell University set off a worm on the ARPAnet that replicated itself to 6,000 hosts and brought down the network. At the time, events like that were the exception.
By 1996, the Internet supported 16 million hosts, and compromises like that were no longer the exception.
Even today, the software development methodologies most commonly used still cater to deadlines and features, and not security. "We have a really smart senior business manager here who controls a large chunk of this corporation but hasn't a clue what's necessary for security," says an information security officer at one of the largest financial institutions in the world. "She looks at security as, Will it cost me customers if I do it? She concludes that requiring complicated, alphanumeric passwords means losing 12 percent of our customers. So she says no way."
Software development has been able to maintain its old-school, insecure approach because the technology industry adopted a less-than-ideal fix for the problem: security applications, a multibillion-dollar industry's worth of new code to layer on top of programs that remain foundationally insecure. But there's an important subtlety. Security features don't improve application security. They simply guard insecure code and, once bypassed, can allow access to the entire enterprise.
That's triage, not surgery. In other words, the industry has put locks on the doors but not on the loading dock out back. Instead of securing networking protocols, firewalls are thrown up. Instead of building e-mail programs that defeat viruses, antivirus software is slapped on.
When the first major wave of Internet attacks hit in early 2000, security software was the savior, brought in at any expense to mitigate the problem. But attacks kept coming, and more recently, security software has lost much of its original appeal. That disillusionment has buyers looking past the locks on the doors to the insecure code underneath.
In addition, a bevy of new research has demonstrated that there is an ROI, for vendors and users alike, in building more secure code. Plus, a new class of software tools has been developed to automatically ferret out the most egregious software flaws.
Put it all together, and you get, for the first time, real pressure on vendors to write secure code.
Mary Ann Davidson, CSO at Oracle, claims that now "no one is asking for features; they want information assurance. They're asking us how we secure our code." Adds Scott Charney, chief security strategist at Microsoft, "Suddenly, executives are saying, We're no longer just generically concerned about security."
So What Are We Doing About It?
Specifically, all this concern has led to the empowerment of everyone who uses software, and now they're pushing for some real application security. Here are the reasons why.
Vendors have no excuse for not fixing their software, because it's not technically difficult to do. For anyone who bothers to look, the numbers are overwhelming: 90 percent of hackers target known flaws in software. And 95 percent of those attacks, according to SEI's Cross, among other experts, exploit one of only seven types of flaws. (See "Common Vulnerabilities," opposite page.) So if you take care of the most common types of flaws in a piece of software, you stop the lion's share of those attacks. In fact, if you eliminate the most common security hole of all, the buffer overflow, you close off the single most exploited class of them.
"It frustrates me," says Cross. "It was kind of chilling when we realized half-a-dozen vulnerabilities were causing most of the problems. And it's not complex stuff either. You can teach any freshman compsci student to do it. If the public understood that, there would be an outcry."
SEI and others such as @Stake are shining a light on these startling facts (and making money doing so). That has started to have an effect. Wysopal at @Stake says he's seeing more empowered and proactive customers, and in turn, vendors are desperately seeking ways to keep those empowered customers.
"It's been a big change," he says. "We still get a lot of [customers saying], We're shipping in a week. Could you look at the app and make sure it's secure? But we're seeing more clients sooner in the development process. Security always was the thing that delayed shipment, but they've started to see the benefits
In fact, it's a little more complicated than that. Even if, starting tomorrow, no new programs contained buffer overflows (and, of course, it will take years of training and development to minimize them), there are billions of lines of legacy code out there containing 300 variations on the buffer-overflow theme. What's more, in a program with millions of lines of code, there can be thousands of instances of buffer overflows. They are needles in a binary haystack.
Fortunately, some enterprising companies have built tools that automate the process of finding the buffers and fixing the software. The class of tool is called secure scanning or application scanning, and its effect could be profound. Such tools allow CSOs to, essentially, audit software. They've already become part of the security auditing process, and there's nothing to stop them from becoming part of the application sales process too. Wysopal tells the story of a CSO who brought him a firewall for vulnerability testing and scanning. When a host of serious flaws turned up, the customer sent the product back to the vendor and said, in so many words, If you want us to buy this, fix these vulnerabilities. To preserve the sale, the vendor fixed the firewall.
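The commercial scanners are far more sophisticated, but a crude sketch, entirely hypothetical and not any vendor's product, shows why the auditing can be automated: even a textual sweep for calls that are known overflow suspects will surface candidate lines for human review.

```c
#include <stdio.h>
#include <string.h>

/* Library calls that copy or format data with no bounds check;
 * each occurrence is a classic buffer-overflow suspect. This
 * list is illustrative, not exhaustive. */
static const char *suspects[] = { "strcpy", "strcat", "gets", "sprintf" };

int main(int argc, char **argv) {
    if (argc != 2) {
        fprintf(stderr, "usage: %s <source-file.c>\n", argv[0]);
        return 1;
    }
    FILE *fp = fopen(argv[1], "r");
    if (!fp) { perror(argv[1]); return 1; }

    char line[1024];
    int lineno = 0, hits = 0;
    while (fgets(line, sizeof(line), fp)) {
        lineno++;
        for (size_t i = 0; i < sizeof(suspects) / sizeof(suspects[0]); i++) {
            if (strstr(line, suspects[i])) {   /* naive textual match */
                printf("%s:%d: suspicious call '%s'\n",
                       argv[1], lineno, suspects[i]);
                hits++;
            }
        }
    }
    fclose(fp);
    printf("%d potential overflow sites flagged for review\n", hits);
    return hits ? 2 : 0;
}
```

Real tools parse and model the code rather than pattern-match it, which is how they tell a safe call from a dangerous one, but the principle is the same: the needles in the haystack have recognizable shapes, so a machine can find them.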
Strong contracts are making software better for everyone. According to @Stake research, vendors should realize that there's an ROI in designing security into software earlier rather than later. But Wysopal believes that's not necessarily the only motivation for companies to improve their code's safety. "I think they also see the liability coming," he says. "I think they see the big companies building it into contracts."
A contract GE signed with software vendor General Magic Inc. earlier this year has security officers and experts giddy and encouraged by its language (see "Put It in Writing," this page). In essence it holds General Magic fully accountable for security flaws and dictates that the vendor pay for fixing the flaws.