If we have learned nothing else from 60-plus years of software development, it's that secure, high-quality code does not happen by accident. National governments have long understood that quality and security must be specified, designed, and implemented end to end throughout the software development lifecycle (SDLC).
Governments go to great lengths to ensure that only provably secure software is used to protect national secrets. Unfortunately, this philosophy never spilled over into the commercial world, and what we're left with today is a precarious landscape filled with brittle applications of questionable reliability.
This article addresses these issues and offers solutions to help you avoid repeating the mistakes that lead to ever more insecure and unreliable software and systems. We'll look at what people expect when they acquire software and what they actually get, the reasons for the gap, what you as a manager can do to improve your own organization's software development practices, and how to measure your progress.
The authors' book Secure and Resilient Software Development is available on Amazon.com.
What People Expect In Software
Software is useful for what it does. People purchase software because it fulfills their need to perform some function. These functions (or features) can be as simple as allowing a user to type a letter or as complex as calculating the fuel consumption for a rocket trip to the moon. Functions and features are the reasons people purchase or pay for the development of software and it's in these terms that people think about software.
People also (erroneously) assume that the software they purchase is written with some degree of quality. When you purchase a software package you just assume it will operate as advertised and you never really think about how well the program does its job—just as long as it works!
The sad truth is that most software is flawed straight out of the box, and these flaws can threaten the security and safety of the very systems on which they run. They are present not just in the traditional computers we use every day, but also in critical devices such as cell phones, cars, and medical devices like pacemakers, and in national infrastructures like Banking and Finance, Energy, and Telecommunications.
Programmers are taught to write code—they are not taught how to write good code.
Organizations incent programmers to write more code, leaving good code out of the equation, while cheap code and rapidly developed code dominate the landscape.
Web applications in particular are inherently prone to certain types of vulnerabilities, such as Cross-Site Scripting (XSS), unless the developer makes a conscious effort to prevent them. If the developer fails to include appropriate output encoding routines and input validation routines, the application will almost certainly be vulnerable to XSS by default.
To the developer, the software may in fact work exactly as intended, but the developer never tested how it behaves when it's being fed malicious input or is under direct attack.
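To make the two defenses concrete, here is a minimal sketch in Python using only the standard library. The field name and validation rule are illustrative assumptions, not prescriptions from this article; real applications would use their framework's templating and validation facilities.

```python
import html
import re

# Whitelist pattern for a hypothetical "username" field: letters, digits,
# and underscores, 3-20 characters. The field name and rule are illustrative.
USERNAME_RE = re.compile(r"^[A-Za-z0-9_]{3,20}$")

def validate_username(value: str) -> bool:
    """Input validation: reject anything outside the expected character set."""
    return bool(USERNAME_RE.fullmatch(value))

def render_greeting(name: str) -> str:
    """Output encoding: HTML-escape untrusted data before it reaches the page."""
    return "<p>Hello, " + html.escape(name, quote=True) + "!</p>"

# An attacker-supplied value is neutralized rather than executed by the browser.
payload = "<script>alert('xss')</script>"
print(validate_username(payload))   # False -- fails validation outright
print(render_greeting(payload))     # escaped: the browser renders text, not script
```

Note that the two checks are complementary: validation rejects malformed input at the boundary, while encoding ensures that anything which does reach an HTML page cannot be interpreted as markup.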
Writing software, like driving a car, is a habit. Until someone teaches us how to drive safely, we don't personally know the dangers of driving and the skills needed to prevent or avoid accidents. Cars often have safety mechanisms built into them, but as drivers, we have to consciously use our own safe driving skills. Experience teaches us that we are better off instilling safe driving skills before we let people loose on the roads since their first accident may be their last.
How Bad is the Problem, Really?
In 2008 there was significant increase in the count of identified vulnerabilities in commercial software—a 13.5 percent increase compared to 2007. The overall severity of vulnerabilities also increased, with high and critical severity vulnerabilities up 15.3 percent. Medium severity vulnerabilities were up 67.5 percent and nearly 92 percent of 2008 vulnerabilities reported can be exploited remotely.
Of all the vulnerabilities disclosed in 2008, only 47 percent could be corrected through vendor patches. At the end of 2008, 74 percent of all web application vulnerabilities disclosed during the year had no available patch to fix them.
Web applications are the Achilles' heel of corporate IT security, too. Every year, organizations across the globe spend millions of dollars on securing their software infrastructure, often a significant portion of an entire year's IT budget. Software security spending tends to focus on detecting existing vulnerabilities in the software organizations already own and finding ways to reduce the associated risks of using it. Rewriting software, whether to fix a problem or to fundamentally change what the software does, also results in tremendous corporate expenditures year after year. Bad code also hurts productivity whenever an application or system goes down, leading to direct or indirect losses to the business. Other web application flaws lead to brand damage, reputation damage, information theft, and denial-of-service problems. [Editor's note: For more, see 'Broken windows revisited: Why insecure software hurts the global economy' by David Rice.]
Don't Bolt Security On—Build It In!
Flaws appear in software because, somewhere along the specification, development, and testing conveyor belt, requirements mandating secure software fell on the floor and were neglected. Software is only secure when it's designed for security; most attempts at bolting security on after the fact yield even more problems than considering it from the beginning of a development effort.
To actually move the needle on software security, security must be built into the development life cycle itself, beginning with a management-sponsored Secure Coding Initiative to fundamentally improve how an organization thinks about and develops software for public use, for in-house use, and for sale to others.
Any secure coding initiative must deal with all stages of a program's lifecycle. Secure programs are secure by design, during development, and by default. As with any significant change initiative, education plays a key role. Education is the cornerstone of any effective intervention to improve software quality and should be treated as non-optional. Before relying on the people involved in software development to produce secure code, it's essential to train analysts, designers, and developers in secure design and coding so they better understand how incomplete analysis, poor design decisions, and poor coding practices contribute to application and system vulnerabilities.
Once educated, SDLC participants begin behaving differently and start to serve as an internal checks-and-balances mechanism that catches problems earlier in the SDLC and leads to lower costs of development and higher quality systems.
Education needs to prepare personnel in the basics of web application flaws, familiarize them with the hacking tools of the trade used to break application software, and prepare them to carry out their craft with new skills and techniques.
From the earliest days of software development, studies have shown that the cost of remediating vulnerabilities or flaws in design are far lower when they're caught and fixed during the early requirements/design phases than they are after launching the software into production. Therefore, the earlier you integrate security processes with the development life cycle, the cheaper software development becomes in the long haul.
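A back-of-the-envelope calculation illustrates the point. The phase cost multipliers below are illustrative assumptions for demonstration only, not figures from this article, though they echo the widely cited rule of thumb that fix costs grow by roughly an order of magnitude per phase:

```python
# Illustrative relative cost multipliers by the phase in which a defect is
# found. These specific numbers are assumptions for demonstration only.
COST_MULTIPLIER = {
    "requirements": 1,
    "design": 5,
    "coding": 10,
    "testing": 20,
    "production": 100,
}

def remediation_cost(defects_by_phase, unit_cost=100):
    """Total cost of fixing defects, given the phase each was caught in."""
    return sum(count * COST_MULTIPLIER[phase] * unit_cost
               for phase, count in defects_by_phase.items())

# The same 50 defects, caught late versus caught early:
late = {"testing": 30, "production": 20}
early = {"requirements": 25, "design": 15, "coding": 10}
print(remediation_cost(late))   # 260000
print(remediation_cost(early))  # 20000
```

Under these assumed multipliers, shifting discovery of the same defects into the requirements, design, and coding phases cuts total remediation cost by more than an order of magnitude.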
These security processes are often just "common sense" improvements, and any organization can and should adopt them into its existing environment. There is no one right way to implement them; each organization will have to fine-tune and customize them for its specific development and operating environments. These process improvements also add more accountability and structure to the system. There are a number of well-accepted secure software development methodologies, including Microsoft's Secure Development Lifecycle (SDL), Cigital's Touchpoints, and the one we'll examine here, called CLASP.
Comprehensive, Lightweight Application Security Process (CLASP)
The Open Web Application Security Project (OWASP) is an open community dedicated to enabling organizations to conceive, develop, acquire, operate, and maintain applications that can be trusted. One of their more prominent projects is called the Comprehensive, Lightweight Application Security Process, or CLASP.
CLASP is a pre-defined set of documented processes and tools that can be integrated into any software development process. It is designed to be both easy to adopt and effective.
CLASP uses a prescriptive approach, documenting activities that organizations should be doing. CLASP is rich with an extensive collection of freely available and open source security resources that make implementing those activities practical and achievable.
Think of CLASP as a resource library to avoid re-inventing the wheel when you come across the need for new processes or new ideas for secure software development.
CLASP provides extensive detailed information on:
- Concepts behind CLASP to get started
- Seven key Best Practices that define CLASP
- High-level Security Services that serve as a foundation
- Core Security Principles for software development
- Abstract Roles that are typically involved in software development
- Activities to augment the development process to build more secure software
- CLASP Process Engineering and Roadmaps
- Coding Guidelines to help developers and auditors when reviewing code
- A lexicon of Vulnerabilities that occur in source code
- A searchable Vulnerability Checklist in Excel format for the CLASP Vulnerability lexicon
You can obtain a free copy of CLASP in book form or from the OWASP CLASP Wiki.
Implementing software application security best practices requires a reliable process to guide a development team in creating and deploying an application that is as resistant as possible to security attacks. Within a software development project, the CLASP Best Practices are the basis of all security-related software development activities. Here are the seven CLASP best practices:
- Best Practice 1: Institute awareness programs
- Best Practice 2: Perform application assessments
- Best Practice 3: Capture security requirements
- Best Practice 4: Implement secure development practices
- Best Practice 5: Build vulnerability remediation procedures
- Best Practice 6: Define and monitor metrics
- Best Practice 7: Publish operational security guidelines
Essential security concepts and techniques may be foreign to your organization's software developers and others involved in application development and deployment. So it is imperative at the outset to educate everyone involved. Awareness programs can be readily implemented, using external expert resources, as appropriate, and deliver a high return by helping to ensure that other activities promoting secure software will be implemented effectively. Here are some tips to help with security awareness:
Provide security training to all team members
Before team members can reasonably be held accountable for security issues, you must ensure they have had adequate exposure to those issues. This is best done with a training program. Everyone on the team should receive training that introduces basic security concepts and the secure development process used within the organization.
Promote awareness of the local security setting
Everyone on a development project should be familiar with the security requirements of the system, including the basic threat model. When such documents are produced, they should be distributed and presented to team members, and you should solicit and encourage feedback from all parties on the team.
Institute accountability for security issues
Traditional accountability within development organizations is based primarily on schedule and quality. Security should be treated in much the same way as any other quality consideration.
Appoint a project security officer
An excellent way to increase security awareness throughout the development lifecycle is to designate a team member as the project security officer, particularly someone who is enthusiastic about security.
Institute rewards for handling of security issues
Accountability is necessary for raising security awareness, but another highly effective approach is to institute reward programs for a job well done with regard to security. For example, consider rewarding someone for following security guidelines consistently over a period of time, particularly if the result is that no incidents are associated with that person's work.
Testing and assessment functions are typically owned by a test analyst or by the QA organization but can span the entire SDLC. CLASP offers detailed guidance for each of the following areas of application assessments: