Traditionally, if such a word can apply to the rapidly morphing digital world, companies have secured their web applications by guarding the perimeter with Web firewalls. However, the ever-growing realization is that the real vulnerability lies in the Web applications themselves, which often contain easily exploited security flaws. According to consultancy Gartner, 90 percent of externally accessible applications today are Web-enabled, and two-thirds of them have exploitable vulnerabilities.

That's where Web application penetration testing tools and services come in. Diana Kelley, VP and service director at Burton Group, says these tools and services conduct automated scans of Web applications that are either in production or just prior to going live, applying threat models and misuse cases to unearth common vulnerabilities. Some of the top 10 flaws defined by the Open Web Application Security Project (OWASP), including SQL injection, cross-site scripting and improper error handling, were until quite recently alien concepts to many people, including developers. In some cases, the tools suggest parameters for fixing these types of problems.

Today, Web penetration testing is considered a key component in ensuring application security, which has become an essential part of enterprise risk management, Kelley says. Or as Joseph Feiman, analyst at Gartner, puts it, "It's coming down to a race between you and the hackers. Either you use [penetration testing] or the hackers will do it for you."

According to Gartner, enterprises considering these tools and services should expect substantial market and product consolidation. Acquisitions are likely among the major software development lifecycle (SDLC) platform providers and security vendors, Feiman says.
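To make the most common of the OWASP flaws named above concrete, here is a minimal, hypothetical sketch (the schema, names and queries are illustrative, not drawn from any vendor's tool) of a SQL injection hole and the parameterized-query fix that a scanner's report would typically point toward:

```python
import sqlite3

# Illustrative in-memory database standing in for a Web application's backend.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, role TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_vulnerable(name):
    # VULNERABLE: attacker-controlled input is spliced into the SQL string.
    # A name like "x' OR '1'='1" rewrites the WHERE clause and matches every row.
    query = "SELECT role FROM users WHERE name = '%s'" % name
    return db.execute(query).fetchall()

def find_user_safe(name):
    # SAFE: a parameterized query treats the input strictly as data, never as SQL.
    return db.execute("SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
print(find_user_vulnerable(payload))  # leaks the admin row
print(find_user_safe(payload))        # matches nothing
```

A penetration testing tool automates exactly this kind of misuse case: it submits hostile input and flags endpoints whose responses show the input was interpreted rather than treated as data.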
Already the quality-assurance divisions of two heavyweights, Hewlett-Packard and IBM, have bought into the market (acquiring SPI Dynamics and Watchfire, respectively).

Here is advice from CISOs and analysts on how to evaluate and use these tools and services.

Key Decisions

1) Who's going to use it?

Assigning responsibility for securing Web applications isn't always a straightforward task. It's a new concept for development groups and QA teams, and security groups are more accustomed to network issues than application issues. So who does the job? According to Feiman, it's awkward for security specialists to scan the application and forward the results to developers. But that's exactly what many companies do, at least until the developers accept the idea of using the tools.

Phil Heneghan, chief information security officer at USAID, for instance, has shouldered the responsibility for Web application security, believing it's ultimately his job to secure the enterprise and that it's better to have someone other than an application's creators assess its vulnerabilities. "You could end up with a rose-colored picture if the developer says, 'Don't worry; I checked it, and it's fine,'" he says.

Similarly, Andre Hiotis, technology security officer at NAV Canada, purchased IBM's Rational AppScan more than a year ago and is only now, at developers' request, putting it on their desktops for them to use themselves. If he'd given it to them at the get-go, he says, they would have been overwhelmed by all the information it produced. As it is, his team has had time to learn the tool and can now assist the developers when they use it. Security staff is also better equipped to prioritize and edit the tool's voluminous reports and will continue to provide that service.
"If the developers saw 100 things that needed to be fixed, they couldn't judge which were high, medium or low risk," Hiotis says.

2) Service or tool (or both)?

You can buy the tool and dedicate resources to building a robust testing capability, or have a vendor scan your Web applications remotely, validate the findings and produce a focused report. Most leading vendors now offer both options, except WhiteHat, which offers only a service-based solution. "Many companies wish to perform their own testing in-house, for control, management and privacy purposes, but there's a large and growing market for scanning services," Kelley says.

And some organizations are choosing to use both. The manager of information security at a large healthcare organization (who declined to be identified), for instance, temporarily halted the use of WebInspect when he found he didn't have the staff resources to manage the volumes of data it produced. "You need human intelligence to eliminate the false positives and get a complete analysis of where the vulnerabilities lie," he says. He turned to WhiteHat for help interpreting the results and working with developers to fix problems.

After a year of becoming accustomed to the service, he's now expanding the use of his original tool and planning a three-tiered approach. Developers will test coding and compilations on the fly with WebInspect, and then security staff will run a second scan with that tool. On the third pass, they'll push the application out to the Internet and have WhiteHat run a test.

3) How will you integrate?

These tools operate best when they are integrated, either natively or through an application programming interface (API), with other systems used by developers and the QA team.
These include QA and testing tools, as well as content management, project management and scheduling tools, so the scan results can be tracked and fixed like any other code defect. They should also tightly integrate with SDLC platforms such as Microsoft Visual Studio so that, ideally, developers can run a scan from their desktops, using an interface similar to their development tool's.

It's also optimal for the tool or service to export results directly to a static source code scanning tool. That's because while Web application testing tools can tell you what kind of vulnerability you have, they don't pinpoint the exact location in the code where the problem lies. "Detecting vulnerability is 50 percent of the job," Feiman says. "You have to close the loop."

Evaluation Criteria

According to Gartner, there are almost no dramatic differences between vendors' scanning technology principles; differentiation lies in vendors' ability to do the following:

- Tightly integrate with software development and production processes and platforms.
- Manage and report across multiple deployed scanners.
- Scale to different-size environments.
- Provide features and services beyond scanning, such as source code scanning; SOX, HIPAA and other compliance analyses; automatic vulnerability fixing; hosting services; training; assistance in process design; and consulting in the adoption of security into the SDLC process.

Gartner adds the following technological criteria to consider:

- Vulnerability detection and corrective analysis. Vulnerabilities should be reported, and suggestions for correction should be made, in a language that developers can understand. The scanner should identify the relevant webpage and URL where the vulnerability was detected. False positives must be low.
- Continuous and prompt update of the vulnerability database.
Because new attacks appear over time, vendors must keep a database of all known vulnerabilities and promptly update it with new vulnerabilities as part of the standard maintenance contract. A metadata repository would help in analyzing vulnerabilities and remedies.

- Reporting and analysis. The tool should aid in classifying detected vulnerabilities and rating them according to their severity. In addition, detailed explanations of vulnerabilities, suggested solutions, and links to existing patches and patterns should be available. Reports should cater to application developers and security professionals of different levels.
- Ease of use by nonsecurity experts.
- Protocol support. Most scanners use only HTML and HTTP to probe Web-enabled applications. However, support for other protocols, such as SOAP, SNA, LU 6.2, RPC and RMI, broadens usability.
- Platform support. The tool should support common Web server platforms, such as IIS and Apache, as well as hosted functionality in the form of ASP, JSP and ASP.NET.

Dos and Don'ts

DO make sure your company is ready to make a real investment not just in the tool, but also in training, staffing and developing robust processes around finding and fixing vulnerabilities. "The main weakness I see is companies that feel they can take the product, point it at their applications and get the same wealth of information they could get if they did manual or highly assisted testing," Kelley says. "You have to educate your testers on how to test, and they need time to work with and configure and use some of the add-ons provided to assist the process."

DON'T expect developers to love the tool right away. Many developers have been blissfully ignorant of Web application security and are either embarrassed, insulted or just not interested in what these tools reveal.
"One of the biggest pain points was getting people to accept the seriousness involved with this," says the security manager at the healthcare organization. It took his organization a year and a half to get developers to adopt the tools.

DO realize the limitations of these tools. Some people want to believe that if the tools don't find a problem, they're home free, says Gary McGraw, CTO at security consultancy Cigital. "But the only thing it can tell you is you don't have these [specific] problems," he says. "If they had a list of all possible security problems ever in the history and future of the planet, that would be a great thing, but that's impossible." That's why McGraw famously called these tools "badness-ometers": they can tell you when your code is bad, but they can't tell you that your code is lock-tight secure. Not that the tools don't have value, McGraw says; they do shorten the testing cycle considerably, but humans are often needed to validate that a problem exists.

DON'T think one tool will find every problem. At the healthcare organization, "We've found vulnerabilities with SPI that WhiteHat didn't, and vice versa," the manager says.

DO realize that security is not a one-time event. Because Web applications are ever-changing, they must be tested continuously to ensure no new vulnerabilities have been introduced, Burton Group's Kelley says. Even OWASP continuously revises its top 10 guidelines. Heneghan scans his organization's Web applications once a month. The healthcare organization's security manager warns that it can take one or two days to crawl through all his firm's applications and produce an analysis.

This article appeared in CSO Magazine as "Patching the Holes in Web Application Security". Mary Brandel is a freelance writer. Send feedback to email@example.com.