Evaluating security software is not a walk in the park. It's more like a trip to the circus: Vendors promise the world -- 100 percent accuracy! 100 percent protection! -- for a big one-time fee plus an annual subscription. The one thing you can be sure of is that the results won't measure up to the promise.

If it's your job to choose security software, you have my sympathies. To make things a little easier, I offer my seven-point plan for evaluating solutions. The products have changed radically during the two decades I've been using this framework, but it still works. That's why I swear by it.

1. Write down your goals

First, write down the tactical goal you want to meet with a new product. This might seem ridiculously easy at first, but it becomes less so as you get more specific. For example, you might say you "want antivirus software," but at the next level of detail, what you really want is an antimalware program that runs on the server and on clients, with real-time, on-demand, and scheduled scanning. Does the antimalware program also need to offer a host-based firewall, include antiphishing, or send alerts to your help desk?

2. Create a feature list

Here's a sample list of options: What clients and servers must the product protect? What types of servers (Windows, Linux, and so on) must the management and production software run on? You might also want to indicate which database (Microsoft SQL Server, Oracle, MySQL, Hadoop) and Web server (Apache, IIS) technologies you'll accept. Do you want the product or service to protect mobile computers, tablets, and smartphones? Must the protected clients already be managed by your company?
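As an aside, a feature list like this can be tracked programmatically so that every candidate product is scored the same way. Here's a minimal sketch in Python; the feature names and the sample product's answers are invented for illustration, not taken from any real evaluation:

```python
# Hypothetical requirements: True marks a deal breaker, False a nice-to-have.
# These names are made up for the example.
REQUIREMENTS = {
    "windows_and_linux_clients": True,   # deal breaker
    "real_time_scanning": True,          # deal breaker
    "scheduled_scanning": False,         # nice to have
    "help_desk_alerting": False,         # nice to have
}

def evaluate(product_features):
    """Return (passes_all_deal_breakers, fraction_of_all_features_met)."""
    missing_breakers = [f for f, must in REQUIREMENTS.items()
                        if must and f not in product_features]
    met = sum(1 for f in REQUIREMENTS if f in product_features)
    return (not missing_breakers, met / len(REQUIREMENTS))

# Example: a candidate that meets both deal breakers but lacks scheduled scanning.
ok, coverage = evaluate({"windows_and_linux_clients",
                         "real_time_scanning",
                         "help_desk_alerting"})
```

Even a throwaway script like this keeps the comparison honest: a product that misses a single deal breaker fails outright, no matter how many nice-to-haves it piles up.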
Must the clients reside on the corporate network, or can they be located across the Internet? If you put a lot of thought into your feature list, it can easily reach several dozen requirements.

Be sure to indicate which features are deal breakers and which are merely nice-to-haves. Have the selection stakeholders (ideally including both management and end-users) review and approve the feature set. Get everyone to agree on the deal breakers.

3. Do your research

I'm a big fan of software reviews. Rarely do I read one and fail to come up with an issue or feature I would have missed had I tested the software before reading the review. Often, flaws noted in review articles will have been fixed by the time you evaluate the software yourself, but knowing what was problematic and how the vendor fixed it can help inform your decision.

4. Create a test environment

I highly recommend using an isolated test environment before performing even limited testing in your production environment. The biggest decision to make is how much effort you want to put into creating a test environment that accurately mimics production.

For example, next week I'm doing a demo of a set of products that will require seven different servers in the production environment. Do you want to mimic all seven servers in the test environment, or is it acceptable to merge roles? If you can, you always want to mimic the configuration that will go into production, but you can sometimes get away with fewer machines (or VMs) without sacrificing accuracy.

Think about naming conventions in the test environment. I usually recommend that test environment names be at least slightly different from those of the production environment. Why?
Because many times the supposedly "isolated" test environment ends up having one or more connections into the production environment; if you use the same names, you could cause real operational issues.

Write down how you configured your test environment in enough detail that anyone relying on your results could re-create it. If you're using virtual machines, now is a good time to take a VM snapshot. That way, if you're testing multiple products for the same solution, you can make sure you start with a clean slate for each test. I also usually take another snapshot just after the security software is installed, so you can return to the original install state if you decide to change configuration options and test again. Create at least one test image for each platform the product must support.

5. Structure and perform rigorous tests

Once you have the test environment locked down and the product in hand, it's time to begin the evaluation. You can get some pointers by reviewing past InfoWorld Test Center reviews. Our reviews tend to break product evaluation into a handful of categories:

Installation. You'll be evaluating installation on both the client and server side. How many different install methods are supported? Can you push out installs from within the product itself, or do you have to accomplish that using another method? What services must be running and what firewall ports must be open in order to install the software remotely? If you test installs, how many failed? Be sure to install at least once for each desired platform and form factor.

Configuration. Note how hard the product was to configure during or after the install. The best products walk you through a series of wizards that help you make the best choices. How hard is it to change settings afterward? How long before the changes take effect? Does the product offer both agent and agentless installations?

Management. How do you connect to the management console?
Hopefully it's over HTTPS or another secured connection. Does the management console allow you to create different access control views for different types of administrators? What remote clients does it support?

Logging and alerting. Every computer security product should do lots of logging. What format are the logs in? Can you manipulate the log format and the data collected, and export logs to other formats? Can the logs be saved to an external database, emailed to people, and rotated or recycled on a schedule? How many different ways can alerts be sent (email, SMS, pager, network message)? Does it throttle alerts so that dozens or more aren't sent for a single related event? Does the product allow you to create custom alerts or to ignore alerts you don't care about?

Reporting. Most products shine or show their weakness in reporting. You want a product with lots of built-in, canned reports. The product should allow customization of existing reports and easy creation of new ones. Can reports be exported, and if so, to what formats? Can reports be scheduled on a periodic basis and automatically emailed to interested stakeholders?

Performance. Performance can be the hardest measurement to take in a test environment, since you won't have the same scale as production. But don't simply rely on the vendor's claims. Interview other customers of similar size to hear what they have to say about the product and its performance. Make sure you get in writing what hardware specifications will deliver peak performance. Many buyers put a clause in the vendor contract that requires a certain minimum guaranteed level of performance, on both the server and client side, or they get their money back.

Support. Most vendors claim to offer 24/7 support. But does calling the support number result in having to navigate an endless phone tree? Are the support engineers well trained? How many support calls are you allowed each year?
What does it take to get escalated to advanced tiers of support? What would it take to get the vendor back onsite to resolve an operational issue? Again, don't rely on vendor claims. Talk to other customers about the same size as your company. Bigger customers often spend more money with the vendor, which often results in "special" care that a smaller customer may not receive.

6. Get references -- and call them

Ask the vendor for a list of customer references, ideally of the same size and in the same industry as your company. Although reference customers are nearly always selected for their undying love of the vendor, I find you can coax actual experiences out of them.

My favorite trick is to wait until the end of the conversation, after they've told you nothing but great things about the product, then close by asking, "What do you wish the product did, or did better?" Often, customers will proceed to share their real concerns. I've learned about many deal-breaking problems this way.

7. Test in production

Finally, test your prospective buy in your production environment before committing to a purchase. Nothing surfaces more bugs than moving from test to production.

My last testing hint: Don't be awed by a product just because it's an appliance. An appliance is nothing but harder-to-update software.

Go forth and good luck!

This story, "7 steps to choosing security software," was originally published at InfoWorld.com. Keep up on the latest developments in network security and read more of Roger Grimes's Security Adviser blog at InfoWorld.com.