You hire elite hackers to break your defenses and expose vulnerabilities -- not to be distracted by the pursuit of obscure flaws.

The most fun I’ve had as a security guy was getting paid to penetration-test companies and websites. It’s like getting paid to be a gamer. You earn a fat paycheck to hang out with friends and hack away without fear of being arrested.

Most large companies today have multiple teams of professional pen testers, often both internal and external, trying to hack their systems. The most elite of these are called “red teams,” a military phrase used to describe any friendly group directed to think and plan like an independent enemy. The idea is that independent thinkers might find holes in your defenses that people inside the organization might miss.

Through the years I’ve been part of some great red teams — and heard countless stories of how red teams not only broke in, but did so discreetly, without setting off any alarms. Personally, every red team I’ve been a member of over the last 20 years has taken no more than three hours to break in without social engineering. If social engineering was allowed, it usually took less than an hour.

Each person on a red team usually has his or her favorite go-to techniques. Some members attack at the network layer, others attack only websites or databases, and others use particular tools or languages. A good red team can sniff out hidden weaknesses — and share them with the defenders to prepare against real-world attackers. But a good red team should mimic what adversaries are likely to do. In many cases, red teams have drifted away from trying what real-life adversaries would likely attempt, focusing instead on techniques you’d be unlikely to see in the wild. Those techniques may end up taking over the whole system or network, but they’re often unrealistic and overly “sexy.”

Unfortunately, management usually hears that the crown jewels of the system have been compromised and starts mandating that defenders close holes — without considering whether those holes would be likely to be exploited in the real world.

This happens in military exercises as well. One of my favorite red-team analyses is the paper “Red Teaming: Past and Present” by U.S. Army Major David F. Longbine, which examines examples of both good and bad red-teaming. He sums it all up in this telling paragraph:

These errors consist of over-reliance on technology … failing to adapt to battlefield developments, misreading the adversary … Most of these “errors” are essentially failures or misguided attempts to apply the red teaming core concepts of incorporating alternative analysis and alternative perspective into decision-making. A fundamental precondition of the red team core concepts is that they must exhibit some degree of realism and accuracy.

I’m seeing more and more red teams where success is measured solely by the ability to break in, without any assessment of whether the hack mimics techniques real attackers would be likely to employ. Sure, any hole that allows access to critical assets should be addressed. But resources should be concentrated on closing likely vulnerabilities, not obscure ones. Security is all about assessing risk — and good defense always focuses on the most imminent threats.

The greatest value a red team can deliver is to break into your environment while mimicking real-world attackers. That shows you the remaining gaps and weaknesses in your defenses.
There’s particular value when red teams break in through holes you thought were well defended, especially if the intrusion succeeds without the defending team receiving any alerts or notifications.

Admittedly, I’m the pot calling the kettle black. When I think back to my red-teaming days, I didn’t care how I broke in, only that I broke in. Heck, the more obscure the method I used, the more I liked it, and the more threatened the customer seemed.

There’s a common saying in the computer world: “Security by obscurity is no security!” I think that saying gets overapplied here; attacking by obscure methods produces less value than it could. The best red teams look at the field of battle and mimic the most likely adversaries and techniques. Managers and defenders should require it.