Security blunders 'dumber than dog snot'

At the 2010 USENIX Security Symposium, a vulnerability assessor reveals some of the more egregious examples of stupidity on the part of professionals who should know better.

WASHINGTON, D.C. -- Voltaire is famous for noting that the main problem with common sense is that it's not all that common. Proof of that abounds in the security industry, where people who should know better do idiotic things daily, according to Roger G. Johnston, a member of the vulnerability assessment team at Argonne National Laboratory.

At the 2010 USENIX Security Symposium today, he presented some surprising (or not) examples of what he has seen as a vulnerability assessor: security devices, systems and programs with little or no security -- or security thought -- built in. There are well-designed security products so foolishly configured by the people who buy them that they create more vulnerability than existed before the devices were installed.

Then there are the badly thought-out security rules and security programs steeped in security theater, lacking muscle and teeth. Some policies merely make employees disgruntled because they are treated like enemies and children. In turn, the company risks turning them into malicious insiders.


Johnston described three common problems: people forgetting to lock the door, people too stupid to be helped and -- worst of all -- intelligent people who don't exploit their abilities for the betterment of security. Enter what he calls the dog snot model of security -- where intelligence and common sense exist but are not used.

He came up with the term by watching his dogs, who often crash into the picture window facing the yard when they want to chase a squirrel. Hence, the window is covered in dog snot. Call it the classic banging-the-head-against-the-wall approach.

"In the interest of following the American way, we will do nothing until there's an incident. Then we will massively overreact," he said, running through several examples of this in both the physical and cyber security cultures:

  • Security cameras that mostly fail to prevent crime because their poor resolution causes security personnel to miss things.
  • Electronic voting machines easily tampered with on the voter's end. Voters can easily remove the panel with candidate names and can then tamper with the electronics. Just swap four wires and you can switch the votes for two candidates, Johnston said. You can also use a radio frequency device to turn the cheating on and off from a half-mile away. It's also stupidly easy to pick the locks on the voting machines. Johnston showed a video of a colleague doing just that.
  • Overlooked insider threats that are usually sparked by bad HR policies. "There are things you can do about disgruntlement, but instead companies feed the problem," Johnston said. "We've seen phony or nonexistent grievance and compliance resolution procedures, no constraints on bully bosses, failure to manage expectations, failure to watch for sudden behavioral changes in employees -- it all contributes to the problem."
  • Failing to test whether employees and contractors can be bribed into doing bad things.
  • Assuming that low-level employees are harmless and never asking what they are up to.

"You should try to bribe employees and contractors," he said. "If they're honest and refuse the bribe, let them keep the money and hail them publicly for their honesty and integrity."


Other blunders come from the notion that defense-in-depth is always a good thing. In fact, he warned, it can be a recipe for disaster when the various layers are badly stacked and monitored. "When we ask someone what their strategy is and they say layered, we know they're in trouble," he said. "Multiple layers of bad security don't add up to good security. Too much complexity causes problems."

The bad seeds of layered security include the complacency that comes from simply believing the security layers are ironclad. One reason they often are not is that they are configured improperly.

Johnston has also found that engineers often don't get security. They focus on the customer rather than the bad guy, emphasizing user friendliness and adding new features that become new attack vectors.

There are design blunders: failing to close backdoors and diagnostics used during development, not setting the microprocessor security bit, and not masking passwords and critical data. Often, security devices lack tamper-resistant or tamper-indicating enclosures or seals, as well as sensors to detect tampering. Thus, it's easy for someone to physically tamper with the device.
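One of the simpler blunders on that list to avoid is leaving credentials unmasked in diagnostic output. As a minimal sketch (in Python, with hypothetical field names -- Johnston did not describe a specific implementation), sensitive values can be redacted before a record is ever written to a log:

```python
# Keys whose values should never appear in logs or diagnostics.
# These field names are illustrative, not from any particular product.
SENSITIVE_KEYS = {"password", "passwd", "secret", "api_key", "token"}

def mask_sensitive(record: dict) -> dict:
    """Return a copy of a log record with sensitive values redacted."""
    masked = {}
    for key, value in record.items():
        if key.lower() in SENSITIVE_KEYS:
            masked[key] = "****"  # redact before the value reaches the log
        else:
            masked[key] = value
    return masked

print(mask_sensitive({"user": "alice", "password": "hunter2"}))
```

The point is architectural rather than syntactic: redaction has to happen at the logging boundary, not be left to whoever reads the logs later.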

Why so much stupidity? Johnston had a few observations:

  • Committees and bureaucrats are often in charge and don't know what they're doing.
  • Security is often approached like taking out the garbage -- the slam-dunk mentality.
  • There's too much blind faith in procedures and authorities.
  • Security theater is always easier.

Johnston offered a number of options to reduce blunders. For starters, companies should employ hacker-oriented people who know how to think like the bad guys, which will lead to more problems being uncovered. Next, it's important to remember that regular employees are part of your security, not the enemy.

Finally, technology is never a cure-all.
