


Senior Staff Writer

Social engineering: It’s time to patch the human

Apr 09, 2018

You know the phrase. "Social engineering: Because there's no patch for human stupidity." But there absolutely is, says Jayson Street.

[Image: marionette, illustrating social engineering. Credit: Thinkstock]

Jayson Street, the DEF CON Groups Global Ambassador and VP of InfoSec for SphereNY, has likely forgotten more about social engineering than some of us have learned over years of working in security.

That’s not fluff; he really does live for this stuff.

Our conversation with Street started passively, a simple question asking him about his conference plans this year.

As it turns out, Street has a training class this year at Black Hat in Las Vegas, along with April C. Wright, where the goal is teaching security teams to create human intrusion detection systems.

For most companies, the default posture is hiring external awareness teams for semi-annual or yearly engagements. Developing such programs internally, however, keeps things organic and avoids cookie-cutter training.

The people running an in-house program, and those participating in it, know your organization, and that is a useful foundation to build on. So why is this important?

Humans are an interesting collection of security risks

One of my favorite stories I’ve written here at CSO centers on Street’s engagement at a bank. It was published in 2015 (you can read the whole thing here), but the long and short of the story is this: there’s a real risk in positive thinking and assuming the best of everyone.

At the time, Street had simply walked into a bank (in a country where he was clearly not a local), gone behind the counter, and infected systems at will. He was on a job, paid to perform a physical penetration test at the bank, but no one assumed the foreigner was up to no good.

“Humans do not want to think about negative things happening to them,” Street told me when I interviewed him about the job. It goes against human nature to do so.

“If I can give them a reasonable explanation, besides the negative thing that sounds bad, they will believe the positive,” he said.

Today, Street is still doing social engagements for the security world, but he’s also on a mission to help defend organizations from people like himself.

You can teach old dogs new tricks

Awareness training seems easy, but it isn’t, because most humans don’t understand the impact they have on a security program – they just want to do the job they’re paid for. This is why staffers are viewed as the weakest link, and where we get the saying: “Social engineering: Because there’s no patch for human stupidity.”

Street has an answer for that phrase.

“There is a patch; no one wants to do it, because it’s difficult. But there is a patch, and that’s continuing education. Not one-off lessons. Not a one-and-done test that [employees] have to take once a year for compliance. But an actual earnest indoctrination in making them part of the security process, instead of a liability or a result of bad security. They should be part of the security process,” Street told Salted Hash.

In short, if we don’t patch the human – no matter how good the tech is – we’re still going to have problems.

“Everybody wants to build a Blinky Box, and build technology that intercepts and protects the human, instead of getting humans to be developed and educated enough to protect the technology. They’re not a liability, they’re an asset. [Humans] are the biggest intrusion detection system that you’re going to get.”

There’s an assumption in some circles that continuous awareness training is too difficult a battle to fight within a given organization. Depending on scope, it can be a drain on time and money. Yet equally difficult is recovering from socially based attacks that could have been prevented.

The harsh truth, though, is that information security really isn’t part of the average worker’s job, and even where there are security elements, they’re an afterthought, not the core.

“We’re not making information security part of the user’s job, and if it’s not part of their job, then it’s not their concern and they don’t care about it,” Street said.

Technology alone isn’t going to save you

“We’re looking at [awareness training] as if technology is going to solve everything, and that it’s a technological problem. It is not a technological problem that we’re facing, we’re facing a human problem,” he added.

Another push against awareness training comes from the mindset that systems need to work and be secure in spite of the human – or that if the human becomes the final arbiter of a security decision, you’ve already lost. But here again, Street says, this mindset is wrong. When people say such things, he sees it as proof that the system was built to fail.

“You should be building the system, understanding that the very first part of it is that your users, your executives, are part of your security team and properly trained to face these things. They should be your first line of defense, not your last line of defense. They should be active members of your team,” Street said.

The people need training, and it has to be ongoing

Some will argue that security isn’t part of their job, but Street counters with an analogy: road safety isn’t part of a delivery driver’s job description either, yet we expect drivers to obey traffic laws, wear seat belts, and follow speed limits. It’s the same with employees; they’re expected to follow safe computing practices.

But before that happens, employees need to be educated; they have to be taught how to do that, Street says. It has to be ingrained in them that security is a requirement, and that failure to follow security policy will negatively impact their job – something all employees care about.

Yet there is a balance required here. You have to build an awareness program that keeps employees safe without instilling a suffocating sense of fear within the company. The answer is gamification and open lines of communication.

We’ve talked about this on Salted Hash before. Train employees to report anything that looks suspicious, even if it turns out to be harmless. Train them to engage with IT or security teams without fear, offering explanation and encouragement for both good examples and bad ones. Then reward employees for participating.

But the core of this program is to ensure that staffers aren’t punished for being victims, because humans will make mistakes. Overall, the goal is to increase detection and awareness while improving response.

“I’m not going to lie and say it’s an easy thing to do, and I’m not going to say it’s an easy thing to implement. I’m saying it’s what’s necessary and what has to be done. It’s not a question of how easy it is,” Street said.

“We’ve got to start treating our humans like an asset. Like an actual part of our information security team. We’ve got to make them understand that it’s better for their long-term goal of being employed to be part of it.”