Do you remember when "on-call" meant carrying a pager? Batch routines signaled system alerts and other problems through an automated page.
Around that time, I recall a conversation with a client convinced they would be better off with a solution that provided (near) "real-time" alerting. My first thought was, "How quickly can you respond?" Instead, I asked, “What does the person with the pager do?”
The client executive explained that the on-call person would hear the page, wake from their slumber, dial in remotely (yes, dial in) and address the concern. When I asked the person with the pager what they actually did, the truth came out: they shut the pager off before going to sleep and checked it when they woke up in the morning.
Turns out the system generated so many false positives that the signal was lost in the noise. The team chose to simply deal with the alerts in the morning, if they dealt with them at all.
Over a decade later, has anything changed?
Neiman Marcus reportedly experienced 60,000 alerts during its latest breach. Last week the big news was that Target was flooded with alerts as the breach was happening. In both cases, the alerts failed to generate proper action. In hindsight, some conclude both teams missed an opportunity, and the discovery of the alerts leads some to suggest the breaches could have been prevented.
Is the analysis fair? Is the focus on the presence of alerts accurate?
Relying on a tool (or tools) for alerts is useless if it generates too much noise and not enough signal. Too many alerts without the proper context fail to guide the right response. Without more details on Target, it’s possible the alerts were too frequent and too noisy to make a difference.
What happened to Target is either happening to you now or likely will, unless we make some changes.
The challenge: how do I share information?
Brian Foster, CTO of Damballa, shared some insights about Target last week. He agreed to share additional insights on how to think about the challenge differently and frame the conversation for change.
Foster explained, "One common issue IT teams have is there is no easy way to connect data from all products, even from the same vendors. The alerts still need to be correlated before there is actionable intelligence to consume. This has created a gap between the customers’ prevention tools and the human’s ability to respond."
Notably, this challenge gave rise to the SIEM market. The idea is to take alerts and other information and combine them in one database. This allows correlation and other analysis. Foster points out that the reliance on SIEM creates two interesting challenges:
- Prevention is based on detection bias: As Foster explains, "Prevention products have a detection bias because they all share the same threat intelligence, but they cannot log what they cannot see."
- Too much data: When everything is a priority, nothing is. Similarly, when everything is an alert, nothing rises above. Foster likens it to "finding the needle in the stack of needles."
As a result, the reliance on SIEM often means more manual effort (and more people).
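The correlation step that SIEM products perform can be sketched as grouping raw alerts by host and clustering them into time windows. This is a toy illustration of the idea, not any vendor's implementation; the field names (`host`, `ts`, `signature`) and the window size are hypothetical.

```python
from collections import defaultdict

def correlate(alerts, window_secs=300):
    """Toy SIEM-style correlation: group alerts by host, then
    cluster each host's alerts into time windows. Each alert is a
    dict like {"host": str, "ts": int, "signature": str}.
    Returns a list of (host, [alerts]) incident clusters."""
    by_host = defaultdict(list)
    for alert in alerts:
        by_host[alert["host"]].append(alert)

    incidents = []
    for host, host_alerts in by_host.items():
        host_alerts.sort(key=lambda a: a["ts"])
        cluster = [host_alerts[0]]
        for alert in host_alerts[1:]:
            # Alerts close together in time join the current cluster;
            # a gap starts a new incident.
            if alert["ts"] - cluster[-1]["ts"] <= window_secs:
                cluster.append(alert)
            else:
                incidents.append((host, cluster))
                cluster = [alert]
        incidents.append((host, cluster))
    return incidents
```

Even this crude grouping shrinks the pile, but it also shows the limits Foster describes: the clusters still need a human (or further analysis) to decide which ones matter.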
The solution: provide context and evidence to guide action
Foster suggests framing the discussion in terms of a criminal court case. One piece of evidence is not enough to convict anyone of a crime. You need evidence from lots of sources to convict beyond a reasonable doubt.
"One alert doesn’t mean anything by itself nor does a single alert reduce the need for human intervention. You need a tool that automatically corroborates evidence without the need for people to do it, since people can’t be scaled to keep up with the volume of today’s advanced threats."
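Foster's court-case analogy can be sketched as an escalation rule: raise an incident only when evidence comes from enough independent detection methods, not from a lone alert. The source names and the threshold below are illustrative assumptions, not Damballa's actual logic.

```python
def should_escalate(evidence, min_sources=3):
    """Corroboration sketch: escalate only when evidence comes from
    at least `min_sources` *independent* detection methods.
    `evidence` is a list of (source, observation) tuples; repeated
    alerts from the same source do not count as new corroboration."""
    sources = {source for source, _ in evidence}
    return len(sources) >= min_sources

# Hypothetical case file for one host:
case = [
    ("dns", "lookup of a known C2 domain"),
    ("netflow", "periodic beaconing to the same IP"),
    ("dns", "second suspicious lookup"),  # same source, adds no corroboration
    ("payload", "dropper hash match"),
]
```

Here `should_escalate(case)` is true because three independent methods agree, while the first three entries alone (two sources) would not convict. The point of the rule is that the machine, not a person, does the corroborating.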
The question we need to ask: how do we apply intelligence?
Foster explained the solution includes using machine learning and people trained to think differently than security professionals.
"We have nine data scientists on the team. They have access to unfiltered network data. We don’t filter anything out. You never know what data a scientist can use to solve real information security problems."
As a result, Foster explained that Damballa has developed eight different methods to provide specific information, context, and corroborating evidence that guides action.
"When it comes to security, you’ve got to apply intelligence to alerts. You’ve got a specific security problem. What action can you take from a noisy alert that provides no context?"
In other words, focus on providing more signal and less noise. Ultimately, it also means fewer people tied up on alerts and more people able to focus on advancing the security program.
The road ahead: two steps we need to take, collectively
Suggesting we need more people is a cop-out. Fixing the perceived shortage requires an investment in solutions that increase our capabilities and teach others (read more here).
Two steps get the process started:
- Vendors (and solution providers): focus on making it easier for people to do their jobs. Adopt and comply with industry standards to build solutions that ease the process of sharing information.
- Enterprise leaders: structure the “client requirement” that the solution provides signal over noise. Better yet, simply ask, “how is this solution giving my people less work while increasing my security posture?”
As the Coke breach revealed, process and tools only matter if they are used. Each solution, each action that generates more alerts likely generates more information and more work. It also creates more opportunity for mistakes.
There is no silver bullet.
Maybe we can prevent what happened to Target; but that likely means changing the way we use detection. The goal needs to focus on better ways for humans to share and act on information.