Who should be on an insider risk team?

Catching an insider taking confidential information doesn't happen by chance.


Left to chance, unless you happen to bump into someone leaving the building with a box full of documents, you might never catch an insider red-handed. That is where an insider risk team comes in: a group of employees from various departments who create the policies and processes that flag when confidential material leaves the building.

“Insider risk is a real cybersecurity challenge. When a security professional or executive gets that call that there’s suspicious activity — and it looks like it’s someone on the inside who turned rogue — the organization needs to have the right policies and playbooks, technologies, and right team ready to go,” said Rinki Sethi, senior director of information security at Palo Alto Networks.

Steve Mancini, senior director of information security at Cylance, takes the disgruntled employee's point of view, indicating that they need to be provided outlets and recourse for their grievances before miscreant actions occur. “Fellow employees and managers need to be trained to spot the signs of disgruntled employees and given channels to report concerns in a manner that does not judge the potentially disgruntled employee, but instead put the right people in their path to help them resolve whatever grievance they have before it escalates.”

But not all companies are that advanced in spotting what an angry employee might do in retaliation. Policies would cover the obvious situations, such as an employee making an inordinate number of photocopies or an alert that fires when a USB drive is plugged into a computer, but it gets tricky dealing with scenarios that are not out in the open for all to see. It is the insider risk team that must work through every hypothetical scenario to stay ahead of the disgruntled employee bent on fulfilling a vendetta.

“Insider risk tends to happen less frequently than external threats, but the negative impact can be tenfold. Having the right insider risk team with risk management expertise is a must to assess the situation, pinpoint the culprit and execute your counterattack plan,” Sethi said.

Who should be on this team?

Many security experts made it clear that watching for signs of an insider threat is everyone’s responsibility. But in terms of the team’s makeup, it should be representative of the entire company.

The team should include the technical IT and security teams, as well as non-technical stakeholders such as members of the C-suite, legal counsel and human resources, said Veriato CSO David Green.

“The latter three will likely be unfamiliar with the fact that traditional security solutions don’t always work to prevent insider threats because, first, they are largely focused on perimeter security, and, second, they aren’t intended to identify or prevent problems stemming from insiders who are authorized to access sensitive data or systems,” he said. “But these departments should come together to discuss the various challenges associated with insider threats and establish policies and procedures to prevent and detect them while protecting employee privacy.”

Here’s what each department should bring to the table:

C-Suite: A member of the executive team should be present because you’ll need buy-in from the top to ensure the other departments represented on the insider risk team have the authority to establish a risk-based monitoring program and sign off on an Acceptable Use Policy (if one isn’t already established). Executives can also set the boundaries of what is and is not acceptable behavior, tie the plan to the company’s strategic objectives and help outline a security policy.

Legal: The legal team should be present to ensure all employee/user monitoring activities meet any local, state and federal laws. They should also help define what is permissible to monitor, such as email and instant messages, the web sites employees visit, online apps they use or any content they download or print. Recording employees as they log into their bank accounts online could be a legal risk for the company if something happened to the employee’s account. Also, since IT might not be permitted to review the activity of higher-level employees, legal will work with the security team to determine which roles within the organization can review which sets of activity.

Human Resources: HR can help create the processes necessary to ensure there is a warranted and documented need for any monitoring, and that the security team is made aware of these issues without breaking any privacy laws. For example, they might be aware of an employee leaving (a potential risk) or an employee’s personal or financial issues that might make them high-risk and worth investigating. The HR team (or any other department) would communicate this threat through the pre-determined risk level of the position, not the name of the individual employee.

IT / Security: IT, or whoever will be involved in both evaluating possible technology solutions and implementing the selected solution, will provide the non-technical team members with context around which users have access to what sensitive data, as well as what is possible when it comes to monitoring activity; all of this will be invaluable when putting the team’s planning and preparation into practice. Technologies such as user behavior analytics, for example, look at patterns of behavior and do not require inspecting the content of an employee’s activity to detect insider threats. User activity monitoring software lets you capture and review the specific actions of an employee, including their emails or texts, if needed. There are versions of both that let you configure the types of activity monitored to align with your organization’s goals, with privacy protections woven throughout to address HR concerns.
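As a rough illustration of the baseline-and-deviation idea behind user behavior analytics, the sketch below flags users whose daily activity count strays far from their historical norm. It is a minimal, hypothetical example: the function name, the z-score approach and the threshold are assumptions for illustration, not how any particular UBA product works.

```python
from statistics import mean, stdev

def flag_deviations(baseline, today, threshold=3.0):
    """Flag users whose activity today deviates sharply from their baseline.

    baseline: dict mapping user -> list of historical daily event counts
    today: dict mapping user -> today's event count
    Returns a list of (user, z_score) pairs exceeding the threshold.
    """
    flagged = []
    for user, history in baseline.items():
        if len(history) < 2:
            continue  # not enough history to establish a baseline
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            sigma = 1.0  # avoid division by zero for perfectly regular users
        z = (today.get(user, 0) - mu) / sigma
        if z > threshold:
            flagged.append((user, round(z, 1)))
    return flagged
```

Note that the sketch scores only event volumes, never the content of anyone's activity, which is the privacy property the paragraph above describes.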

“The risk of malicious activity from the seemingly trusted insider is still an ongoing reality for organizations worldwide. IT can’t implement a full insider risk program on its own – or keep one working properly,” Green said.

Each organization needs to establish an “insider risk” team that specifically addresses the associated challenges – from determining who has (or should have) access to confidential corporate and client data and what each positional “risk level” should be, to what constitutes inappropriate user behavior, how activity will be monitored and how the organization will communicate which behavior is acceptable and the ramifications for breaking “the rules,” he added.

Scottie Cole, network and security administrator at AppRiver, said insider risk teams are vital to an organization’s security. However, insider risk teams don’t necessarily have to be dedicated, full-time positions; rather, they should draw from a broad spectrum of roles to bring the most holistic security perspective.

For an insider risk team to be successful, it takes collaboration across the company, said Shawn Burke, global CSO at Sungard Availability Services: procurement for vendor due diligence, human resources for screening, internal communication and consequence protocols, and the risk committee for overall response strategy. General counsel and the chief compliance officer are also key stakeholders, as insider monitoring must comply with a spate of new state and national privacy legislation.

Mancini said an effective insider risk team will design controls, take action, provide governance, and investigate. “Governance and control are critical to an insider risk team: who will watch the watchers? Audit capabilities must be woven into the process.”

Kennet Westby, president and co-founder of Coalfire Systems, said the insider risk team should also include representatives from any other users or groups with elevated access and privilege, including vendor management and third-party contracting teams. Others believe the team should include the CISO, CIO, and risk and compliance officers.

Steven Grossman, vice president of strategy and enablement at Bay Dynamics, noted that everyone in an organization needs to play a role. “However, the key core players must combine multiple talents that understand user behavior and the overall landscape of cyber risk. That includes the type and value of applications, the hosts associated with those applications, and the vulnerability posture of those hosts and applications. Application security owners who have a deep business understanding of the value and security of the applications under their governance play an essential role on the team. They know whether a seemingly unusual behavior was indeed business justified,” he said.

First, the team should put together policies that allow appropriate access based on business needs, and evaluate tools to safeguard against insider abuse. This entails providing the right level of visibility into insider access and possible deviations.

Not everyone agrees on who needs to be on this team though. It might just be semantics, but some experts believe the insider risk team’s main responsibility is to create policy and then the various teams are to follow them. Other experts see the team as a group that follows up on a minute-by-minute basis to find out where any abnormalities take them.

Hamesh Chawla, vice president of engineering at Zephyr, said insider risk teams should review reports and logs daily to understand what deviations are taking place, and address those deviations immediately with the group to implement a course of action. “These specialized teams should formulate a crisis plan to mitigate the damage should an insider attack occur and have concrete, appropriate actions against those abuses.”

Javvad Malik, security advocate at AlienVault, breaks the duties down into layers:

Line managers: A first line of defense, they know the employees best, are aware of what tasks they need to undertake, the information they need to access and their overall morale and well-being.

Asset owners: An accurate asset inventory needs to be compiled, the data classified, and owners identified. These asset owners should know what services and users require access to the assets, when downtime is scheduled, and any planned changes. If suspicious activity is detected, the asset owner should be able to validate whether it was malicious.

Legal / HR: Whenever looking into potential insider fraud, it is essential to have legal and HR representation to ensure that no individual rights are being breached and that any investigations are undertaken in a legal manner.

Forensics: Similarly, forensics investigators may be needed in order to undertake detailed investigation. This could include taking forensic images of devices for legal purposes and to investigate malpractice. 

Analysts / SOC: The security operations center (SOC) is the heart of all threat detection within an organization. Working with the involved parties, assets can be identified and appropriate alerts configured. Similarly, behavioral analysis should be a core component of an SOC so they can detect any deviations from normal activity and behavior. They will usually kick off incident response processes by engaging the other responsible parties. 
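Malik’s asset-owner layer, for example, amounts to keeping a registry that maps each asset to its owner and its authorized users, so the SOC can route a suspicious access to the right validator. Below is a minimal sketch of that idea; the asset names, owner names and `review_access` helper are all hypothetical.

```python
# Hypothetical asset inventory: each entry records the owner who can
# validate suspicious activity, plus the users/services pre-authorized
# to access the asset, as described in the layers above.
ASSETS = {
    "payroll-db": {"owner": "hr-ops", "authorized": {"alice", "payroll-svc"}},
    "source-repo": {"owner": "eng-lead", "authorized": {"bob", "ci-svc"}},
}

def review_access(asset, user):
    """Return (owner to consult, whether the access was pre-authorized)."""
    entry = ASSETS.get(asset)
    if entry is None:
        # Unknown asset: the inventory itself is incomplete and needs work.
        return None, False
    return entry["owner"], user in entry["authorized"]
```

An unauthorized hit (for example, an unfamiliar account touching the payroll database) would then be escalated to the listed owner for a business-justification check.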

A successful insider threat program needs access to data, which should include endpoint, proxy, search history, phone records, and physical access logs if available, said Chris Camacho, chief strategy officer at Flashpoint. “Being able to understand and ingest data from multiple sources is a critical part of enabling accurate analysis of who might be at high risk for insider activity. Naturally, an employee’s motivation is a critical aspect of why malicious activity could occur, and can range from ideology to financial need to collusion or extortion of an employee. Access to and correlation of the right data sets is paramount, but leveraging intelligence analysts, the human factor, is an important piece of the insider puzzle,” he said.

An insider program can also leverage technology such as user behavior analytics (UBA), which provides a head start on bringing all the data together. “However, in order to make the most use of the tool, someone has to be able to filter out the noise. Having access to data in one platform is a great start, but filtering through events and noise is even more critical,” Camacho said, adding that knowing how to find anomalies or patterns that don’t make sense is a key function at the start of a successful program.

“In short, an insider program should be able to curate data points that reveal a toxic risk score of 'who' might be high concern for malicious activity,” Camacho said.
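In the simplest form, the risk score Camacho describes could be a weighted combination of signals correlated from the data sources mentioned above. The sketch below is purely illustrative: the signal names, weights and function names are invented, and a real program would tune them with analysts rather than hard-code them.

```python
# Hypothetical per-signal weights; in practice these would be tuned
# by the analysts Camacho says the program must leverage.
SIGNAL_WEIGHTS = {
    "after_hours_access": 2.0,
    "mass_download": 3.0,
    "usb_write": 2.5,
    "resignation_notice": 1.5,
}

def risk_score(signals):
    """Combine signal counts (keyed by signal name) into one score
    so analysts can rank who warrants a closer look."""
    return sum(SIGNAL_WEIGHTS.get(name, 1.0) * count
               for name, count in signals.items())

def rank_users(events):
    """events: dict user -> signal counts. Highest score first."""
    return sorted(events, key=lambda u: risk_score(events[u]), reverse=True)
```

The output is a ranked queue for human review, not a verdict; as the surrounding quotes stress, an analyst still has to judge whether the behavior was business justified.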
