Who should be on an insider risk team?

Catching an insider taking confidential information doesn't happen by chance.


Matias Brutti, a hacker at Okta, said the red team is perhaps the least accounted-for, but potentially most important, role on the insider risk team, followed by the more obvious incident response and monitoring teams. A red team is responsible for playing the role of the insider threat while the blue team tries to monitor and defend against those threats. The red team members proactively try to find ways to exfiltrate data, obtain personally identifiable information (PII) and access unauthorized services outside of someone’s scope. “They do this to ultimately help build realistic processes and procedures that will prevent a real attack in the future,” he said.

Exabeam CEO and Co-founder Nir Polak takes a slightly different view on who encompasses the insider risk team. “We are seeing more companies create insider risk organizations, and often these do not report into the IT security organization. These teams typically include people with police or investigation backgrounds instead of IT skills. This can make sense, as insider attacks are often not technology-based. They use valid credentials with valid access to sensitive information, but with a goal of using that information for invalid purposes. In this environment, forensic and detective skills will be very valuable,” he said.

Team duties

To put it simply, Polak said, these teams are put together to create policy that minimizes risk, then select solutions that help implement that policy. They also use the same tools to monitor compliance with the policy. For example, the policy might say that “employees shouldn’t have more access to confidential data than their current job requires,” and the team then implements a program to regularly review access.

“You’d be surprised how often employees accumulate access rights and then never give them up when they move to new projects,” Polak said.
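The access review Polak describes can be automated in a few lines. The sketch below is a hypothetical illustration, not a product feature: the role baselines, employee records and entitlement names are invented for the example, and a real program would pull them from an identity management system.

```python
# Hypothetical sketch of a periodic access review: flag employees whose
# granted entitlements exceed what their current role requires.
# Role baselines and employee records here are illustrative assumptions.

ROLE_BASELINES = {
    "engineer": {"source_code", "build_system"},
    "analyst": {"reporting_db"},
}

employees = [
    {"name": "avery", "role": "engineer",
     "granted": {"source_code", "build_system", "customer_pii"}},
    {"name": "blake", "role": "analyst",
     "granted": {"reporting_db"}},
]

def excess_access(employee):
    """Return entitlements beyond the employee's current role baseline."""
    baseline = ROLE_BASELINES.get(employee["role"], set())
    return employee["granted"] - baseline

# Employees still holding rights accumulated from old projects.
findings = {e["name"]: excess_access(e) for e in employees if excess_access(e)}
```

Run on the sample data, the review flags “avery” for retaining `customer_pii` access that the engineer role no longer justifies, which is exactly the accumulated-rights problem Polak points to.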

Kris Lovejoy, CEO of BluVector, has found that where employees and contractors (part of the extended team) have clarity on “why” security is important, it’s much easier to ensure they learn and adhere to the “how.” “That said, the flip side of the insider risk team function is to assure policies and processes exist which enable a speedy response when 'rules' are broken.”

Jo-Ann Smith, director of Technology Risk Management and Risk Privacy at Absolute, said the first task for an insider threat team should be to define the various types of risks that exist within each level of their organization. Next, the team should prioritize the risks and implement solutions, which include setting guidelines for interactions with their own direct reports, providing direction on the type of baseline controls that will be required to reduce or mitigate risk, and establishing baseline standards that will enable the company to measure existing risk levels and report on them.

The team should conduct thorough, regular vetting of employees and vendors. This is especially important for personnel who have exhibited strange behavior or been the subject of formal complaints, as well as those in positions with privileged access to critical assets and sensitive information, Burke said.

Westby said a team lead should be responsible for establishing and managing the overall effort and reporting to any security, board or audit committees on insider risk. The core team should have an operational lead that is responsible for executing monitoring, testing, incident response and remediation activities. A program architect/designer should lead the development of policies, controls, processes and selection of tools for the program. An analyst should work with team members and their organizations to execute the risk assessment process and reporting. Finally, an oversight lead would help measure performance and ensure compliance.

Chris Gray, practice leader and vice president of enterprise risk and compliance at Optiv Security, said “Monitor, monitor, monitor. I cannot stress enough how important threat identification is and rapid, effective identification stems from good monitoring processes. If you don't know what right looks like, how can you identify wrong?”

Dottie Schindlinger, governance technology evangelist at Diligent, said the policy should institute a program of training, testing and auditing of the systems/controls. The policy should lead to a procedure that identifies the specific systems and controls in place to help identify, mitigate and manage potential insider risks. The procedure should also explain the process for anyone within the company to report potential insider risks, and the protections available for “whistleblowers.” Ideally, the policy and its associated procedure should be reviewed, tested and audited at least annually.

“Compliance should own the process, requirements and procedures, as they are the gatekeepers of these areas,” Brutti said. He added that all teams should routinely meet and discuss new scenarios, and keep a matrix that allows the company to map teams to access levels and data that can be reached. From this matrix, the teams can adjust, prioritize and create policies. They can also institute key segregation of duties to disperse the crucial functions of certain processes to more than one person or department.
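The matrix Brutti describes, mapping teams to the data they can reach, also gives the team a place to check segregation of duties mechanically. The sketch below is a minimal, hypothetical illustration: the team names, data sets and conflicting-duty pairs are invented for the example.

```python
# Hypothetical team-to-access matrix and a simple segregation-of-duties
# check: no single team should be able to reach both sides of a
# conflicting duty pair. All names here are illustrative assumptions.

ACCESS_MATRIX = {
    "finance": {"invoices", "payments"},
    "engineering": {"source_code"},
    "support": {"customer_records"},
}

# Duty pairs that should be dispersed across more than one team,
# e.g. approving invoices vs. issuing payments.
SOD_CONFLICTS = [({"invoices"}, {"payments"})]

def sod_violations(matrix, conflicts):
    """Return teams that can reach both sides of a conflicting duty pair."""
    violations = []
    for team, data in matrix.items():
        for left, right in conflicts:
            if data & left and data & right:
                violations.append(team)
    return violations

flagged = sod_violations(ACCESS_MATRIX, SOD_CONFLICTS)
```

In the sample data the finance team is flagged because it can both approve invoices and issue payments; splitting those duties across two teams is the kind of adjustment Brutti says the matrix should drive.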

“This reduces risk and provides understanding for how to monitor and where to invest and obtain a good return on investment when building new monitoring platforms and rules,” he said.

Mancini said the duties of an insider risk team will vary based on what you consider insider risk. For Cylance, insider risk comes in several forms (disgruntled employee, spies, unwitting employee, contractor/vendor threat), each requiring different duties to mitigate. Some risks can be intercepted through continuous risk assessment, procedural channels for airing grievances, sustained employee health and morale programs, appropriate executive messaging around morale, training for managers to spot insider risk in their reports, and technical monitoring of assets and of potentially adverse activity initiated with legitimate privileges within the organization.

“The team mission would be to design, implement and provide oversight for controls to reduce risk based upon these different insider threat profiles. They would provide governance over technology solutions to ensure efficacy but also ensure that employee privacy is protected. They would design, implement, and test the necessary incident response programs customized to address the differences insider risk introduces,” he said.

Yossi Shenhav, co-founder of KomodoSec Consulting, said, “The first duty of an insider risk team is to do a thorough background search on all employees, old and new, to see if any red flags arise. Then, all employees should be made aware that there is constant, systematic monitoring and restriction of access to sensitive or financial data, so it will be absolutely clear that any improprieties will be intercepted and dealt with swiftly and severely. Lastly, since incidents will still occur by individuals who are intent on violating the law, a subgroup should serve as an incident response team backed by systemic forensics to block the attack and/or minimize the severity of the breach and apprehend the offenders.”


Copyright © 2017 IDG Communications, Inc.
