Jadee Hanson's security analysts are always on the lookout for risky behaviors, so it's not surprising that they spot their business-unit colleagues sometimes acting in concerning ways, such as publicly sharing a document that might contain sensitive data. When that happens, an analyst reaches out to the colleague to determine whether he or she violated any security rules and to confirm he or she understands the company's cybersecurity best practices.

Hanson, the CISO and CIO of Code42, a cybersecurity software company, sees value in that outreach. She says it can catch and correct problematic behavior, provide an opportunity for security awareness training, and identify a potential policy breach at an early stage.

Yet she also saw that the number of incidents requiring security's attention was impacting the security team's workflow. For example, in 2021 the security team handled 235 incidents in which sensitive, administrator-level credentials were used. Each incident took approximately 10 minutes to address, sometimes more, totaling roughly 40 hours of FTE time that year; moreover, each incident interrupted the security analysts' existing work and pulled their attention away from other tasks.

At the same time, employees sometimes felt uneasy, as though security was watching and monitoring their every move; that in turn led to some tense interactions and reinforced negative stereotypes about security being the department of "no."

Determined to find a better way, Hanson and her team turned to automation. They designed and deployed chatbots for insider risk security incident response, and they trained the bots using their empathetic investigation framework to ensure the bots (and, by extension, security) had positive engagements with their fellow employees.
Jadee Hanson, CISO and CIO, Code42

The goal: to use automation and chatbots to handle incidents more efficiently while also delivering positive touchpoints between the security team and employees and allowing security analysts to focus on more strategic cybersecurity work. The project earned Hanson and her team a 2022 CSO50 award for security innovation.

"In security, I think we struggle quite a bit just keeping up, and that leads to a lot of us getting burned out as we try to keep up with alerts and all the other daily stuff," Hanson says. "So my team created robots for something that we do on a very consistent basis while keeping the same tone and culture that we want to carry into the organization."

Advancing the use of automation

Code42's security department was already using automation to streamline tasks within its operations work before deploying the bots for insider incident response management, so Hanson says using the technology to connect with the company's own employees was a natural extension of that ongoing automation work.

Yet Hanson wanted the chatbots to retain a human touch. She explains that she and her team wanted the bots to take on a collegial tone that would mimic the goodwill her security workers seek to engender when working with their business-side colleagues.

"You want to make sure those interactions between the bots and the employees have the same tone and culture that security is trying to bring to the organization. So I'm super picky about the messaging, because I want to make sure it comes off with the right tone," Hanson says. "The words that a security team uses when they talk to employees are critical, and sometimes that is overlooked. But they're really, really important. I want my security team perceived as helpful, so we have to think about how we want the company to think about us, especially when we reach out in communications.
We want them to say, 'What a nice message.'"

Hanson says attention to this topic pays dividends for security.

"For a long time, security organizations have sort of blamed the end user, and we say 'no' a lot. I think we're well intended: our job is to protect the organization, so we have to say no, and we get angry when end users do something that they shouldn't do. But I don't know if we have taken on enough of the accountability. It's our responsibility to educate the rest of the organization on the right way to do something. It's also our responsibility to figure out how to say 'yes,'" she says.

'First assume positive intent'

To ensure a positive interaction between employees and the bots, Hanson's security team leaned on what they call an empathetic investigation framework, built on key insights from the company's own Incydr product.

"When approaching users during an insider risk investigation, security analysts should ensure interactions exercise tact, empathy, and caution," she explains. "Investigating this way is a shift from how traditional malware security investigations are performed. This new approach was developed over time by insider risk management analysts using Code42 Incydr. It is founded on the principle that employees are trusted experts who are enabled to do their jobs. You wouldn't treat a colleague the same way you'd treat an external attacker.
For a coworker, you'd first assume they acted with positive intent: perhaps they accidentally shared that document too broadly, or maybe they didn't know our data ownership policy because they're still new."

Hanson says the security team has found that their business-unit colleagues in fact are usually unaware that they have taken a risky action.

"A friendly message or phone call asking if the person meant to take that action often cleared things up, and the action was corrected quickly," Hanson says. "We'd then take the opportunity to remind them about a protocol, provide training, or share resource materials in the moment, which went a long way toward making our company culture more security-aware in the long run."

Given how well that approach works, Hanson says she wanted the bots to act in the same empathetic way.

"The [automated] message looks like it's coming from my insider risk analyst; even though the bot is doing the work, it uses his tone, and people have this sense that he sent the note and that he's really nice about it," she says.

Hanson says automating insider risk incident response saves time for both security analysts and employees performing high-security activities and, through open and consistent dialogue, fosters positive relationships between security and other employees. And, she adds, the automation work so far is fueling further automation within the security program.

It's a successful journey that Hanson and her team are eager to share: They're publishing information about the empathetic investigation framework as well as their technical playbook, built specifically for Palo Alto Cortex XSOAR and Slack, to encourage open source development and contributions back to the security community.
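To make the idea concrete, here is a minimal, hypothetical sketch of the kind of automated check-in such a playbook might send. It is not Code42's actual playbook code; the function names, message wording, and the use of a generic Slack incoming webhook are all illustrative assumptions. It simply shows how an "assume positive intent" tone can be baked into an automated message.

```python
import json
import urllib.request


def build_checkin_message(employee_name: str, file_name: str, analyst_name: str) -> dict:
    """Build a Slack message payload with a friendly, assume-positive-intent tone.

    Hypothetical example: names and wording are illustrative, not Code42's
    production playbook.
    """
    text = (
        f"Hi {employee_name}! This is {analyst_name} from the security team. "
        f"We noticed that '{file_name}' was recently shared publicly. "
        "Totally fine if that was intentional; if not, you can restrict access "
        "from the file's sharing settings. Here's a quick refresher on our data "
        "sharing guidelines in case it helps. Thanks!"
    )
    return {"text": text}


def send_checkin(webhook_url: str, payload: dict) -> None:
    """POST the payload to a Slack incoming webhook (requires network access)."""
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    # Build (but don't send) a sample check-in; webhook URL is a placeholder.
    msg = build_checkin_message("Ana", "Q3 roadmap.docx", "Sam")
    print(msg["text"])
```

In a real deployment, a SOAR platform such as Cortex XSOAR would typically trigger a message like this from an alert and route it through its built-in Slack integration rather than a raw webhook; the point of the sketch is only the tone of the outreach.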