Simon Crosby, from Bromium, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Simon Crosby, CTO, Bromium (SC): Lawmakers are misled by vendors and expensive consultants into believing that it is not possible to secure the enterprise, that all IT systems are fundamentally vulnerable, and that attackers are vastly sophisticated compared with their meager defenses. The lobbyists with the most influence to exert (read: cash) are those most invested in maintaining the status quo. There is a fatalistic attitude in security that "you've already been breached," but the reality is that many bright minds working in security are more sophisticated than the malicious actors; we simply must adopt a paradigm shift that moves us away from reaction and toward proactive protection.

What advice would you give to lawmakers considering legislation that would impact security research or development?

SC: Security research is vital to national defense. It enables us to stay a step or two ahead of attackers by understanding, ahead of a major breach, how to protect and fix vulnerable systems. Additionally, it is imperative that we not sacrifice our security in an effort to improve surveillance and data gathering. Any security solution that is built with a backdoor is fundamentally flawed and inherently insecure, because it is only a matter of time until it is cracked.

If you could add one line to existing or pending legislation, with a focus on research, hacking, or other related security topics, what would it be?

SC: Ethical research conducted to understand the security of systems and software, specifically aimed at permitting vendors and users to better protect U.S. infrastructure, shall be a protected activity.

Now, given what you've said, why is this one line so important to you?

SC: The research community has helped to protect us better than any vendor tool that aims to detect attackers. Secure infrastructure results from clear, open designs and implementations, followed by independent third-party research to establish and verify the resulting security posture. It is only by shining a light on the dark spots of security that we can illuminate a solution. If we persecute and prosecute security researchers who discover vulnerabilities, we risk incentivizing them to keep their discoveries secret, preventing the hardening of software and perhaps even driving the malicious use of vulnerabilities.

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?
SC: All vendors have limitations, and they hate to have them publicly exposed. But instead of resorting to threats, they should embrace research that is ethically conducted. Vendors that intimidate the research community with legal threats are actively discouraging improved security, and so-called security vendors that engage in these tactics are selling nothing more than a thin veneer of security.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

SC: Anonymized threat data that contains information about the attack and the attacker's methods, sites, and tools should be shared. We should hide details of the attacked user and, where relevant, the attacked organization. Again, we must be diligent in preventing the government from using security as an excuse for surveillance, since there are international economic implications for the technology industry if its reputation and trust are damaged.