Hacked Opinions: The legalities of hacking – Jeff Schilling

Jeff Schilling, CSO of Armor, talks about hacking regulation and legislation


Jeff Schilling, CSO of Armor, talks about hacking regulation and legislation with CSO in a series of topical discussions with industry leaders and experts.

Hacked Opinions is an ongoing series of Q&As with industry leaders and experts on a number of topics that impact the security community. The first set of discussions focused on disclosure and how pending regulation could impact it. Now, this second set of discussions will examine security research, security legislation, and the difficult decision of taking researchers to court.

CSO encourages everyone to take part in the Hacked Opinions series. If you would like to participate, email Steve Ragan with your answers to the questions presented in this Q&A. The deadline is October 31, 2015. In addition, feel free to suggest topics for future consideration.

What do you think is the biggest misconception lawmakers have when it comes to cybersecurity?

Jeff Schilling, Chief Security Officer, Armor (JS): I think the common misconception is that we don’t have enough security people and capabilities. The reality is that our networks were never designed to be secure, so we are not dealing with a security problem; we are dealing with poor design.

Gen. Hayden, former Director of the CIA and NSA, remarked in a speech last week in Dallas: “Security was not in the Statement of Work for ARPANET” (the first TCP/IP network, created for military research).

Legislation needs to cover not only security standards and policies, but also the companies that design the appliances making up the Internet of Things. We still have a culture of application and operating system developers who don’t write secure code. We also have network vendors with flawed operating systems and hardware.

What advice would you give to lawmakers considering legislation that would impact security research or development?

JS: I think that legislation should also cover how we train our IT professionals. Right now, security is a separate course that computer science majors can elect to take. This shouldn’t be optional. Can you imagine if we trained our civil engineers to build a bridge that could carry cars, but not survive an earthquake?

If you could add one line to existing or pending legislation, with a focus on research, hacking, or another related security topic, what would it be?

JS: In order for a university or professional organization to be accredited in computer science or engineering, 25% of the course curriculum should be focused on security.

Now, given what you've said, why is this one line so important to you?

JS: Poor network design, poor engineering, and poor software development have caused our current problems. Throwing more security resources at the problem will not make it go away.

Do you think a company should resort to legal threats or intimidation to prevent a researcher from giving a talk or publishing their work? Why, or why not?

JS: This really depends. The responsible action for researchers is to first inform the organization that they have found a problem. If they’ve already taken that action and the company has not fixed the problem, then I think making the flaw public is the right thing to do. The company would have no moral ground to intimidate or sue.

What types of data (attack data, threat intelligence, etc.) should organizations be sharing with the government? What should the government be sharing with the rest of us?

JS: The government would really like to see the technical data or indicators of compromise we have from cyberattacks on our organizations in what I call the “last tactical mile of a cyberattack.”

The government’s threat data is collected from law enforcement and foreign intelligence surveillance activities. This data is often collected “further upstream,” closer to the attacker’s origin. If they can correlate the “last tactical mile” indicators of compromise with their “upstream” data, they get a complete picture of the threat actor’s targets and activity.

What we need from government is the threat context and what I refer to as “the human terrain” of the attackers so we know what the attackers are after and where to focus our security operations.