Pethia: InfoSec's Challenges, Changes

CERT-CC founding director Rich Pethia reflects on old IT security issues, and the ones CISOs are facing now

A lot has changed in the 20 years since Rich Pethia first took the reins as director of Carnegie Mellon University's Computer Emergency Response Team (CERT). CERT, which was initially launched as the first Internet security response organization, has evolved over the years to focus more on a research and training role. Pethia, a CSO Compass Award winner, spoke with CSO about how much IT security and vulnerabilities have changed in two decades.

CSO: CERT is celebrating its 20th anniversary this year. Tell me about those early days when you first started?

Rich Pethia: Back in 1988, when we started, the Internet comprised a total of about 200,000 systems, and almost all of them were here in the US. Most were in university research labs, some in the government. A few were in the private sector, although commercial use of the Internet had not yet begun.

That year a graduate student at Cornell University let loose the first Internet worm, which over a period of 10 or 12 hours clogged enough machines and put enough traffic on the network that the network was bogged down and useless for any activity. So the researchers around the country who were responsible for putting the Internet together, along with their sponsors in DC, which at the time meant the Defense Advanced Research Projects Agency (DARPA), sort of self-mobilized and understood they needed to capture this piece of malicious code. They reverse engineered it, they understood what vulnerabilities it was exploiting, and they got patches out to everybody. Following that event, the network was eventually back up and running.

There was a whole series of meetings in Washington, DC about what to do about these kinds of things. The thinking was that this was probably the first of what would be a long series of problems with the Internet. So DARPA decided to establish the CERT Coordination Center, and our initial charter was to be a response center for anyone on the Internet who had security problems.

In the early days we spent time getting to know the network service providers, getting to know the organizations that were connected to the Internet, and coming to understand what operating systems and protocols were being used. Over the course of the first year, we probably responded to only five or six incidents. Most of them were break-ins of one kind or another by people who were just curious and trying to understand how far they could push their break-in techniques. In some cases there was malicious intent. But it was really a very low level of activity in the first year.

We also became a reporting center for security vulnerabilities in products. At the time, the predominant system was Unix so we got to know all of the Unix systems vendors and worked with them to get fixes out to the broad community.

By the 1995-96 timeframe, we were getting reports of upwards of 15,000 to 20,000 incidents a year and reports of 6,000 to 7,000 new vulnerabilities each year.

How has CERT's role changed to what it is now?

We've really moved away from response over the years. Another piece of our charter was to help other organizations form response teams. And so there are now over 200 CERT teams worldwide. A lot of them have stepped forward to pick up the bulk of the load.

Also, about five years ago, when DHS was created, the government decided to create US-CERT, which is the response team for the US. We support them, but they are the front-line organization for Internet response in the US. We are one step removed from day-to-day response activity.

We still take the lead with respect to working with vendors to get security vulnerabilities resolved. A lot of those vendor organizations early on had very immature processes for handling vulnerabilities. Obviously that is no longer the case. The Oracles and the Microsofts and the Ciscos of the world have very mature processes. But occasionally there will be a very significant vulnerability that affects many different vendors. Our role there is to coordinate response across those vendors. The idea is that they all go out with updates at the same time, so that one part of the community isn't left vulnerable because one vendor went out with corrections ahead of time, announcing the problem to the whole world.

Vendors contact us in good faith. That is what our original sponsor, DARPA, was quick to tell us: we have no authority, no mandates or requirements that anyone work with us. But over time I think people have come to recognize that this kind of coordinated activity helps the community most, and they have come to respect the need for these kinds of practices.

Is CERT's role more research-based now? What kinds of issues are you dealing with these days?

We have expanded our research program pretty dramatically over the years. We now focus probably two-thirds of our activity on work aimed at preventing incidents and one-third on helping people respond.

We move forward by looking for gaps in commercial practice. A significant piece of our work in the last few years has been on cyber forensics: helping build forensic methods and tools that help law enforcement do a better job of investigating. That means not just crimes against computers, but also computers used to commit crimes. One of the problems the community faces is that there simply aren't enough trained investigators to deal with these problems, given the explosion of computer crime and computer-assisted crime we have seen in the world. So we are helping to fill that gap by building customized training for these organizations, which we make available through a virtualized training environment. Field agents can quickly come up to speed on the steps to take to properly secure computers and collect information in a way that preserves the chain of evidence so it can eventually stand up in court.

We began to understand the day-to-day problems investigators were having as the technology changed over time. If you look at a standard tool kit for someone who goes out in the field to investigate a computer crime, it's a little standalone device that you can plug into a PC to collect data and do some online and offline analysis. Very quickly we discovered these things couldn't keep up with the massive amounts of memory and storage on today's computer systems. It's nothing today to walk into an office and find a PC with a terabyte of data hanging off the side. So building prototype devices that demonstrate how you can better use technology to deal with these kinds of problems has been part of our agenda.

Another gap we see is the difficulty organizations have implementing effective security practices. Sometimes organizations are driven by regulations. In the federal government, for example, federal civil agencies have to comply with the FISMA (Federal Information Security Management Act) regulations. In the private sector, organizations seem to be driven by standards of some kind.

They seem to have problems in two areas. The first is understanding how to use these practices in a way that actually reduces risk; in too many cases we see security programs that are merely check-the-box compliant. The other is that we see people struggle to implement security practices in a way that integrates with the other IT management, risk management, and continuity practices in the organization. So we recently released something we call the resiliency engineering framework, which is a model that integrates just that, in a way that lets organizations understand whether they are making real progress in risk management and how to measure that progress over time.

What kind of operational benchmarks does that framework suggest with regard to security?

The struggle with benchmarks is that organizations, even if they are in the same business area, very often differ enough in the way they are organized and have architected their information technology that a one-size-fits-all benchmark is going to be problematic.

In all cases security is context specific. The practices and technologies used have to be tailored to an organization's management structure and the way its technology is structured. So I think the approach that does make sense is to allow an organization to look at itself over time and see how it improves.

Application security is becoming an area of focus now in security. Are we at the tipping point yet?

I think we are getting close to a tipping point. Organizations are getting better at securing their networks and operating systems. When you think about how difficult it is for a bad guy to accomplish what he wants to accomplish, attacking operating systems is going to continue to get harder. I see the whole field of applications as the next line of attack, because that is where the least attention has been paid so far. So people will be under pressure to do a better job there.

But there is a lot more promise in application security. If you think about general-purpose security tools that try to detect anomalous conditions of one kind or another, detection is easier if you understand the context of the application. For something sitting at the OS level, trying to detect a problem is one thing. For something sitting within an application, trying to detect data inconsistencies is an easier problem to solve.

Copyright © 2009 IDG Communications, Inc.