Bill Boni and Ira Winkler on Insider Threats and the Death by 1,000 Cuts

Security veterans Boni and Winkler talk about protecting intellectual property from insiders and snoops.

From configuring the hardware to connecting all the stovepipes, security executives need to brace for both light jabs and roundhouse rights. Executive Editor Derek Slater talked defensive strategy with Bill Boni and Ira Winkler.

Bill Boni is vice president and CISO of Motorola. Ira Winkler is chief security strategist for Hewlett-Packard. In separate interviews, Slater discussed with them their respective visions of what's needed to get the security practice in shape. Both advocated paying attention to the little things.

CSO: You've both mentioned "the death of a thousand cuts" as a description of what security faces today. What does that mean?

Ira Winkler: Let me give you a recent example. I was talking to somebody at a large Canadian railroad company. She said, "I'm trying to convince my boss of the need for computer security. And he has this attitude that, first of all, we're a railroad company, we're not that high-tech. And, on top of that, we're not an American company, so we're not a target that anybody really cares about."

In other words, the boss doesn't believe [his company is] going to be the target of a devastating attack. OK, let's accept that, because, quite frankly, I think all these claims of terrorism and all the FUD work against us anyway. Still, I asked if she was hit by Code Red. She said yes. Nimda? Yes. Slammer? Yes. Other viruses? Yes. I asked, "Do you have insiders doing things that cost you a lot of money?" She said, "Yes, we have a lot of incidents we have to investigate. We're a large company."

So I said, "Did you ever add up the costs from all of that?" She said, "No, but it would easily be in the tens of millions of dollars."

Bill [Boni] used the term "the death of a thousand cuts" a long time ago. There are a lot of little things that, added up, would be devastating if they happened all at once. And if you did the basic, simple things on an ongoing basis to protect yourself against the small things that add up to a major loss in total, you'd also be preventing the mythical terrorist attacks and other large-scale events.

Bill Boni: The way I look at it is that most organizations don't have a framework for keeping track of loss, particularly intellectual property-related loss. As IP has become digital, you now face the possibility of it being misappropriated without having the loss detected. It doesn't become manifest until an engineer in your company realizes that your biggest competitors have what you were expecting to have, at the same time, when you thought you were a year ahead of them. Plus, they have lower price points because they didn't have to spend the money to develop it.

So you [should try to] capture and synthesize a significant portion of those loss events, using HR, the physical security groups and other branches of the company as sensing mechanisms.

A lot of talk right now in IS is about the software consoles that do event analysis and correlation. I'm talking about creating an analog of that at the corporate level that correlates the technical aspects of security with everything else—HR, legal, all these different areas. Now management can make better-informed decisions with data, not just anecdotes.

A lot of practitioners will take advantage of a breach to say, "Aha, see, we need to protect our IP." But the counterargument is, "This was a onetime event." But if you have a process in place that allows you to prove that, no, it happened three times in the last quarter alone....

The next important question is, What's the source [of the vulnerability]? Is it technology? A legal loophole? A cultural blind spot in employees or management?

Even if you know your intellectual property is leaking out, how do you make that connection between what's been lost and where the loophole is?

Boni: This is where you go back to the fundamentals of counterintelligence. Information security can make its best contributions when you use the whole suite of tools and techniques with a counterintelligence mind-set.

Another example. If someone is scanning the internal network, your internal intrusion detection system goes off, and typically somebody from IT calls the employee who's doing the scanning and says, "Stop doing that." And he replies, "Oh, I was just testing this thing for my college class on IT management. I won't do that again."

He offers you a plausible explanation, and that's the end of it. Throughout the history of IP theft, this is how it always goes. HR sees one thing, physical security sees the guy "accidentally" carrying out documents ("Oops...I didn't realize that got into my briefcase"), and the IT people see the scanning incident. But nobody puts them all together to realize it's the same guy!
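The cross-department correlation Boni describes can be sketched in a few lines. The incident records, field names and employee IDs below are entirely hypothetical, purely to illustrate the "same guy" linkage across sensing groups:

```python
from collections import defaultdict

# Hypothetical incident reports from three separate "sensing" groups.
# In practice these would come from HR case files, badge/physical-security
# logs and IDS alerts; the schema here is illustrative only.
incidents = [
    {"source": "IT",       "employee": "jdoe",   "detail": "internal network scan"},
    {"source": "physical", "employee": "jdoe",   "detail": "documents in briefcase"},
    {"source": "HR",       "employee": "jdoe",   "detail": "policy complaint"},
    {"source": "IT",       "employee": "asmith", "detail": "failed login burst"},
]

def correlate(reports):
    """Group reports by employee and flag anyone seen by two or more groups."""
    by_employee = defaultdict(set)
    for r in reports:
        by_employee[r["employee"]].add(r["source"])
    return {emp: srcs for emp, srcs in by_employee.items() if len(srcs) >= 2}

flagged = correlate(incidents)
# jdoe is flagged because three different groups each saw one incident;
# no single group's view would have raised an alarm on its own.
```

Each incident is explainable in isolation; it is only the join on the employee across independent channels that surfaces the pattern.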

With IP theft, you can't always determine that it was Professor Plum in the library with the lead pipe. But [by adopting] a counterintelligence mind-set you can identify gaps in your protection scheme. Sometimes it [really] is accidental; I've worked cases where they did high-level internal product announcements at a ritzy offsite and left copies of printouts lying around. Sometimes it's not accidental. People in other countries—Ira has seen this—send in "dummies" who get jobs in the payroll department, and [once] they've been there for several months, there's a very good likelihood they'll be able to access valuable documents.

The protection mechanisms are too disjointed. Just as in infosec, we have challenges putting together the big picture. The challenge [in IP loss prevention] is how to pull together all those other sensory mechanisms: access cards, legal policies, areas where product models and mockups are done. You have to consider those as sensing devices or places where you can potentially detect behaviors. But they don't [usually] get correlated in any meaningful way in most organizations.

Winkler: It's hard to put a dollar figure on data or IP loss. When it happens and they talk about prosecuting hackers, they'll say, "I've lost millions of dollars to this." In fact, there was the recent case [involving] Lockheed Martin and Boeing where they were talking billions of dollars. However, I don't think Lockheed Martin took a billion-dollar loss on its balance sheet. Very rarely do they declare the loss in an accounting procedure. And if you don't do that, your executives aren't going to think, "We can protect ourselves against IP theft and save ourselves millions of dollars a year!"

So again, what security managers and CIOs should do is add up the little losses, which will add up to a big loss, and then put their security programs in place by adjusting for the little things.

You touch on the intersection of corporate or operational security issues and info security. Ira, you have a story where you were doing penetration tests at a client company and were able to walk out with critical engineering documents that you found—not in the engineering department but in the graphics department.

Winkler: Right. The CEO has the graphic arts department at his beck and call, and its responsibility is to make documents look pretty. Now, the graphic arts people think of themselves as artists; they're not thinking about, "Hey, I have some of the most valuable documents in the company on my server." Obviously, if you go to the financial group and say, "I want to see your financial data," they'll laugh you out of the office. But if you go to the graphic artists and say, "Can I take a look at your computers for a minute?" they'll say, "Sure, why not." So people have to understand that there are many places where valuable data goes. And, ironically, some of the most valuable data gets sent to places where they think the data's irrelevant.

That makes an argument for active cooperation of all security groups. It also makes a case for the concept of Defense in Depth: Deemphasize the perimeter-oriented approach to security and start thinking in terms of layers of internal defense.

Winkler: Defense in Depth is actually a Department of Defense concept. The DoD has been using it for a long time. Most people start thinking of defense at the perimeter, but Defense in Depth treats each piece of the network as a perimeter of its own. It's not a new term, but it's getting more publicity as more defense people end up in private industry. It's a darn good term.

If you adopt Defense in Depth, you eliminate the debate about which constitutes the bigger threat—internal or external breaches—which seems like a pointless question anyway.

Winkler: At one level, it's pointless, because I've always said threat is irrelevant. It's irrelevant whether they're a teenager, an insider or an outsider—someone is going to try to get you. But different threats do have different levels of resources they can throw at you. Teen hackers may scan your website for a while, and then maybe they make a phone call to try some social engineering. But then they go away. However, if you are a [financial sector] company, you are also potentially threatened by outsiders who want to steal money. And if you're talking about, potentially, more organized criminals or competitors, they will get a job inside your company or, more likely, recruit someone who's already inside to steal information for them. So you have to do Defense in Depth.

Back to the money question. We have written several articles saying that CSOs need to do a better job quantifying the cost of a breach, return on security investments (ROSI) and so on. Donn Parker, of SRI International fame, wrote in to say that that's the wrong approach; it's really about due diligence. A lot of people say you can't calculate ROSI. Is it a red herring?

Winkler: There's a big difference between due diligence and security. Due diligence says I might suffer a loss, but nobody can sue me for it. Security, instead, needs to be approached from the standpoint of balancing my risk. If there was some great standard out there, some good laws that said here's what you must do specifically in terms of information security, then taking a due diligence approach might be acceptable.

But if I'm a good security person, I have more to worry about than just preventing a lawsuit; I'm supposed to supply a good cost-benefit to my company. I need to keep it not only out of court, but profitable. I would argue that, theoretically, Enron might have done due diligence, but we all know where it ended up. Due diligence basically says that as long as your CEO can't be sued if the company goes bankrupt, you're fine.

Let's talk more about standards and regulations. We recently surveyed readers about whether, since budget justification is so difficult, there should be more regulation. We got a very mixed response.

Winkler: You have to realize that a regulation, if nothing else, is going to [apply] a uniform standard across a large number of computers. It's never going to be perfect, but it can be reasonable. If you want good [proposed] regulations, here are three.

First is to configure systems according to an acceptable guideline from, say, the Center for Internet Security, from the National Security Agency or from the vendors—freely available [specifications] that have gone through industry peer review.

Second, manage [systems] correctly with a patch-management program. Fixing bugs within, generally, three months allows you to be relatively secure. If you graph the CERT Coordination Center data, most exploits begin to rise after about three months. The activity hits a peak and then comes back down around six months. So that means if you fix a vulnerability within one to three months, the likelihood of your being exploited is acceptable.
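Winkler's timeline can be encoded as a toy rule of thumb. The 90- and 180-day thresholds below simply paraphrase the CERT curve he cites (exploits rise after roughly three months and subside around six) and are illustrative assumptions, not empirical cutoffs:

```python
def exploit_risk(days_since_patch_available):
    """Toy classification of the exploit-activity curve described above:
    activity ramps up after ~3 months and peaks before subsiding at ~6.
    The 90/180-day thresholds are illustrative, not measured values."""
    if days_since_patch_available < 90:
        return "low: before the typical exploitation ramp-up"
    elif days_since_patch_available < 180:
        return "high: within the peak exploitation window"
    else:
        return "declining: past the typical peak"
```

The point of the sketch is the ordering, not the numbers: patching inside the first window keeps you ahead of the bulk of exploit activity.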

Third, network administrators should be reasonably well trained. When computers were first coming out, I [heard about] a company that took its secretary and said, "OK, you know Microsoft Word and Excel, so we're making you our Unix administrator." True story. That's the type of environment we were in. But today, just as you need well-trained mechanics to fix an airplane, you need well-trained administrators to maintain your systems. Some companies are going to say, "I can't afford to send my people to a class to learn how to do this well." But, to me, if you can't afford to do the basics right, you're not offering a secure service to your customers, and maybe you shouldn't be in business.

In raising the notion of "reasonable regulations," you talk about basing regulatory decisions on historical data such as the CERT diagrams. Another analogy that might be useful is the process of legally mandated auto inspections. You have to maintain a car to certain benchmark specs, and you ought to maintain your computer systems similarly.

Winkler: By installing your computers well, you can keep them up and running. Turning off unnecessary processes makes the systems more efficient. This is where security is increasing performance. People lose track of the fact that patches don't all have to do with security [vulnerabilities]. They sometimes have to do with functionality. Doing a security program makes your systems more functional, more stable.

Unfortunately, better patching alone won't make information security work. Looking over the PricewaterhouseCoopers global survey results (see "The State of IT Security 2003"), the only clear conclusion is that corporate infosec is a mess. There's a bizarre lack of correlation between spending and efficacy, for example.
