
Wikileaks causes ‘need to know’ pendulum to swing again

Dec 13, 2010

Ira Winkler says security needs to practice consistent, measured incident response to avoid extreme policies and predictable overreactions

If you are in the security profession long enough, you realize that security policies are generally a pendulum that you have to live with. While the recent Wikileaks incident will cause information sharing within the Department of Defense and US Government as a whole to swing back the other way, you can bet that in a few years, it will swing back to wholesale information sharing again. Like many aspects of the world, security practices are—unfortunately—driven by the most recent events.

In the case of Wikileaks, it is generally assumed that a low-level intelligence analyst stationed at a remote base in Iraq, Bradley Manning, surfed around a classified military network and downloaded all the data that he could. He then transferred the data onto a USB drive that he had been given permission to attach to the network so he could listen to music. He then took the USB drive to a computer connected to the Internet and sent the classified data to Wikileaks. Now, of course, everyone is outraged that anyone had access to all of that classified data when they clearly didn’t need it. I am going to avoid going into my personal thoughts about Wikileaks and Manning, but the implications for security professionals are important to understand.

First, let’s consider some of the findings resulting from the 9/11 attacks. There was, of course, outrage that the CIA and FBI were not sharing information. Many other pieces of information likewise went unshared among intelligence agencies, including, for example, some information that the Department of State had available. The 9/11 Commission cataloged a lot of this, and in response, the Department of Defense and the intelligence agencies decided that they wanted to make information more available. This led to the creation of one big network that apparently everyone with a Secret clearance across the US Government had access to—including an Army private located in the middle of nowhere with very limited responsibilities.

I am clearly oversimplifying the issue, but that is what it really boils down to.

The pre-9/11 lack of information sharing was deemed dangerous, and in the rush to fix it, everyone ignored the earlier incidents that had led to the withholding of information in the first place.

For example, there was the case of Jonathan Pollard, a Naval intelligence analyst who gave sensitive information to Israel. This was before there was much computer connectivity, so he would visit other intelligence agencies to look for information. There were several other similar incidents as well. Therefore, need-to-know was strictly enforced. Again, the 9/11 attacks swung the pendulum far toward the other end.

In many ways, Wikileaks demonstrates exactly why things were the way they were pre-9/11. When you simply make information available for sharing, without regard to the inevitable malicious insider—and a malicious insider is an inevitability, not a remote possibility—you are handing the information to your enemy. I doubt that many of the people at the Department of State realized that their diplomatic communications were accessible in a remote desert of Iraq. It appears that information sharing across the classified network was devoid of fundamental security practices.

Any basic auditing would have shown that classified servers were essentially being vacuumed up. (See Log management basics.) Auditing would have detected that a system in the desert downloaded gigabytes of unnecessary information. There was clearly no misuse and abuse detection in place. I also assume that audit logs were not retained well enough to perform a forensic analysis, after the fact, of exactly what Manning downloaded.
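The kind of basic audit check described above need not be elaborate. Here is a minimal sketch in Python, with made-up usernames, byte counts, log format, and alert threshold (all assumptions for illustration, not details from the actual incident), of flagging any user whose total download volume is far outside the norm:

```python
from collections import defaultdict

# Hypothetical audit-log records: (user, bytes_downloaded) per transfer.
# Real audit logs would also carry timestamps, hostnames, and document IDs.
AUDIT_LOG = [
    ("analyst_a", 2_000_000),
    ("analyst_a", 5_000_000),
    ("analyst_b", 1_500_000),
    ("analyst_c", 900_000_000),    # bulk transfers far outside the norm
    ("analyst_c", 1_200_000_000),
]

def flag_bulk_downloaders(records, threshold_bytes=100_000_000):
    """Sum downloads per user and flag anyone over a fixed byte threshold."""
    totals = defaultdict(int)
    for user, nbytes in records:
        totals[user] += nbytes
    return {user: total for user, total in totals.items()
            if total > threshold_bytes}

if __name__ == "__main__":
    for user, total in flag_bulk_downloaders(AUDIT_LOG).items():
        print(f"ALERT: {user} downloaded {total / 1e9:.1f} GB")
```

A fixed threshold is the crudest possible detector; a real misuse-detection system would baseline each user’s normal behavior and alert on deviations. But even this trivial check would flag a workstation pulling down gigabytes of material it has no business touching.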

The problem is that anytime you implement a new technology without a built-in security architecture and the relevant policies, you are asking for trouble. About 15 years ago, I was a member of the C4Ipros mailing list, which was aimed at intelligence professionals. It became a practice for people to send their bios, with career and personal background, to the list. There were people touting how they had served in intelligence commands in sensitive areas around the world. I saw a post from someone in China and commented that people should consider this issue, and was flamed by many for harming the mission of the list. I advised people to check with their security officers.

In the mid-1990s, I helped develop several Intelink websites for intelligence executives. Intelink is basically an intranet for the US Intelligence Community. Many of the pages I worked on contained the biographical data of people in different offices. My first comment was, “This information has no practical value whatsoever. Why am I putting up information that will only help people guess their passwords?” I was told to basically shut up and put the site up.

There was also a little-known incident later, in which one of the destroyers that took part in the 1998 bombing of an al Qaeda camp in Afghanistan was found to have a website with personal details of the crew, including the captain’s hometown and detailed information about his family. The captain’s family was put into protective custody when this was discovered, out of concern that terrorists might retaliate. For a while after that event, there was a clampdown on such public Internet sites. However, just go online now and you can find even more information about hundreds of military units and ships. When we see another terrorist threat, these sites will again be banned.

Going beyond IT security issues, you only have to look at airport security. Immediately after the Underwear Bomber incident in December 2009, nobody questioned the use of full-body scanners. I flew the same Amsterdam-Detroit route several weeks after the incident, and nobody questioned the increased security required to board the plane. A year later, however, during a slow news period, the media started reporting on outrage over full-body scanners and the pat-downs required if someone opts out of the scanner. Then, of course, came the printer-cartridge bombs, and people were reminded of the threat. We haven’t heard many complaints about the scanners lately, have we?

The problems security professionals experience occur when there are extreme reactions to security incidents or an extreme lack of attention to security issues when implementing new technologies. Clearly, the extreme lack of security countermeasures in the classified network allowed Manning to go undetected while downloading gigabytes of information. There was an extreme overreaction to the pre-9/11 lack of information sharing that opened up the network in the first place.

Security professionals have to find a way to get into the decision cycle to ensure that these incidents don’t repeat themselves in a different form. The way to do this is to make sure that you don’t swing the pendulum too far in the opposite direction. A measured reaction that accounts for the issue at hand, while addressing the underlying security concerns, will minimize the likelihood of later incidents AND overreactions. This means the countermeasures will survive the next inevitable outrage.

Likewise, your organization has to learn to weather a slow news cycle that makes a proverbial mountain out of a molehill. Just as the airport security screening outrage has died down, so will other issues you have to deal with, if you can contain them and guide your company to a carefully thought-out response.