Hope for the Cybersilent Majority

Aug 01, 2003 | 3 mins
Data and Information Security

Sharing data on cybersecurity attacks is Crime-Fighting 101.

One of the largest problems besetting the information security practice is an understandable reticence among the victims of cyberattacks to come forward and share the experience with the rest of us. In industries where reputation is everything, it’s a bit of a nonstarter to contemplate standing up and declaring that Russian mobsters made off with 100,000 of your customers’ credit card numbers.

Yet the silence of the victimized has its own harmful consequences. Our ability to prevail over cybercrime depends to a high degree on building a fund of information about the ways in which such attacks are carried out. To discern meaningful intrusion patterns, we need to collect enough data for similarities to emerge. What are the attacks targeting? How many carry a payload? And what are the signature attributes of perpetrators engaged in various types of attacks? The sooner revealing data is gathered, the sooner those signatures will become apparent.

This is Crime-Fighting 101.

I recently discussed the problem with Paul Maeder, the managing general partner at Highland Capital Partners, a venture investment firm based in Lexington, Mass., who pays close attention to the security space. The challenge, he says, is to take the analysis of network breaches up a couple of notches, to the point where “intentions can be inferred.” That’s a tall order requiring an array of highly granular raw data. That kind of depth depends on forensic information-gathering and candid disclosure of the findings.

Until there’s a trusted and widely used mechanism for information-sharing about all manner of anomalous cyberactivity, efforts to defend against potentially calamitous attacks will be hampered, leaving everyone more vulnerable.

How do we provide incentives for the necessary level of cooperation? Maeder cites a potentially useful model that businesses should consider emulating: the nearly 30-year-old alliance between the FAA and NASA. Behaving as a trusted third party, NASA collects incident reports from pilots, flight attendants, air-traffic controllers, mechanics and ground personnel who were either involved in or witnessed situations affecting air travel safety. Incident reports are filed voluntarily through the Aviation Safety Reporting System (ASRS). As an incentive, the FAA grants limited immunity to those who file reports, waiving fines and other penalties for “unintentional violations” reported through ASRS. Those who do not file reports within 10 days of an incident will not enjoy immunity from the consequences of ensuing investigations.

The government-granted waiver to provisions of the Freedom of Information Act (FOIA), allowing companies to share information about security incidents without it becoming a matter of public record, is a step in the right direction. But there still is no widely trusted collection point for information about security breaches. The CERT Coordination Center at Carnegie Mellon University admirably fulfills part, but not all, of that task because it lacks any enforcement teeth. With ASRS, on the other hand, failure to disclose the particulars of an aviation misadventure could ensnare individuals in an FAA inquiry, leading to potentially unpleasant consequences. If, says Maeder, legislation established CERT as the trusted third party for vulnerability reporting and specified consequences for failing to report, that would help. But, he adds, “my guess is that such legislation would only pass if cybersecurity truly became a national security issue, which it won’t if we keep sweeping major attacks under the rug.”

At this point, the rug is lumpy with important sweepings. We need to shed some light. And while we’re at it, that single point of data collection should aim to integrate cyber- and physical-security vulnerabilities under one roof. Then we’ll make real progress.