
Security experts push back at ‘Cyber Pearl Harbor’ warning

Nov 07, 2012
Critical Infrastructure | Cybercrime

The only effective defense is to 'build security in' from the ground up, critics say in response to DoD, DHS comments

The nation’s top national security leaders have convinced President Obama and much of the leadership in Congress that the U.S. is at risk of a “Cyber Pearl Harbor” or “Digital 9/11” if it does not take drastic measures to improve both defensive and offensive cybersecurity capabilities against hostile nation states.

But the leaders, Defense (DoD) Secretary Leon Panetta and Homeland Security (DHS) Secretary Janet Napolitano, have not convinced every expert in the cybersecurity community, and the push-back from some of them is growing increasingly vocal.

Critics argue that not only is the threat of a catastrophic cyberattack greatly exaggerated, but that the best way to guard against the real risks they acknowledge is not better firewalls or offensive strikes against potential attackers, but to “build security in” to the control systems that run the nation’s critical infrastructure.

Bruce Schneier, author, chief security technology officer at BT and frequently described as a security “guru,” has not backed off the contention he made at a debate two years ago that the cyberwar threat “has been greatly exaggerated.” He said that while a major attack would be disruptive, it would not come close to being an existential threat to the U.S.

“This [damage] is at the margins,” he said, adding that even using the term “war” is just “a neat way of phrasing it to get people’s attention. The threats and vulnerabilities are real, but they are not war threats.”

[See also: Following Sandy, DHS seeks security ‘Cyber Reserve’]

Gary McGraw, CTO of Cigital, recently argued that while existing control systems are “riddled with security vulnerabilities” since they are outdated and were not designed with security in mind, trying to protect them with a preemptive attack against a perceived threat would be both dangerous and fruitless.

McGraw, who has been preaching the “build-security-in” mantra for years, is highly skeptical of claims that government is now much better at “attribution” — knowing exactly who launched an attack.

“If they have solved it, they need to tell us hard-core security people how they did it, because we don’t really believe them,” he said, noting that a major retaliation against a party that didn’t launch an attack could be more catastrophic than the initial attack. “Proactive defense,” eliminating the vulnerabilities in the control systems themselves, is a much better approach, McGraw argues.

Besides the attribution problem, McGraw wrote that cyber-offense capabilities of an adversary are unlikely to be knocked out by an attack. Quoting estimates from Ralph Langner, the security consultant credited with cracking the Stuxnet malware, he said that while it takes $90 billion to develop a nuclear submarine fleet, a cyberweapons program aimed at hardened military targets would cost more like $1 billion. And a single-use attack against critical infrastructure might cost as little as $5 million, he said. 

Creating such “cyber-rocks,” he said, is cheap. “Buying a cyber-rock is even cheaper since zero-day attacks exist on the open market for sale to the highest bidder.”

So it makes no sense to “unleash the cyber-rocks from inside of our glass houses since everyone can or will have cyber-rocks,” he wrote.

Besides Schneier and McGraw, Jacob Olcott, a principal at Good Harbor Consulting and former counsel to Sen. Jay Rockefeller (D-W.Va.) who served as lead negotiator on comprehensive cybersecurity legislation, pointed to a paper he authored in May that “suggests that owners and operators of critical infrastructure can achieve long-term cost savings and significantly reduce cyber risk by adopting secure development.”

Why isn’t that concept more persuasive to national security leaders in Washington?

Schneier has said for years, and said again this week, that cyberattack threats are “being grossly exaggerated for a reason” and “about money and power.”

“There is an enormous amount of money in government contracts, and the real money is in scaring people,” he said.

McGraw said that military leaders “are interested in offensive stuff because they think like the war fighters they are.” In his paper, he contends that offense is sexier than defense.

“One of the problems to overcome is that exploits are sexy and engineering is, well, not so sexy,” he wrote. “I’ve experienced this first hand with my own books. The black hat ‘bad-guy’ books, such as ‘Exploiting Software’ outsell the white hat ‘good-guy’ books like ‘Software Security’ by a ratio of 3:1.”

But Joel Harding, a retired military intelligence officer and information operations expert, said it may also be because not everybody in the security community agrees with the anti-offense view. “There is a giant chorus of cybersecurity experts clamoring for attention. It’s a cacophony of opinions,” he said. But he disagrees that defense alone is enough to defeat or even block an attacker.

“By its very nature, a zero-day exploit uses a vulnerability otherwise not defended against,” he said. “Until we have artificial intelligence that predicts the nature and type of future attacks and offers ways to block them, a defense is at risk.” But he does agree that attribution remains imperfect.

Olcott said the good news is that his and other voices are being heard in government. He points to a “Build Security In” page on the DHS website that advocates for building secure software, and even includes a citation of Schneier.

But Schneier said as long as “war” is the operative description, the hyperbole will continue and the response will be less effective. “When you use a war metaphor, a certain type of solution presents itself,” he said, “while a police metaphor brings a different type of solution.”

“Right now the dialogue is dominated by the DoD and the spooks,” McGraw said. “If you think about security as your hands, security engineering might be your right pinky finger: it’s big enough to be a finger, but not a huge part of cybersecurity.”

“What we really need to do is revisit security engineering,” he said.