"The unleashed power of the atom has changed everything save our modes of thinking and we thus drift toward unparalleled catastrophe." Albert Einstein"The unleashed power of the Internet has changed everything, and presented us with an unparalleled opportunity." Martin HellmanRecently, on a glorious afternoon, under an azure blue sky, I drove from my office at the Carnegie Mellon Silicon Valley campus in NASA Research Park up to Stanford University to have a discussion about the great challenges of our time with Martin Hellman, Professor Emeritus of Electrical Engineering and co-author of the legendary Diffie-Hellman key exchange (which opened up the door to the world of public key cryptography). Hellman had seized my attention during the annual Cryptographer's Panel at the 2009 RSA Conference earlier this year. In the midst of a discussion about "Cloud computing," with fellow luminaries, Whit Diffie, Ron Rivest, Adi Shamir and Bruce Schneier, Hellman started talking about the dangers of a very different type of cloud, i.e., the mushroom cloud. Here is my blog post from that panel session: Hellman asks, "How risky is nuclear deterrence?" "Thousands of times riskier since my analysis shows it's on the order of 1,000 to 10,000 times riskier," he posits. He encourages the audience to do a Google search on "Hellman cryptography nuclear" to drill down into his current work, and also gave out the URL for his site, nuclearisk.orgCyLab CyBlog, 4-21-09)He characterized the human race as possessing the physical powers of a god with the psyche of a 16 yr old boy. If we do not "grow up really fast and pay attention to risks before they become obvious," we face calamity beyond comprehension."Trial and error are not enough, we have to rely on forecasting ability."Hellman drew from the example of the current global financial crisis. There were repeated warnings about derivatives, he recounted; Sen. 
Byron Dorgan (D-ND) in 1994, Brooksley Born of the CFTC in 1998, and Warren Buffett, who sounded the alarm about "financial weapons of mass destruction" in 2002.

Society, Hellman noted, never seems able to recognize risks until it is too late, and he cited nuclear weapons proliferation, the economic crisis and data security as prime examples. "We risk being called Cassandras," he acknowledged, but he exhorted the audience not to be dissuaded by this inevitability, because "Cassandra was always right."

On my way to the interview, I stopped by the Rodin Sculpture Garden at the university's Cantor Arts Center to stand before the great artist's Gates of Hell, and allow my mind to move above the writhing bodies which rise and fall like tumultuous waves within the masterpiece's imposing bronze frame. As I stood there, a factoid over a decade old bubbled up in my psyche. In 1997, I had come across an item in Peter Neumann's invaluable Risks Digest, quoting a San Francisco Examiner story about Caging the Nuclear Genie, a new book from Admiral Stansfield Turner. In the book, Admiral Turner described an incident that happened in the pre-dawn darkness on June 3, 1980, while he was serving as President Jimmy Carter's CIA Director.

"Colonel William Odom alerted Zbigniew Brzezinski at 2:26 a.m. that the warning system was predicting a 220-missile nuclear attack on the U.S. It was revised shortly thereafter to be an all-out attack of 2200 missiles. Just before Brzezinski was about to wake up the President, it was learned that the 'attack' was an illusion, which Turner says was caused by 'a computer error in the system.'" (Risks Digest, Vol. 19, Issue 43, 10-29-97)

Turner went on to say that this incident was not the only one. "We have had thousands of false alarms of impending missile attacks on the United States," he wrote, "and a few could have spun out of control." (ibid.)
Turner's anecdote (the term doesn't really seem appropriate for something of such significance) and his admonition that this particular false alarm was not a once-in-a-lifetime event have stayed with me over the years. It offers one of those extraordinary teaching moments for your audience. I reference the story when trying to raise the consciousness of those who pooh-pooh the threat of cyber-terrorism. If such a computer malfunction could happen accidentally, I argue, even if such incidents are rare, then it could also be generated intentionally. Either way, the consequences are unthinkable.

We are, in a way, the victims of our own good fortune. We have been lulled into distraction by a dangerously disarming miracle. The "Cold War" came and went, and because the threat of nuclear annihilation had become inextricably bound up with it, when the "Cold War" ended, the unconscious assumption was that the worst of the nuclear threat ended with it. Nothing could be more untrue. Indeed, the polarized geopolitical structure of the "Cold War" allowed for clarity and insanity: clarity about the consequences, and about what needed to be done (or not done) to help avoid those consequences, and insanity in regard to even considering the use of nuclear weapons as an option. Having missed the opportunity that presented itself around the Millennium, the post-"Cold War" world offers even more insanity (i.e., at least three new nuclear-armed nations: Pakistan, India and North Korea) but none of the clarity.

In a CSO Magazine piece last year, I offered "A Corporate Security Strategy for Coping with the Climate Crisis," because I feel strongly that no security, intelligence or military professional can properly assess risks and threats without not only placing climate change at or near the top of the list of risks and threats, but also factoring in its impact on all the other risks and threats spread out along the high end of the spectrum.
Likewise, I cannot, in good conscience, write about risks and threats to organizations, communities, nations or the planet as a whole without addressing nuclear war in the same way. Reasonable people might ask, "What is the point? What could any of us do?" "This issue has nothing to do with my professional life," you might say. Well, there is something that anyone anywhere can do, and it is both meaningful and powerful; and I argue that it is something that risk, security and intelligence professionals can do with much more persuasiveness and gravitas than others: speak out, educate, raise awareness within the boardroom and throughout the workforce. When the populace itself moves forcefully on an issue, government and industry fall into line. There is a great secret in such profound change: it begins one on one, from mind to mind and hand to hand. As with many other risk and security challenges, awareness and education are more than simply vital elements of any real solution; they are the magic ingredients.

I wanted to talk to Hellman, you see, because he is swinging for another fool's home run. "My wife started studying Tarot because she was afraid of it. The church, of course, likened Tarot to witchcraft. And even though we are modern people, she had picked up those prejudices. She said, 'I had a fear of it, so I felt I had to learn what it was.' So she did a reading for me, and I ended up being the 'Fool.' And my first reaction was, 'I am a Stanford professor, I am a smart guy, I have won all of these awards.' But then she pointed out to me the positive aspects of the 'Fool': he goes where no one else has gone, with one foot on the ground, and the other stepping off the cliff.
My whole life I have been fundamentally a fool, which is often very wise, because you go against conventional wisdom, which is often wrong. And because as a kid I suffered for that, at a conscious level I denied it, but at an unconscious level I had actually reveled in it. It made me who I am today."

As Hellman further elucidated his thinking around the "fool's home run," I kept thinking back to a baseball player named Dave Kingman. In his 16 years in the Major Leagues, Kingman struck out 1,816 times in 6,677 at bats. But he hit 442 home runs, and walked 608 times (mostly because people were pitching around him). Every time Kingman stepped to the plate, you expected to see a home run. The strikeouts didn't matter. The low batting average didn't matter. Once you had seen one of his monstrous home runs (they often left the stadium completely, in a long, high arc), you just wanted to see another.

"Fool's home runs don't come often. You swing at a lot of wild pitches, and you have to be foolish enough, after you have swung at ten or twenty of these pitches and each time ended up with egg on your face, to get just as excited about swinging at the tenth or the twentieth one, because if you are not excited you have no hope of taking it to its conclusion; and yet a priori, when you are confronted with it, it looks no better than all those that went nowhere."

He identifies two fool's home runs he has hit in his life so far. The first of them led to the birth of public-key cryptography. "When I first started working in cryptography in a serious way, around 1970, my colleagues uniformly told me I was crazy, foolish, to do so. 'The National Security Agency has a huge budget (we didn't know how big it was in those days, but it was a multi-billion dollar budget) and had been working on it for a decade, even in 1970; how can you hope to discover something they do not already know?' The second argument was, 'If you do anything good, they will classify it.'
I had an answer to the first argument. I said, 'I don't care what they know. It is not available for commercial exploitation. Also, it is well established who gets credit for discovering something: it is the first to publish, not the first to discover and keep secret.' Both arguments were valid. It was foolish to work on cryptography in 1970, and yet, in hindsight, you would have to say it was very wise to be foolish."

The second involved the great push toward nuclear disarmament and world peace at the end of the "Cold War." "My wife and I had become involved with a group working on the nuclear weapons issue, which was in sharp focus at the time. There was a palpable concern about the world going up in smoke. ... My wife and I had the privilege of having some very deep relationships with Russian information theorists, who had been here on exchange visits, and I had gone there in 1973 and 1976. We had very honest political discussions. So in 1984 we went to the Soviet Union to try to get a dialogue going between the scientific communities on new equations for survival in the nuclear age. We knew it would be impossible to have those discussions at a public level. So it was again a foolish thing to do. And yet, there must have been some guidance from a guardian angel or a muse to send us on this mission. A year later, in 1985, Gorbachev came to power, which I never would have predicted, and at first he did not seem that amazing, but within a year he had lifted censorship and encouraged free debate. So then it made sense to do the project, but if we had waited until that point to start it, it would have taken two more years to build the trust, relationships and understanding. By starting two years earlier, when it made no sense, we were in the perfect position, and we were able to get the book out in six months' time, a book that called for radical change in our approach to national security. Gorbachev endorsed the book. We were in on history."
Now Hellman hopes he has a third "fool's home run" in him; he says his nuclear risk analysis project has "the same feeling" as the others. Here are some more highlights from our conversation.

Power: Let's start in the space of cyber security. What would you like to say about the role of cryptography and encryption in cyber security? Looking back over the last 15 years, what has it done for us? What has it not done for us? What are the lessons learned?

Hellman: "When Whit Diffie and I published New Directions in Cryptography in 1976 (I always talk in terms of Whit Diffie and me, because we were working together; Ralph Merkle was also working independently at Berkeley, and he was integrally involved in public key cryptography), we thought that widespread use of encryption was five years, or at the most ten years, away. It turns out we were wrong by a factor of 2 to 3 (or maybe 4). Visionaries see the future better than the average person, but we were too optimistic. ... We have been somewhat concerned with the limited gene pool in public key cryptography. When we developed public key cryptography, we thought there would be a wide range of choices for public key cryptosystems, just as there is a wide range of choices for conventional crypto. When they did the Advanced Encryption Standard (AES) call for algorithms, they got about 15 algorithms, and they could have had more; whereas in public key cryptography we had the Diffie-Hellman key exchange, ElGamal signatures and the RSA public key cryptosystem. ... That is a very limited DNA. In the progress of cryptanalysis, i.e., what we knew in 1976 versus what we knew in 1980 versus what we knew in 1990, there have been major advances made; none of which have actually broken these systems, but they have pushed the required key sizes upward.
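(An aside for readers who have not seen the exchange Hellman refers to: Diffie-Hellman lets two parties agree on a shared secret over a public channel. Below is a toy sketch of my own in Python; the Mersenne-prime modulus and small generator are chosen only for brevity and are far too weak for real use, where much larger, carefully chosen groups or elliptic curves are standard, which is exactly the key-size pressure he describes.)

```python
import secrets

# Toy Diffie-Hellman key exchange. The modulus is a Mersenne prime used
# only to keep the example short; it is NOT a recommended group.
p = 2**521 - 1   # prime modulus (toy choice)
g = 3            # public generator

a = secrets.randbelow(p - 2) + 2   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 2   # Bob's secret exponent

A = pow(g, a, p)   # Alice publishes g^a mod p
B = pow(g, b, p)   # Bob publishes g^b mod p

# Each side combines its own secret with the other's public value.
alice_key = pow(B, a, p)   # (g^b)^a mod p
bob_key = pow(A, b, p)     # (g^a)^b mod p

# Both arrive at g^(ab) mod p; an eavesdropper sees only p, g, A and B.
assert alice_key == bob_key
```

The security rests on the difficulty of recovering a from g^a mod p (the discrete logarithm problem), and it is advances against exactly that problem that have pushed key sizes upward.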
When I was giving lectures on this in the late 1970s, I would put up a slide and propose that the key size for RSA, if you wanted to be conservative, should be at least two thousand bits. And I pointed out that if you factored in one more advance you might need as much as ten thousand bits. Now, with the advances we have seen, one more could push us up beyond ten thousand bits. So elliptic curve needs to be looked at, but even with elliptic curve it is a more limited gene pool than we would like. It is potentially vulnerable."

Power: One might have assumed that laptops would be routinely encrypted by now.

Hellman: "Even more so back-up tapes. Since the 1970s, I have been saying that back-up tapes should be encrypted. They are only occasionally accessed. Encrypting them, if the key is stored someplace else, is not a problem. With a laptop, people have to have the key. But given what has happened, you would think that the cost-benefit trade-off of the trouble of entering a key would be well worth it."

Power: Is cost the limiting factor?

Hellman: "No, it is the adolescent behavior. It is the difficulty human beings have in contemplating a world different from what they have seen. Even though they have read about these laptops being stolen, it has not happened to them. ... Adults take responsibility for their actions; adolescents do not. 'I am going to go a hundred miles an hour and I am not going to kill myself.'"

Power: When I do executive briefings, or sessions for general audiences, I always start with a list of the top ten risks and threats. At the top of the list are nuclear proliferation and climate change, and at the bottom of the list is cyber security. I don't do it to imply that cyber risks and threats are not problems worthy of treatment as national and even global security issues, but only to level-set, to say, 'OK, we are talking about cyber security, it is a very important topic, but let's keep it in perspective.'
What is your thinking about the ranking of cyber security in the overall threat matrix? And about the resources and attention committed to it: are they commensurate, disproportionate, or inadequate?

Hellman: "The underlying problem for all of these [risks and threats] is this chasm between our technological power and our adolescent development. So there is an underlying theme to all of these which we need to get people to see, so that they recognize that they are not really dealing with ten different problems; they are dealing with one fundamental source of all these problems. Although it is also important to recognize that the list is not static. In 1976, when we published New Directions, automated teller machines had just come in, and the SWIFT network for transferring funds internationally had already come in, and in talks I gave I could see the potential for the day coming when buying a loaf of bread would be done with an electronic funds transfer; and, if that happened, even if someone did not steal all the money, if they just crashed the system and brought it down, our economy could come to a standstill. If you look at the pace of our dependency on computers and communications, and project out, it is going to move up on the list. If you think about all of the potential damage that could be done now (power plants, nuclear power plants, weapons systems), it is getting harder and harder to isolate these proprietary computer and communications networks from the Internet, and people are finding ways to tunnel in. Cyber security could become an existential threat as we become more and more wired."

Power: Tell us about your risk analysis project. What are you attempting with it?

Hellman: "I am on the advisory board of a start-up company.
I met the CFO at a holiday party, and said, 'Let me put it this way: if I told you that there was an uninsurable risk that your company faced, and there was a roughly 10% chance of it destroying your company in the next 10 years, and you could do something to reduce that risk, would you be interested?' Unless something becomes socially acceptable, it is very hard for organizations to do anything about it. To change policy, we have to get to 50% penetration. On nuclear weapons, we are probably at one-tenth of 1% in terms of really recognizing the risk. The most critical part is getting to somewhere around 5%. Getting half the population seems impossible; half the population is so entrenched in the current way of doing things. But five percent is much more do-able. It is not a magic number; it could be two percent or maybe as much as ten percent. But at somewhere around five percent, the average person comes in contact with one or two people a week who talk about the issue. The first time they hear it they will ask, 'Why should we be involved?' And underneath that is the unstated belief that if it were really a major problem, everybody would be talking about it."

Power: So it isn't denial?

Hellman: "It is denial, but it is mass denial. We are much more herd creatures than we would like to believe, myself included. I used to think that I was not susceptible to fashion. Bell bottoms were really cool in the 1970s, and now they look ridiculous. There is the same mentality with respect to what issues we pay attention to. Evolutionarily, it probably made sense. But the world is changing so fast now that we need to find ways to speed up the propagation of the ideas that are needed. The web can really help do that. In the 1980s, when I worked on nuclear weapons and I tried to get the public's attention, it took about two months from the time a new person became interested until the time that person could begin to propagate the idea.
That person had to be educated and become comfortable talking about it. There were books, but getting someone to buy a book is a huge threshold, versus being able to talk about it to someone. Today, if a person comes to my web site, or another one, and likes what they see, even if they do not understand it all, even if they have not yet integrated it, they can send an e-mail with that link (no one has to buy a book), and they can send that e-mail to a hundred people in a matter of minutes, and the time it takes to get to those people is a day or less, in terms of when they look at their e-mail. The propagation, both in the numbers you can reach and the speed at which you can reach them, has increased, literally, by orders of magnitude."

Power: What do you see as the greatest obstacle to overcome?

Hellman: "People can't get their minds around the numbers. They are mind-boggling. 'One death is a tragedy, but a million deaths are a statistic.' Even at the current levels, it's as if every other person in Palo Alto were a nuclear weapon. Imagine how you would feel if a nuclear power plant were built next to your house. Now imagine how you would feel if another one were built on the other side. And then another, and then another. And then more and more pop up until there are a thousand nuclear power plants surrounding your home. That is the minimal risk that my preliminary analysis indicates each and every one of us faces. People would be up in arms about having nuclear weapons in Palo Alto, and yet they are not worried about missiles that are six to eight thousand miles away, aimed at us, that could reach us in thirty minutes, or in submarines off our coast that could reach us in ten minutes. It is invisible. That's the problem. You have to make it visible. I came up with this idea as I was studying the economic collapse. People had been warning us, credible people, including Warren Buffett, who is supposed to be the 'Oracle of Omaha,' as early as 1994.
But even the 'Oracle' wasn't listened to."

Power: What does it mean to be a Cassandra? Your evoking of this myth resonates with my personal experience, and I am sure with the personal experience of other risk, security and intelligence professionals. What does the role demand of us? How does one cope with the role and the pushback it elicits?

Hellman: "Calling someone a Cassandra in our society has a highly negative connotation, because she is often portrayed as a madwoman. But the story is more complex. Apollo gave her the gift of prophecy but, when she spurned his advances, he cursed her so that no one would believe her. She warned the citizens of Troy: 'Don't bring the damn horse in here.' They ignored her."

You can learn more about Hellman's work at http://nuclearrisk.org.
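A closing note on the arithmetic behind figures like the "roughly 10% chance in the next 10 years" Hellman posed to the CFO: under the simplifying assumption (mine, for illustration, not his detailed model) that the risk has a constant annual probability, per-year and per-decade figures are related by simple compounding.

```python
# If a catastrophe has a constant annual probability p, the chance of it
# happening at least once in n years is 1 - (1 - p)**n. This constant-rate
# assumption is my own simplification for illustration; Hellman's actual
# analysis at nuclearrisk.org is more detailed.
def cumulative_risk(annual_p: float, years: int) -> float:
    return 1 - (1 - annual_p) ** years

# A roughly 1% annual risk compounds to about a 10% chance over a decade:
print(f"{cumulative_risk(0.01, 10):.1%}")  # prints "9.6%"
```

Seen this way, even a seemingly small annual probability of disaster becomes hard to dismiss once the horizon stretches over decades, which is precisely the intuition gap Hellman is trying to close.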