One year later: security debt makes me WannaCry

May 18, 2018 | 5 mins
Critical Infrastructure | Passwords | Security

WannaCry rocked the world one year ago, but there are still lessons for us to unpack about the debt we still have to pay to be secure.


It is hard to believe that it’s already been a year since the WannaCry malicious software was released upon the world. Thousands of systems were crippled, and many victims took the option of paying the attackers for the key to decrypt their data: files that, for whatever reason, had not been backed up. All of this because SMB v1 was, and in many cases still is, in use today.

The carnage wrought on systems around the world, allegedly executed by the Lazarus Group, underscores not merely that the Shadow Brokers had released purloined NSA tools; it highlights a far more sinister aspect of enterprise security that has lingered in the dark. That problem is the burgeoning security debt that has accumulated over years of missed patches, derelict administrative-level accounts and accepted risks that have never been resolved.

The victims of this attack were not merely corporate systems with spreadsheets and projections cryptographically locked. Healthcare facilities were compromised as well when their own defenses failed to protect them.

One salient illustration of the severity of the situation is the surgeries that hospitals had to postpone. Imagine if someone had died as a result of these postponements. While criminal groups with various nefarious intentions consume the headlines with the latest leaked tools, we must look at how we collectively managed to arrive at this point.

Hurdling the deadline wall

Over years of scrambling to meet unachievable timelines, many organizations have seen projects side-step any sort of security oversight in order to get to market on time. It is just these sorts of unrealistic time-to-market pressures that have left us holding the proverbial bag of security issues that have accumulated over time. One overused practice is that of accepting the risk. This occurs when an organization deems it far less costly to make note of an issue and accept it as a known problem rather than to remedy the situation.

The problem here is that these accepted risks are seldom revisited once they have been noted. As a result, a problem that seemed manageable at one point in time can grow into something worse, especially when you consider that a vulnerability can be compounded by another vulnerability discovered later on, or through the simple application of the law of unintended consequences.
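To make revisiting accepted risks less of an afterthought, the review can be automated. Here is a minimal sketch of that idea; the register format, field names and risk IDs are all hypothetical, standing in for whatever GRC tool or spreadsheet an organization actually uses:

```python
from datetime import date, timedelta

# Hypothetical accepted-risk register: each entry records when the risk
# was signed off and when, if ever, it was last revisited.
RISK_REGISTER = [
    {"id": "RISK-014", "accepted": date(2016, 3, 1), "last_review": None},
    {"id": "RISK-101", "accepted": date(2017, 11, 20), "last_review": date(2018, 2, 5)},
    {"id": "RISK-133", "accepted": date(2018, 4, 2), "last_review": date(2018, 4, 2)},
]

def stale_risks(register, today, max_age_days=365):
    """Return IDs of risks not reviewed (or accepted) within max_age_days."""
    cutoff = today - timedelta(days=max_age_days)
    stale = []
    for risk in register:
        last_seen = risk["last_review"] or risk["accepted"]
        if last_seen < cutoff:
            stale.append(risk["id"])
    return stale

print(stale_risks(RISK_REGISTER, date(2018, 5, 18)))
```

Wired into a scheduled job that opens a ticket for each stale entry, even something this simple keeps year-old acceptances from slipping into the mists of time.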

The end of “open sesame” passwords

In addition to risks that may have been signed off on, we have the issue of administrative-level accounts that have become unused or unnecessary with the passage of time. These accounts may have easily guessed passwords, to name just one scenario. Even worse is the default admin account. One organization that I did some work for many years ago had default administrative-level credentials on every system on the network. To make matters worse, this was one of the worst-kept secrets within the organization.

This gives rise to a discussion on the need for multi-factor authentication as a logical replacement for static passwords. If you have a web-delivered application, there is no solid rationale for not having at least two-factor authentication, or a multi-factor authentication gateway to marshal access to it. But that’s a longer conversation.
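For the curious, the codes that most authenticator apps generate are small enough to sketch in a few lines. This is a minimal illustration of the time-based one-time password (TOTP) algorithm from RFC 6238 using only the Python standard library; it is meant to show why rotating codes beat static passwords, not to serve as a production authenticator:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, timestep=30, digits=6, at=None):
    """Minimal RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time step."""
    key = base64.b32decode(secret_b32)
    counter = int((at if at is not None else time.time()) // timestep)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", time=59 -> 94287082
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", digits=8, at=59))
```

Because the code is derived from the current time step, a captured value expires in seconds, which is exactly the property a static password lacks.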

Years ago I worked for a critical infrastructure company whose in-house control system vendor had a hard-coded password in their software. This was untenable for us as an operator; we could not in good conscience leave it in place knowing the complications that could arise if it were discovered by an adversary.

We put it to the vendor and asked them to fix the password issue. Their response was that it would cost hundreds of thousands of dollars to change the hard-coded password. This was simply amazing to us as a customer; the audacity of the vendor was unfathomable. Customers need to hold their suppliers’ feet to the fire so that problems like this do not persist.
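The fix a vendor should reach for is rarely exotic: stop baking the credential into the code at all. As one minimal sketch of the pattern, the secret can be pulled from the deployment environment at startup and the program can refuse to run without it. The variable name CONTROL_SYS_PASSWORD is hypothetical, and in practice a proper secrets manager or vault is a better source than a plain environment variable:

```python
import os

def get_control_system_password():
    """Fetch the credential from the environment instead of the source code.

    Failing loudly when the secret is absent is deliberate: silently
    falling back to a built-in default recreates the hard-coded password.
    """
    password = os.environ.get("CONTROL_SYS_PASSWORD")
    if not password:
        raise RuntimeError(
            "CONTROL_SYS_PASSWORD is not set; refusing to fall back to a default"
        )
    return password
```

The credential then lives in the operator's configuration, where it can be rotated without recompiling, which is precisely what a hard-coded value forbids.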

Installing the complacency patch

Now that we find ourselves contemplating the WannaCry debacle a year later, we have to tackle the security debt. We need to take a long, concerted look at the risks that have been accepted in the past and where they are on the path to being remedied. I’m willing to bet that little to nothing has been done with risks that were accepted more than a year ago. Anything older than that has a tendency to slip into the mists of time.

To put a fine point on the discussion, WannaCry is a problem that never needed to happen and we can do so much better than we have in this regard.


Dave Lewis has over two decades of industry experience. He has extensive experience in IT security operations and management. Currently, Dave is a Global Security Advocate for Akamai Technologies. He is the founder of the security site Liquidmatrix Security Digest and co-host of the Liquidmatrix podcast.

The opinions expressed in this blog are those of Dave Lewis and do not necessarily represent those of IDG Communications, Inc., its parent, subsidiary or affiliated companies.
