Data is the oil of the 21st century

The cybersecurity lessons of the 2016 election and the changing nature of our critical infrastructure.

For all its controversy and divisiveness, the 2016 U.S. presidential election provided a powerful lesson for securing the future of our critical infrastructures. And while the same threat actors who succeeded in 2016 would relish an opportunity to school us again on their malicious cyber activity and information influence campaigns, it’s both prudent and necessary for us to avoid the refresher course, especially with one likely being planned for the upcoming midterm elections.

When a crime is committed, there is a natural inclination to determine who did it. But effective cybersecurity is not, at its core, about who is attacking you—it’s about making strategic decisions to protect key assets. Understanding the threat and attributing cyberattacks is always helpful, but attribution is rarely timely or certain enough to act on operationally.

I was personally involved with intelligence and White House deliberations in 2016 about attributing and reacting to the calculated and highly coordinated state-sponsored attempts to influence our national election. That election illustrates a larger and more important point about the changing nature of critical infrastructure: we need to consider certain repositories of data as being among our most important national assets—as important as physical infrastructures like power plants and water systems—whose compromise or loss of integrity can cause grave harm to a government, country or organization.

Until very recently, securing a nation’s critical infrastructures was primarily seen as protecting physical things—dams, bridges, power plants and the like. Now, critical infrastructures include important virtual things that must be protected—the data that underpins the integrity of important government functions and public and private information systems.

Especially with voting technology—a diverse collection of devices and data that couldn’t be more central to representative democracy—the ability to cast doubt on a nationally sacred process, either through the election systems themselves or the information influence campaigns directed at voters, is disturbingly easy. There are more than 50 different election systems (i.e., each state, DC, etc.), with many more operated at the local level, that feed into a national result. To be successful, hackers don’t have to change vote counts or voter rolls. All they have to do is create enough confusion, and leave enough evidence of tampering, that election officials can’t be sure of the results.

That is a precarious position for the country to be in for one of its most sacred institutions, and why many smart people are working hard to minimize that risk in 2018. But the assault on the election system is not a lesson solely for governments; it reveals the importance of our digital infrastructure and the critical necessity of consequence-based engineering.

Thankfully, cybersecurity fundamentals offer an effective path forward.

Whether you are the CISO of an enterprise-size company or an individual subject matter expert, the principle of Consequence-Based Engineering (CBE) should be at the heart of all your cybersecurity planning. With CBE, you identify, up front, the bad consequences you want to avoid. These should be specific and appropriate for your organization—whether it is to prevent employees from accessing HR information, vendors from comparing each other’s terms, or hostile foreign governments from sowing mistrust with acts of disinformation.

Once determined, CISOs and their teams can then apply the appropriate measures to minimize the overall risk they find by mitigating the vulnerability, the threat or the bad consequence itself. There is a wide range of tactics to achieve this, but central to all good security is the practice of segmentation.

The strategy behind segmentation is based on a very simple truth: the network boundary is dead. Up until about five years ago, it was common to place defenses along the border of a network. But with the advent of wireless, mobility and the cloud, a boundary-based approach simply doesn’t work. And, even when it did, it created tremendous risk by making most access decisions binary: Should access be granted or denied? Moreover, when a hacker was able to scale that wall, they were able to achieve total access, with catastrophic results.

Rather than placing one security wall around everything, segmentation allows separate but aligned micro-segments throughout the network. It separates sensitive data (e.g., proprietary, financial, privacy, marketing, operational) and other critical data so that even if an organization’s first lines of defense are breached, the scope of that breach can be contained. It is a far more effective security strategy, one that attempts to prevent compromises yet assumes that inevitably, an attack will succeed and minimizes the compromise’s scope.

The strategic and technical implementation of agile segmentation, done at both the macro and micro levels, allows security teams to make much more nimble and effective decisions about which groups of data to protect—as well as who gets access and for how long. Further, there are a host of business benefits to this that align with the principle that cybersecurity should enable business and not hinder it. But even as a security measure alone, it is incredibly valuable.

And there’s even more good news about the adoption of this strategy: It’s well within the reach of both the smallest and largest of organizations, sometimes efficiently enabled by companies that seek to minimize the complexity of implementing it.

The process of CBE that leads to a strong segmentation architecture begins with doing an inventory of your organization’s major data groups, such as: company financial data; communications with the press; contract data; enterprise IT configurations; executive deliberations; IP (e.g., design, code repositories); legal; marketing; personnel; research; and security services data (e.g., information on customer networks). 
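As an illustration only, such an inventory can be captured in a simple machine-readable form so it can feed later segmentation decisions. The group names below mirror the list above; the sensitivity labels and the helper function are hypothetical, not part of any standard.

```python
# Hypothetical data-group inventory for CBE planning. Group names mirror
# the categories listed above; sensitivity labels are illustrative assumptions.
DATA_GROUPS = {
    "company_financials": "restricted",
    "press_communications": "internal",
    "contract_data": "restricted",
    "enterprise_it_configs": "restricted",
    "executive_deliberations": "restricted",
    "intellectual_property": "restricted",
    "legal": "restricted",
    "marketing": "internal",
    "personnel": "restricted",
    "research": "internal",
    "security_services": "restricted",
}

def groups_by_sensitivity(inventory):
    """Bucket data groups by sensitivity label, a natural first cut
    at deciding which groups belong in which segment."""
    buckets = {}
    for group, label in inventory.items():
        buckets.setdefault(label, []).append(group)
    return buckets
```

Even this minimal structure forces the up-front conversation CBE asks for: naming each data group explicitly and deciding how bad its compromise would be.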

Next, you assess how or if your network’s design keeps them separate from each other—this is where macro and micro segmentation come in—and if so, how easy it is for users to move freely between them (if users can easily jump from one to the other, so can hackers). If there is security in place to prevent that, how strong is it, and how have you determined that strength? Are there regular audits to detect breaches between data sets or snooping? What measures have been taken to identify and prevent a breach as it occurs?
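The audit question above—can users easily jump from one data group to another?—can be sketched as a simple check over an access matrix. Everything here is hypothetical: the user names, the segment names, and the one-segment default are assumptions for illustration, not a real policy engine.

```python
# Hypothetical access matrix: which segments each account can reach.
ACCESS = {
    "alice": {"marketing"},
    "bob": {"personnel", "company_financials"},
    "eve": {"marketing", "research", "legal"},
}

def flag_cross_segment_access(access, limit=1):
    """Return accounts that can reach more segments than the allowed limit.
    Each flagged account is a potential lateral-movement path: if a hacker
    compromises it, they inherit its reach across segments."""
    return sorted(user for user, segments in access.items()
                  if len(segments) > limit)
```

A periodic run of a check like this is one concrete form the "regular audits" above could take: any account whose reach spans segments is exactly the jump a hacker would exploit.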

Regardless of where one falls on the political spectrum, we all care about democracy. And we have all become painfully aware of how the integrity and availability of data are essential to free and fair elections. Now we need to bring that same national attention and urgency to protecting our companies’ and organizations’ data sets via consequence-based engineering techniques.

A hundred years ago we understood that there was no local security without the protection of oil fields. Fifty years ago we understood that there was no national security without safeguards for nuclear weapons. And today we now understand that, in our hyperconnected world, there is no global security without protecting our most valuable assets with modern segmentation techniques.

Cybercriminals and adversaries are relentless. Once they find a weakness, they exploit it with an almost compulsive intensity. They certainly don’t pat themselves on the back and call it a day after they are successful.

So, the question is not whether another attack will happen. It will. Instead, we must ask whether we have acted to ensure that when it happens, our most critical assets—not just physical things, but the data that underpins our data economy—are protected.
