The National Security Agency (NSA) had a problem familiar to any enterprise IT executive: it was running out of space for hundreds of disparate relational databases containing everything from back-office information to intelligence on foreign interests. And it needed to consolidate those databases to make it easier for NSA analysts to do their jobs.
The NSA’s initial approach was to scale up capacity. But halfway through the process, the staff realized that simply increasing the scope of the network was not going to work. So, CIO Lonny Anderson convinced General Keith Alexander, who was then Director of the NSA and Commander of U.S. Cyber Command, to approve a move to the cloud.
Today, as the private cloud project continues to roll out, the agency is seeing the benefits. Tasks that once took analysts days now take minutes, costs have been reduced, and the management and protection of information has taken a huge step forward.
To learn about this effort, which dates back to 2009, Network World was invited to interview Anderson at NSA headquarters in Fort Meade, Md. He explained that the goal was to create an environment sufficiently large to handle the data repositories and to ensure that analysts would have the user-facing experience of one-stop-shopping that the cloud can provide.
He also pointed out that the NSA effort is part of a larger migration of U.S. intelligence agencies to the cloud. In 2011, sequestration forced the Department of Defense to absorb “huge budget cuts,” says Anderson.
The agencies “decided to economize by sharing IT services and thereby avoid a drastic slash,” says Anderson. The NSA, CIA, National Geospatial Intelligence Agency (NGA), National Reconnaissance Office (NRO), and Defense Intelligence Agency (DIA) divvied up the responsibilities, with NSA and CIA handling the cloud infrastructure; NGA and DIA taking on the desktop; and NRO focusing on network requirements and engineering services.
In addition to saving money, putting all intelligence community data in the same bucket is enhancing the speed, depth and efficacy of the agencies' work.
Inside the cloud
Anderson describes the private cloud as “an integrated set of open source and government developed services on commercial hardware that meets the specific operational and security needs of NSA and Intelligence Community (IC) mission partners. NSA is part of an Office of the Director of National Intelligence (ODNI) effort to migrate to a community cloud that brings together NSA’s cloud services with commercial cloud services at the classified level.”
While more details could not be obtained due to security restrictions, we learned that it is based on the same commodity hardware used by public cloud providers. It also uses open source products such as Apache Hadoop, Apache Accumulo, OpenStack, and “a variety of other tools, packages, and virtualization layers.”
It would not be a surprise to learn that the NSA’s private cloud resides within secure government facilities. Anderson says, “It saves space by combining and consolidating multiple independent services and systems. In addition, we take advantage of the economies of scale from commodity hardware and the continuous improvements by commercial markets to save space, power, and cooling; the same efficiencies used by the commercial public cloud services.”
Keeping the data secure
In the wake of WikiLeaks and the Snowden leaks, it’s important to understand what is being stored and how it’s being managed. More to the point, how can it be checked for legality?
Anderson says the NSA cloud does contain data the agency acquires and uses for its missions. He adds, “How we gather and use data is actually governed by strict legal authorities and subject to very rigorous oversight. That’s important to note because the NSA’s cloud architecture and data management structure greatly improves our ability to organize and analyze data and produce quality intelligence, but also makes it easier for us to track and enforce compliance with our legal responsibilities to protect privacy and civil liberties – something we have always taken very seriously. Also, aside from mission functions, the cloud is equally suited to enable improvements in other administrative and management functions for the agency.”
His point about compliance is especially important now, after the scrutiny of the past year. It is also a lesson for the private sector, where many companies have run into trouble by losing control of confidential information.
Due to the nature of the mission, the cloud components reside across a distributed architecture in multiple geographic areas. “We can’t discuss it all in detail,” the CIO says. “But we do utilize a variety of security protocols at every layer of the architecture, as well as a robust encryption strategy. The NSA cloud brings together multiple data sets and protects each piece of data through security and enforcement of the authorities that specify its use. We do this by marking each individual piece of data with a set of tags that dictate its security protections and usage. In addition to data markings, security is applied throughout the architecture at multiple layers to protect data, systems, and usage.”
This ability of the agency to track the activities of a piece of data is, as he explains, “all about tagging and provenance of both data and people. Our team has developed a way to tag data at the cell level and, accordingly, through PKI certificates, every person. For the file, it means being able to track what happens to it as long as it is in the system. For a person, it means more than what you do with a file, it also means what you are authorized to see.”
As a result, the agency can now track every instance of every individual accessing what is in some cases a single word or name in a file: when it arrived, who can access it, and who did access it, download it, copy it, print it, forward it, modify it, or delete it. In addition, if the data carries a legal requirement, such as a mandate that it be purged after five years, the system automatically flags it and tells NSA IT staff that it is ready to be purged.
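The mechanics described above can be illustrated with a minimal sketch. This is not the NSA's actual implementation, which is classified; it is a hypothetical Python model (all names and structures are illustrative) of the three ideas Anderson describes: tagging an individual data cell with security markings, auditing every access attempt against a reader's authorizations, and flagging the cell once its retention period expires. The all-tags-required check is similar in spirit to the column-visibility labels in Apache Accumulo, one of the open source tools the agency says it uses.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List, Tuple

@dataclass
class Cell:
    """A single tagged data cell (illustrative model, not a real NSA structure)."""
    value: str
    tags: frozenset              # security markings, e.g. {"TS", "SI"}
    ingested: datetime           # when the data arrived in the system
    retention_days: int          # legal retention period before purge
    audit_log: List[Tuple[datetime, str, str]] = field(default_factory=list)

    def read(self, user: str, authorizations: set, now: datetime) -> str:
        # The reader must hold every tag on the cell; both allowed and
        # denied attempts are recorded, so provenance covers who *tried*
        # to see the data as well as who succeeded.
        if not self.tags <= authorizations:
            self.audit_log.append((now, user, "DENIED"))
            raise PermissionError(f"{user} lacks required markings")
        self.audit_log.append((now, user, "READ"))
        return self.value

    def purge_due(self, now: datetime) -> bool:
        # True once the retention clock has run out, so IT staff
        # can be notified that the cell is ready to be purged.
        return now >= self.ingested + timedelta(days=self.retention_days)
```

In this toy model, a user authorized for a superset of the cell's tags reads it and leaves a "READ" entry in the log; a user missing any tag is refused and leaves a "DENIED" entry, giving the audit trail Anderson describes.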
“All of this I can do in the cloud but – in many cases – it cannot be done in the legacy systems, many of which were created before such advanced data provenance technology existed.” Had this capability been available at the time, it is unlikely that U.S. soldier Bradley Manning would have succeeded in obtaining classified documents in 2010.
Adapting to the changes
Anderson described the move to a cloud-based architecture as a major change at multiple levels. “Historically, a purpose-built database was needed to make use of individual data sets, forcing analysts to access many different databases and information repositories to do their job. Questions that spanned more than one data set had to be pulled together manually by the analyst. By putting all of the data into the cloud, analysts and analytic tools only need to interface with one system. Additionally, the granularity of control we get from tagging each piece of data makes it possible to bring data together that previously required separate databases to provide the necessary protections.”
Early on in the process, a problem arose: “Our analyst community came to us and said, ‘Here are the applications we use in our legacy relational databases. We just want to port them to the cloud’. Well, we found that simply porting programs and data from legacy systems straight to the cloud doesn’t always work. And, even if those applications would function properly in the new environment, they would likely underutilize the potential benefits because the cloud works differently than relational databases.”
As the agency’s cloud matured to the point of being a useable asset, Anderson noticed that analysts and developers – who work together to solve intelligence tasks – had a tendency to stick with their legacy systems. To get staff invested in the new system, the IT group created “Future Architecture Transition Tuesday,” or “FAT Tuesday.”
Every other Tuesday, they would take as many as 150 analyst-developer teams and tell them: “Today, you cannot use your legacy tools or repositories; you have to work exclusively in the cloud.” To support them as they struggled to accomplish their mission in the new environment, “Floor Walkers” – experienced analyst mentors – observed them as they worked and interceded as necessary.
When an analyst raised a hand, they stepped in to help. Anderson explained, “Initially, those were brutal. They found lots of problems but, over time, the teams got to the point where tasks that took hours or even days in the legacy systems took minutes or even seconds to do in the cloud.”
The NSA’s move to a private cloud is working on several fronts. The agency has moved much of its work from legacy systems to a cloud-based platform. The cloud’s efficiencies have reduced the danger of job-threatening budget cuts; office productivity – finding the bad guys out there – has improved; and cyber security has made a dramatic leap forward.
Dirk Smith is a freelance writer. He can be reached at firstname.lastname@example.org.
This story, "Exclusive: Inside the NSA’s private cloud" was originally published by Network World.