As the pace of technological change in cloud data centers accelerates, the list of endeavors undertaken there grows longer. A sampling of that list includes the Internet of Things, innovation, big data, mobile and social access, and SDN/NFV.
Change is always risky. Fast change is even riskier, leaving less time for change management and checks on changed technologies.
CSO presents a view into common risks associated with cloud data center change and the technological salve to safeguard data given these risks.
Common risks with cloud data center change
IoT devices are tiny; vendors who can’t find existing, well-tested OSs and software for them must custom-code their own, says Mat Gangwer, Information Security Analyst, Rook Security. “Attackers often reverse engineer the software to find vulnerabilities that stem from a lack of code control and QA,” explains Gangwer. That’s only one kind of IoT vulnerability.
The more IoT devices, device types, data types, and device-to-data-center traffic there are, the harder it is to govern which devices talk to which entities. “Due to the nature of the device sprawl, the many different OSs, the intricacies of IoT devices, and the sheer number of producers getting in the IoT game, it quickly becomes nearly impossible to maintain command and control of these assets,” says Dr. Chase Cunningham, Threat Intelligence Lead, Armor (formerly FireHost).
Enterprises are innovating more than IoT in cloud data centers. They often hire contractors who come into the cloud to develop application innovations. To do this, these coders set up development resources using VMs. That’s all good. Leaving those development systems up and running after the work is done is not: it creates unnecessary sprawl and an attack surface the enterprise may not know is there, open, and vulnerable. “Those machines sit there waiting to be told what to do, providing an avenue for someone to compromise and exploit the enterprise. There have been a few instances of that that have caused some pretty major breaches over the last few years,” says Cunningham.
Breaches also result from big data analytics, processing, and infrastructure technologies, which give attackers both a motive, in the form of intellectual property, and a means to enter and wrench it from the organization’s grasp. Big data tools like Hadoop and MapReduce are relatively new and carry additional dynamic ingress and egress points that cradle vulnerabilities hired-gun hackers know only too well. “There is no better resource to go after to gain a foothold and cause detriment to a company than this,” says Cunningham.
Speaking of ingress and egress, mobile and social cloud data access is a huge doorway into cloud data centers. With increasing instances of mobile malware online and attackers reaching in through holes in social media, mobile (and social) has become the rudder that can direct a boatload of corporate data into the wrong hands. “All it takes is one person in your environment to run malicious code while synced to their Google Drive with your company data in it and it’s game over, and all that without breaching your endpoint, without going through malware scanners, and without using phishing emails,” says Cunningham.
Mobile is not the only attack vector to make a lot of access available with minimal effort. SDN and NFV put a lot of network control in the hands of software. In these environments, the attacker need only modify some code to gain resources that turn the keys to the network kingdom, says Cunningham. “They can then route traffic anywhere they like, using a virtual router or switch to port off the entire data stream if they so desire,” says Cunningham.
Safeguarding data in this environment of change
To protect the burgeoning load of IoT devices and the data they create and transmit, the enterprise must first set strict control policies and methodologies for how it will build and implement the affected cloud resources, says Cunningham. To assess the change that results from this implementation, the enterprise must establish and then continually update a baseline inventory of its technologies. The enterprise can’t know what changed unless it knows what it started with. The company should catalog how each device communicates as well, whether via Bluetooth, for example. “If you don’t have a pulse on what your infrastructure looks like and you start scaling across cloud infrastructure, the battle for control is already lost,” says Cunningham.
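The baseline idea Cunningham describes can be sketched in a few lines: keep a saved inventory and diff the live one against it, so new, vanished, or altered assets surface immediately. The inventory format below (a device ID mapped to attributes such as its communication protocol) is an assumption for illustration, not a prescribed schema.

```python
# Minimal sketch of baseline-inventory diffing, assuming inventories
# are stored as JSON objects of the form {device_id: {attributes}}.
import json

def load_inventory(path):
    """Load a device inventory saved as JSON."""
    with open(path) as f:
        return json.load(f)

def diff_inventories(baseline, current):
    """Return device IDs added, removed, or changed since the baseline."""
    added = sorted(set(current) - set(baseline))
    removed = sorted(set(baseline) - set(current))
    changed = sorted(d for d in set(baseline) & set(current)
                     if baseline[d] != current[d])
    return {"added": added, "removed": removed, "changed": changed}
```

Anything in the "added" or "changed" buckets is exactly the change the baseline exists to expose: a device the enterprise did not know it started with.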
To prevent outsourced developers from creating a brand spanking new attack vector with every innovation, the enterprise should know what its infrastructure looks like before, during, and after such projects. That way, the organization can ensure that anything that should not remain after completion is shut down, turned off, or removed and that all openings created to enable the work, whether developer VMs or remote login credentials, are closed.
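That before/during/after comparison can be partly automated. The sketch below flags development VMs that have sat idle past a cutoff; the record fields (name, tag, last-seen time) are assumptions, and a real version would pull them from the cloud provider's API rather than a list.

```python
# Hypothetical sketch: flag dev VMs that outlived their project.
# The vm records and the "dev" tag convention are illustrative only.
from datetime import datetime, timedelta

def stale_dev_vms(vms, now, max_idle_days=14):
    """Return names of VMs tagged 'dev' that have been idle too long."""
    cutoff = now - timedelta(days=max_idle_days)
    return [vm["name"] for vm in vms
            if vm["tag"] == "dev" and vm["last_seen"] < cutoff]
```

A flagged VM is a candidate for the shutdown-or-remove review the paragraph above describes, along with any credentials issued for it.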
To close the doors to the golden city that is big data, enterprises should protect common points of vulnerability such as login screens and credentials by requiring two-factor authentication. Monitoring can also help. “Monitor those big data repositories for anomalous access,” says Cunningham.
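"Anomalous access" can start as something very simple. The sketch below flags repository reads from accounts outside an allow list or outside business hours; the log-record fields and the two rules are assumptions standing in for whatever a real monitoring pipeline would check.

```python
# Hedged sketch of anomalous-access flagging for a data repository.
# Log entries of the form {"user": ..., "hour": ...} are assumed.
def anomalous_accesses(log, allowed_users, work_hours=(8, 18)):
    """Return log entries from unknown accounts or off-hours reads."""
    start, end = work_hours
    return [entry for entry in log
            if entry["user"] not in allowed_users
            or not (start <= entry["hour"] < end)]
```

Entries this returns are not proof of a breach, only the "merits a second look" signal Cunningham is asking for.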
Because many big data technologies ship with openings intended out of the box for enterprise accessibility and convenience, locking those tools down appropriately takes some customization and making sure the configurations are set up correctly, says Gangwer.
Further, big data takes advantage of cloud data replication, moving data from one data center location to the next. “This begs the question whether that data is encrypted in transit,” says Gangwer. It should be. If it’s medical data, it must be.
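One way to make "it should be encrypted" enforceable on the client side is to refuse anything but modern TLS before replication traffic leaves the building. The sketch below builds a hardened TLS context with Python's standard `ssl` module; the minimum-version choice is an illustrative policy, not a mandate from the article.

```python
# Hedged sketch: a client-side TLS context for data-in-transit that
# verifies the peer and refuses pre-TLS-1.2 protocol versions.
import ssl

def replication_tls_context():
    """Build a TLS context suitable for replication traffic."""
    ctx = ssl.create_default_context()            # verifies peer certs
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # no SSLv3/TLS 1.0/1.1
    ctx.check_hostname = True                     # names must match certs
    return ctx
```

Wrapping the replication connection in this context means a misconfigured endpoint that cannot negotiate TLS 1.2 or present a valid certificate simply fails to connect, rather than silently moving medical or other regulated data in the clear.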
Regardless of how attackers get in, such as through mobile and social access, they have to get data out for it to be of use to them. Data Loss Prevention tools can help here. “A really beneficial technology combination is dedicated partitioning for mobile devices—using corporate repositories or containers—and requiring two-factor authentication to gain access to the corporate data side,” says Cunningham.
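The core of a DLP tool is recognizing sensitive-looking data on its way out. The sketch below scans outbound text for record-shaped patterns; the two regular expressions (SSN-like and card-number-like) are illustrative stand-ins for whatever patterns an enterprise's own DLP policy defines.

```python
# Hedged sketch of DLP-style outbound scanning. The pattern names
# and regexes are illustrative, not a complete or production rule set.
import re

PATTERNS = {
    "card_like": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
    "ssn_like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def dlp_findings(text):
    """Return the names of sensitive-data patterns found in the text."""
    return sorted(name for name, rx in PATTERNS.items() if rx.search(text))
```

A real DLP product adds context (who is sending, to where, in what volume) before blocking, but the pattern match is the gate that keeps the sync-to-Google-Drive scenario from being "game over."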
To keep vulnerabilities out of the open source software the enterprise selects for SDN or NFV, use only technologies that are well vetted and in broad use across the industry. “One bizarre piece of code or one bizarre piece of SDN technology that does something useful is not worth it if there isn’t a consensus that it is secure,” says Cunningham. The enterprise should also pen test these environments to ensure access is restricted.
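A first step in that pen testing is simply checking which management ports answer from outside. The sketch below is a crude TCP reachability probe of the kind pen-test tooling automates; the host and port list would come from the enterprise's own inventory, and it should only ever be run against systems you are authorized to test.

```python
# Hedged sketch: probe a host for TCP ports that accept connections,
# the simplest check that SDN/NFV management access is restricted.
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of ports accepting TCP connections on host."""
    reachable = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                reachable.append(port)
        except OSError:
            pass  # refused, filtered, or timed out: treat as closed
    return reachable
```

Any controller or virtual-switch management port this finds reachable from an untrusted network segment is exactly the route-traffic-anywhere risk Cunningham describes.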