Two of the hottest IT technologies of 2010 are virtualization and cloud computing. Both are heavily evangelized in the industry as the "wave of the future" and the "next big thing," primarily due to their perceived promise of reduced hardware, software licensing and maintenance costs. To a large extent, these claims have merit. But it is easy to get caught up in the hype of these new technologies while remaining oblivious to the myriad operational and security challenges of making them work.
Just how hot is cloud computing? 2010 had barely started when HP and Microsoft announced a $250 million partnership to develop integrated data center products that HP will offer as the HP Private Cloud.
In other major cloud news, Microsoft announced the addition of an OS versioning feature to its recently released Windows Azure platform-as-a-service offering. The feature was needed after Azure users complained that patches and upgrades unexpectedly affected the operating systems running under Azure.
Historically, many organizations have gotten caught up in the excitement and hype surrounding the latest technologies out of a fascination with all things "new and improved." In doing so, they can easily lose sight of the risk implications of quickly and indiscriminately embracing new technologies without first performing the requisite due diligence, including, at the very least, a formal risk assessment.
The concept of virtualized computing is deep-rooted in the halcyon days of mainframe computing. Mainframes were then, and still are, expensive to install and maintain. An enterprise fortunate enough to afford mainframes also had to ensure the logical separation of the computing resources and data assets of the various, and sometimes competing, business customers paying hefty sums to use them.
Out of this was born the concept of the logical partition, or LPAR, conceived and secured to provide a dedicated virtual environment from which those customers could address their critical business computing requirements. An LPAR was simply an early abstraction, similar to what we know today as, for example, Citrix OS virtualization. An LPAR is a subset of a mainframe's hardware resources, virtualized as a separate computer; in effect, a single physical machine can be partitioned into multiple LPARs, each housing a separate operating system.
The overall objective of this virtualization is to protect data and technology assets from unauthorized access and exposure, as well as other risk factors. Then, as now, those who install, support and maintain such systems need to ensure that sufficient security controls exist to properly protect critical information assets.
Server virtualization technology is here to stay; Gartner predicts that by 2012, more than 85% of enterprises will be using server virtualization extensively in production environments. But even though virtualization offers faster server provisioning, better hardware utilization and lower disaster recovery costs, there is a downside of which many organizations are unaware: because virtualized environments are more complex to secure than their physical counterparts, achieving regulatory compliance in them can become significantly more difficult if that complexity is not dealt with accordingly.
Today's IT environments are already straining under many burdens, not the least of which is properly addressing regulatory and industry compliance requirements for protecting critical data assets. The overall objective of most regulatory requirements is to reduce the risk to computing resources and assets to an acceptable, and ideally minimal, level.
Cloud computing and virtualization are no different; the two are often tightly coupled within production environments and frequently used in parallel. As such, it is no great stretch to consider cloud computing as virtualization on a grand scale. In fact, cloud computing is, for the most part, a large-scale implementation of virtualization technologies: cloud service providers use virtualization to deliver dedicated virtual services and, more importantly, to scale them to meet growing customer needs.
The PCI Data Security Standard (DSS) is one such compliance requirement, driven by industry factors. When it comes to PCI DSS compliance of virtualized information technologies, many PCI Qualified Security Assessors (QSAs) are struggling to properly interpret the implications of the standard. While the PCI DSS is periodically updated to keep pace with the dynamics of today's IT environments, its current version contains no specific references to virtualization or cloud computing. The objective of this article is to bring some clarity to the matter.
Virtualization definitions
Part of the current challenge for QSAs and others is simply arriving at agreed-upon definitions of the specific terms.
Wikipedia defines virtualization in a fairly straightforward manner as the abstraction of computer resources.
The NIST definition of cloud computing, which is decidedly impartial and a bit more involved, describes it as a "model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics, three service models and four deployment models."
PCI compliance and virtualization
As of March 2010, version 1.2.1 is the most current revision of the PCI DSS, and one will not find the terms cloud computing or virtualization anywhere in it. Both areas are being actively addressed by the PCI Security Standards Council via various working groups. But while the council released the PCI DSS Wireless Guidelines in July 2009, directives around cloud computing and virtualization are not scheduled for release until 2011.
In fact, in January, PCI Security Standards Council general manager Bob Russo said the next revision of the PCI DSS, due in October 2010, will contain clarifications but no major changes to the standard. QSAs will therefore have to wait until 2011 for specific guidance on cloud computing and virtualization. The authors strongly urge the PCI Council to accelerate the process of documenting PCI cloud computing and virtualization requirements rather than wait until 2011. The longer this vacuum of PCI compliance ambiguity persists, the more difficult it becomes to secure these technologies.
The job of a lawyer is to apply the law to the facts. While the current PCI DSS makes no mention of virtualization or cloud computing, it does establish the core concepts of securing cardholder information. With that, there is no reason the 12 PCI requirements can't be applied to cloud computing and virtualization. Even in the absence of specific PCI directives, enterprises utilizing virtualization should still be able to provide adequate security by ensuring that any virtualization and cloud computing initiatives map to the overall enterprise security framework.
Because the standard makes no mention of virtualization, some people, including many QSAs, have misinterpreted its silence to mean that virtualization is incompatible with PCI DSS compliance. This has led to heated discussions in some organizations over how they can deploy the technology and still be PCI DSS compliant. For example, PCI requirement 2.2.1 states that an entity must implement only one primary function per server.
The intent of requirement 2.2.1 is clear, yet some take a letter-of-the-law approach and misunderstand it to mean that virtualization and PCI are incompatible. Their reasoning is that the hypervisor, the piece of software that allows multiple operating systems to run on the same computer, has multiple virtual machines under its control; some therefore hold that a hypervisor is a direct contradiction of 2.2.1. The authors of this article, on the other hand, believe that PCI DSS compliance can be achieved within a virtualized environment.
For a full mapping of the specific PCI requirements against virtualization, check out the McAfee white paper How Virtualization Affects PCI DSS Part 1: Mapping PCI Requirements and Virtualization [PDF link].
Virtualization and PCI
Virtualization is a hot area in the electronic payment space due to its many benefits, some of which are:
- reliability
- cost-efficiency
- manageability
- scalability
But with those benefits comes complexity, as a virtualized environment can present non-trivial risk analysis challenges. The following are a number of the key issues around virtualization to check for when performing a PCI assessment (a brief automation sketch follows the list):
- Require a firewall at each Internet connection and between any DMZ and the internal hosts. Today's virtualized firewall technologies can be highly distributed, deployed as standalone entities or even host-based.
- Always change vendor-supplied defaults before installing a system on the network. This includes the hypervisor.
- Develop configuration standards for all system components, including baseline virtualized images.
- PCI requirement 2.2.1 requires that an organization implement only one primary function per server. In a virtualized environment, ensure that each functional VM is appropriately isolated, including memory and network resources.
- Protect cryptographic keys used for encryption of cardholder data against both disclosure and misuse; keys stored across VMs must be protected in accordance with PCI DSS 3.6.x.
- Deploy anti-virus software on all systems and across all VMs commonly affected by malicious software.
- Ensure that all system components and software have the latest vendor-supplied security patches installed.
- Install critical security patches within one month of release, including those for VM operating systems.
- Implement automated audit trails for all system components, including separate VMs.
- Synchronize all system clocks; ensure that NTP is properly distributed.
- Secure audit trails from individual hosts so they cannot be altered.
- Ensure segregation of data; review all controls supporting data segregation.
- Ensure segregation of applications; web-application server software should not co-exist on the same VM as database applications that store critical data.
- Ensure effective security controls; regularly test your controls for effectiveness via vulnerability assessment and penetration testing.
- Ensure that logging, monitoring, auditing and alerting functionality is enabled and validated.
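To make these checks concrete, here is a minimal sketch of how a few of them might be automated across a VM inventory. The inventory format, field names and dates are hypothetical assumptions; in practice this data would come from a CMDB or a hypervisor management API, and the output is a starting point for evidence gathering, not a compliance verdict.

```python
"""Minimal sketch: automating a few of the PCI checks above across a
VM inventory. All record fields and values are illustrative."""

from datetime import date

# Hypothetical inventory records, one per VM.
inventory = [
    {"name": "web-vm-01", "default_creds_changed": True,
     "antivirus_installed": True, "ntp_configured": True,
     "last_patched": date(2010, 2, 20), "primary_functions": ["web"]},
    {"name": "db-vm-01", "default_creds_changed": False,
     "antivirus_installed": True, "ntp_configured": False,
     "last_patched": date(2009, 11, 2), "primary_functions": ["db", "web"]},
]

MAX_PATCH_AGE_DAYS = 30  # PCI DSS 6.1: critical patches within one month


def audit(vm, today=date(2010, 3, 1)):
    """Return a list of findings for one VM record."""
    findings = []
    if not vm["default_creds_changed"]:
        findings.append("vendor-supplied defaults unchanged (req. 2.1)")
    if not vm["antivirus_installed"]:
        findings.append("no anti-virus deployed (req. 5.1)")
    if not vm["ntp_configured"]:
        findings.append("clock not synchronized via NTP (req. 10.4)")
    if (today - vm["last_patched"]).days > MAX_PATCH_AGE_DAYS:
        findings.append("security patches older than one month (req. 6.1)")
    if len(vm["primary_functions"]) > 1:
        findings.append("more than one primary function (req. 2.2.1)")
    return findings


for vm in inventory:
    for finding in audit(vm):
        print(f"{vm['name']}: {finding}")
```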
Virtualization security is essential. Gartner predicts that through 2012, 60% of virtualized servers will be less secure than their physical counterparts, and that 30% of virtualized servers will be associated with a security incident. Gartner also notes that, as with their physical counterparts, most security vulnerabilities will be introduced through misconfiguration and mismanagement. The security issues related to vulnerability and configuration management get worse, not better, when virtualized.
At the end-user level, most users access PCI data via their desktops. Each of these desktops has an operating system that must be managed and patched, along with applications and office productivity software. Finally, the local environment may also be used to store data.
With desktop virtualization, the operating system and applications reside on a server, most often in a data center, and users connect to these virtual desktop environments via a network-based thin client. Perhaps one of the greatest privacy benefits of desktop virtualization is that sensitive cardholder data is far less likely to be stored on the desktop.
Virtualization and security, like cloud computing and security, is a dynamic area. An excellent reference to start with is the SANS Whitepaper Top Virtualization Security Mistakes (and How to Avoid Them) [PDF link].
Desktop virtualization supports PCI by moving data off the desktop. This specifically makes compliance with PCI DSS requirements 3 (Protect stored cardholder data) and 9 (Restrict physical access to cardholder data) much easier.
DSS requirement 3.1 requires that entities keep cardholder data storage to a minimum. Desktop virtualization makes this easier, as all data is offloaded to the server or a secured SAN.
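One way to verify that desktops no longer hold cardholder data after such a move is a data-discovery scan. The following is a minimal sketch, not a substitute for a commercial discovery tool: it walks a directory tree, flags card-number-like digit runs and uses the Luhn checksum to weed out random digits. The scan root and plain-text file handling are illustrative assumptions; a real tool would also cover binary formats, archives and slack space.

```python
"""Minimal sketch of a cardholder-data discovery scan for desktops."""

import os
import re

# 13-16 digits, optionally separated by spaces or hyphens.
PAN_CANDIDATE = re.compile(r"\b(?:\d[ -]?){13,16}\b")


def luhn_ok(digits: str) -> bool:
    """Luhn checksum, used to weed out random digit runs."""
    total, parity = 0, len(digits) % 2
    for i, ch in enumerate(digits):
        d = int(ch)
        if i % 2 == parity:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0


def scan(root: str):
    """Yield (path, digits) pairs for likely PANs under root."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue
            for match in PAN_CANDIDATE.finditer(text):
                digits = re.sub(r"[ -]", "", match.group())
                if 13 <= len(digits) <= 16 and luhn_ok(digits):
                    yield path, digits


if __name__ == "__main__":
    # Scan root is an assumption; report only the last four digits
    # so the findings report itself holds no full PANs.
    for path, pan in scan("/home"):
        print(f"{path}: possible PAN ending in {pan[-4:]}")
```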
While PCI DSS lacks specifics around virtualization, a virtualized environment can be audited like any other environment. The key to this is to ask a lot of detailed questions during the PCI assessment. Some of the issues (which must be verified by documentation and processes) include:
- Proof that the environments and data are properly segregated (perform robust datacenter physical and logical security assessments)
- Access permitted only to authorized persons
- Separation of duties
- Configuration management standards
- Logging / auditing (see the tamper-evidence sketch after this list)
- Patching / vulnerability management
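On the logging and auditing point, one property worth probing for is tamper evidence, since PCI requires that audit trails be secured so they cannot be altered. Below is a minimal sketch of one way to get that property, assuming a properly managed secret key (the hard-coded key shown is purely illustrative): each record's MAC covers the previous record's MAC, so altering any record breaks verification of the entire chain.

```python
"""Minimal sketch of a tamper-evident, HMAC-chained audit trail."""

import hashlib
import hmac

KEY = b"replace-with-a-managed-secret"  # assumption: key comes from a KMS


def append_record(chain, message: str):
    """Append a record whose MAC covers the previous record's MAC."""
    prev_mac = chain[-1][1] if chain else b""
    mac = hmac.new(KEY, prev_mac + message.encode(), hashlib.sha256).digest()
    chain.append((message, mac))


def verify(chain) -> bool:
    """Recompute the chain; False means some record was altered."""
    prev_mac = b""
    for message, mac in chain:
        expected = hmac.new(KEY, prev_mac + message.encode(),
                            hashlib.sha256).digest()
        if not hmac.compare_digest(expected, mac):
            return False
        prev_mac = mac
    return True


log = []
append_record(log, "user alice viewed cardholder record 1234")
append_record(log, "user bob exported report Q1")
assert verify(log)
log[0] = ("user alice viewed nothing", log[0][1])  # tamper with a record
assert not verify(log)
```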