Security remains one of the top three concerns for nearly every enterprise, whether it runs primarily on in-house data centers or has migrated to the cloud, and whether that cloud is fully public or a hybrid/multicloud environment. Application solution providers, hardware suppliers, and cloud providers have all gone to great lengths to enhance security.
But most past security efforts have centered on protecting data at rest or in transit through encryption. Indeed, encryption of data while it sits in a database, crosses a LAN/WAN, or moves through a 5G network is a key component of nearly every such system. Nearly every compute system, even the smartphone, has data encryption built in, enhanced by specialized compute engines built into the processor chips. But one area that has been relatively ignored is that all of this encryption can be defeated if a bad actor can reach the device hardware, whether through a malicious app or a side channel intrusion. Encrypted data has to be in the clear while it is being processed, and that is a real vulnerability. If an attacker can get to the machine's memory at that point, all of the data is available for easy viewing and copying. Eliminating this risk is the vision of confidential computing.
The Confidential Computing Consortium and standardization efforts
In 2019, the Linux Foundation launched the Confidential Computing Consortium. Its stated goal is to define standards for confidential computing as well as support and propagate the development of open-source confidential computing tools and frameworks. Members include Alibaba, AMD, Arm, Facebook, Fortanix, Google, Huawei, IBM (Red Hat), Intel, Microsoft, Oracle, Swisscom, Tencent, and VMware.
While several of these companies already have tools available, it's likely that in the future those will get rolled into a more comprehensive open source framework for confidential computing, given the Linux Foundation's involvement. The foundation has stated: "The Consortium is concentrating on the area of 'data in use,' with the confidentiality of 'data in transit' and 'data at rest' as outside the scope of the Consortium." Contributions to the Confidential Computing Consortium, according to its website, already include:
- Software Guard Extensions (Intel SGX) SDK, designed to help application developers protect select code and data from disclosure or modification at the hardware layer using protected enclaves in memory.
- Open Enclave SDK, an open source framework that allows developers to build trusted execution environment (TEE) applications using a single enclaving abstraction, so an application can be built once and run across multiple TEE architectures (a host-side sketch follows this list).
- Enarx, a project providing hardware independence for securing applications using TEEs.
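To make the enclaving abstraction concrete, here is a minimal host-side sketch in the style of the Open Enclave SDK's sample pattern. It assumes a hypothetical helloworld.edl interface that declares a single trusted function, enclave_helloworld(); the header helloworld_u.h and the factory function oe_create_helloworld_enclave() follow the SDK's code-generation naming convention for that file and are illustrative assumptions, not part of the consortium material above.

```c
// Host-side sketch of the Open Enclave SDK pattern.
// Assumes a hypothetical helloworld.edl declaring:  public void enclave_helloworld();
// oeedger8r generates helloworld_u.h/.c, including oe_create_helloworld_enclave().
#include <stdio.h>
#include <openenclave/host.h>
#include "helloworld_u.h"

int main(int argc, const char* argv[])
{
    oe_enclave_t* enclave = NULL;

    if (argc != 2)
    {
        fprintf(stderr, "Usage: %s <path-to-signed-enclave>\n", argv[0]);
        return 1;
    }

    // Load the signed, measured enclave image into a hardware-backed TEE.
    oe_result_t result = oe_create_helloworld_enclave(
        argv[1], OE_ENCLAVE_TYPE_AUTO, OE_ENCLAVE_FLAG_DEBUG, NULL, 0, &enclave);
    if (result != OE_OK)
    {
        fprintf(stderr, "oe_create_helloworld_enclave failed: %s\n",
                oe_result_str(result));
        return 1;
    }

    // Transition into the enclave. Code and data inside it are shielded
    // from the host OS and the hypervisor.
    result = enclave_helloworld(enclave);
    if (result != OE_OK)
        fprintf(stderr, "ECALL failed: %s\n", oe_result_str(result));

    oe_terminate_enclave(enclave);
    return result == OE_OK ? 0 : 1;
}
```

The point of the single abstraction is that this host code does not change whether the underlying TEE is Intel SGX or another supported architecture; the enclave binary and its attestation are what differ.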
As this is an ongoing, work-in-progress standardization effort, many more projects are likely to be contributed in the future, but all should eventually be folded into an open source framework for confidential computing.
What is confidential computing?
Rather than encrypting data only at rest or in transit, confidential computing uses hardware-based functions to create a trusted execution environment for data, compute functions, or an entire application. Confidential computing isolates this vaulted area from access by the OS or virtual machine manager, and thus protects against cross contamination, because nothing that isn't assigned to the TEE can gain access to it. Any attempt to alter the app code or tamper with the data inside the TEE will be blocked.
This is especially critical in multi-tenant systems, such as virtualized and public cloud environments, where cross contamination of data is a real risk. Indeed, some potential users of public cloud compute have resisted moving for this specific reason. A side channel attack remains possible for someone with physical access to the hardware, but that risk is trivial compared to the risks associated with non-confidential computing systems.
Confidential computing in the cloud
Trusted execution environments are the key to making confidential computing work. We've had TEEs for some time, both on Arm-based chips (Arm TrustZone) and on x86 chips (e.g., Intel SGX). Indeed, early versions of the concept go back more than a decade to the TPM (Trusted Platform Module) chips that shipped in many PCs. The difference with modern TEEs is that they are built into the core of the processor itself rather than attached as external add-ons that could be compromised over the interconnections.
Despite the fact that TEE-enabled systems have been available for some time, few enterprises have sought to use them, and many app providers don't support them either. The reason is that they have always been difficult to implement: applications needed specific code built in to enforce the use of a TEE. Further, TEEs were not universally available on all processors (e.g., some Intel Xeon chips support SGX and some don't), nor were TEEs compatible across chip families. The result is that many organizations did not implement what could be a very important security method.
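As a rough illustration of that fragmentation, the sketch below checks whether an x86 CPU even advertises SGX support (CPUID leaf 7, sub-leaf 0, EBX bit 2). It is a necessary-but-not-sufficient test: firmware must also enable SGX before enclaves can actually be launched.

```c
// Quick x86 check (GCC/Clang): does this CPU advertise SGX support?
// CPUID.(EAX=7, ECX=0): EBX bit 2 is the SGX feature flag.
#include <cpuid.h>
#include <stdio.h>

int main(void)
{
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

    // __get_cpuid_count returns 0 if the requested leaf is unsupported.
    if (!__get_cpuid_count(7, 0, &eax, &ebx, &ecx, &edx))
    {
        printf("CPUID leaf 7 not supported\n");
        return 1;
    }

    printf("SGX supported by CPU: %s\n", (ebx & (1u << 2)) ? "yes" : "no");
    return 0;
}
```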
With the move to off-premises, multi-tenant cloud computing, there is now a greater need to protect both the processing integrity of customer data and the proprietary algorithms running in those processes. As a result, cloud providers are making it easy to spin up new confidential computing instances for customers to use. This eliminates the need for organizations to own confidential computing-enabled systems of their own. It is a win-win situation: customers get what they need to protect their data assets, and cloud providers supply the hardware assets that customers don't necessarily own themselves.
This new availability is being brought about by an increasing number of processors with confidential computing capabilities built in. And because cloud providers generally obtain new high-end processing capability early in its availability, this gives the user community access much more rapidly than if they had to acquire it on their own. It also enables app providers to quickly design confidential computing into their products, given the availability of hardware and toolkits running in the cloud, and gives them a more ready market in which to recover their development investment.
What should companies do?
The concepts behind confidential computing are not new, but the availability of TEEs and confidential computing in the cloud makes it much more attractive to organizations that need to secure their data from application vulnerabilities. I recommend that enterprises explore the use of confidential computing techniques in the next six to 12 months, and that they specify to their key application solution providers that they expect them to comply with the confidential computing strategy and offer technology implementations within the same time period. Confidential computing can significantly enhance enterprise security by virtually eliminating the ability of data in process to be exploited. While there is no 100% sure thing when it comes to security, confidential computing is a major step forward and should be implemented whenever possible, particularly for those organizations deploying applications in the cloud. I expect confidential computing to become a standard approach to compute, especially in the cloud, within the next one to two years.
Read more on confidential computing:
- Intel bets big on security as a service for confidential computing
- Google Cloud steps up security and compliance for applications, government
- IBM, Intel, AMD take different routes to hardware-based encryption