
Cloud functions present new security challenges

Jun 04, 2018
Application Security, Cloud Security, Security

Cloud functions, or serverless apps, are small, fast and pop in and out of existence. So, how do you secure them?


Serverless apps are deployed over a cloud platform and are designed to use only the amount of computing resources needed to carry out a task. They come into play when needed, and then go away when the task completes. This is great if you’re looking to maximize performance and minimize overhead in a cloud environment. Because they are small, fast and have short lifespans, however, serverless apps pose challenges to security teams.

The cybersecurity industry is still trying to come to grips with containers, those small, easy-to-deploy, pre-built little bundles of applications. Since many containers can run in a single virtual machine, each isolated from the rest, they are cheaper and more flexible than previous application deployment options.

Containers have got nothing on serverless apps, also known as cloud functions or, on Amazon, as Lambda functions. First released by Amazon and IBM in 2014 — and then by Google and Microsoft in 2016 — cloud functions are even smaller, even lighter, and even shorter lived. They’re even harder to secure.

At least with containers, there’s room inside the container for the main application plus some security software, such as logging or malware protection tools. With cloud functions, there is only that one function and no room for anything else. Any smaller, and we’d be running single lines of code in the cloud.

As with any new technology, serverless app security is often an afterthought. Too many developers blindly put their faith in the infrastructure providers to keep their cloud functions safe.

Risks unknown, expertise lacking 

There’s a lack of serverless security expertise not just in enterprise development teams but in the industry in general, says Robert Huber, chief security and strategy officer at Eastwind Networks. “Very few cybersecurity professionals understand microservices and cloud computing at a technical level,” he says. “Even more troubling is that most organizations do not have dedicated cyber professionals with the necessary skills to reduce risk in these environments. Now comes serverless apps.”

There’s no solid information yet about all the cyber risks of the new technology, and support from security vendors is “nascent at best,” he says. As a result, companies should be cautious when calculating the ROI of moving to serverless.

According to McAfee, serverless architectures can reduce costs by a factor of ten for some operations. That’s before all the security risks are understood. Plus, the flexible billing model of serverless applications is in itself another security risk, according to McAfee. Since the apps naturally scale and billing is based on traffic, a distributed denial of service (DDoS) attack can hit the bottom line.
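One practical mitigation on AWS, shown here as a sketch (the function name is invented), is to cap a function’s reserved concurrency so a traffic flood cannot scale the bill without bound:

```shell
# Cap this function at 100 concurrent executions so a DDoS-driven
# traffic spike cannot scale billing indefinitely.
# "order-processor" is an illustrative function name.
aws lambda put-function-concurrency \
  --function-name order-processor \
  --reserved-concurrent-executions 100
```

The trade-off is that legitimate traffic above the cap is throttled, so the limit should reflect expected peak load.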

The increased attack surface that results from a larger number of small functions, deployed quickly and at scale and communicating with one another across the network, adds up to a big problem. According to McAfee, serverless apps are among the top five new threats of 2018.

The lure of serverless security

Companies that decide to make the plunge should keep an eye out for potential blind spots. “We see a big educational gap, especially for those just starting the journey,” says Amir Jerbi, CTO and co-founder at Aqua Security.

With serverless infrastructure, the cloud provider handles the security of the environment; customers just bring their applications. At first glance, that sounds like a great step forward for security, but companies need to understand the limits of what the infrastructure provider is responsible for and how to take advantage of the security features on offer, such as controlling who is authorized to spin up new functions on a billing account and what monitoring is available.
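As a hedged illustration of limiting who can spin up new functions, here is a sketch of an AWS IAM policy that denies Lambda creation and code updates to everyone except a designated deployment role. The account ID and role name are invented; the actions and condition key are standard IAM ones.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyFunctionCreationOutsideDeployRole",
      "Effect": "Deny",
      "Action": [
        "lambda:CreateFunction",
        "lambda:UpdateFunctionCode"
      ],
      "Resource": "*",
      "Condition": {
        "StringNotLike": {
          "aws:PrincipalArn": "arn:aws:iam::123456789012:role/ci-deploy-role"
        }
      }
    }
  ]
}
```

An explicit Deny like this wins over any Allow granted elsewhere, which makes it a useful guardrail even in accounts with messy permissions.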

“They need to understand how they can limit access, and what are the native tools they’re getting, and what they’re missing,” Jerbi says. Overall, he adds, cybersecurity should improve as a result of the move to serverless functions because the infrastructure providers have full control of the environment and can secure a lot of it for you. “You don’t need your teams to know how to deal with it anymore,” he says.

The cloud providers will harden the environment, make sure that everything is the latest, most secure version, and that all patches are applied. “Serverless practically eliminates the main source for successful exploits today — unpatched servers,” says Antony Edwards, CTO at Eggplant, a London-based digital automation intelligence company. “Such servers are using binaries with known vulnerabilities, as they did not apply the latest security updates of those dependencies. By most counts, dependencies with known vulnerabilities account for the vast majority of successful exploits today.”

With great flexibility comes great responsibility

Serverless apps, or cloud functions, can pop in and out of existence in fractions of a second, letting applications scale smoothly and cost-effectively. They’re a perfect fit for applications architected around microservices, says Edwards. Unfortunately, they also give attackers more opportunities for abuse.

First, since developers don’t need to worry about the underlying infrastructure, they push applications out faster, often without a traditional security review process, so more vulnerabilities might pop up in the applications themselves. And since serverless apps are small, discrete functions, attackers have more opportunities to attempt privilege escalation or take advantage of poorly managed application dependencies. Or they can use stolen credentials, Edwards adds, to get access to the data.

It’s up to developers to make sure that database access is as limited as possible. “Avoid the temptation to give everybody access to your database — even read access — and instead only give such access to the people and systems that need it most,” he says.
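Edwards’s advice can be sketched as a deny-by-default permission check; the role names and structure here are illustrative, not taken from any particular product:

```python
# Minimal sketch of deny-by-default data access for serverless roles.
# Role names and the permission model are illustrative assumptions.
READ, WRITE = "read", "write"

# Each role maps to the smallest set of permissions it needs.
ROLE_PERMISSIONS = {
    "reporting-function": {READ},      # read-only analytics
    "ingest-function": {READ, WRITE},  # the one writer
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default: unknown roles and unlisted actions get nothing."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

The key property is the default: a role that isn’t in the table, or an action that isn’t listed, is denied rather than allowed.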

On the plus side, serverless applications allow more granularity, so developers can tailor access controls to a much higher degree. Managing this is the biggest challenge developers have when moving to serverless, confirms Peter Smith, cofounder and CEO at Edgewise Networks, a cloud security company. “Controlling access between these services is a significant challenge, requiring a new model for access management,” he says.

How big is this problem? Pretty big. According to a report released in April by PureSec, a serverless security company, 21 percent of open-source serverless projects contained at least one critical vulnerability or misconfiguration, and 6 percent had application secrets such as API keys posted in publicly accessible locations. The top five problems, according to the company, were data injection, broken authentication, insecure configurations, over-privileged permissions, and inadequate monitoring.

It might be too big a challenge for humans to handle, says Smith; artificial intelligence (AI) might be needed to manage it at scale. “New approaches exist that limit attack surface of serverless components using machine learning to analyze serverless dependencies and automatically generate least-privilege network controls to reduce risk and exposure,” he says. This allows only required access between trusted components, but in an automated, scalable way.
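The systems Smith describes are proprietary, but the core idea can be sketched without any actual machine learning: observe which functions call which, then deny everything that was never observed. All names here are illustrative.

```python
# Simplified sketch (no real ML): derive a least-privilege allow-list
# from observed inter-function calls, then deny anything unobserved.
from collections import defaultdict

def learn_allow_rules(observed_calls):
    """observed_calls: iterable of (caller, callee) pairs seen in logs."""
    rules = defaultdict(set)
    for caller, callee in observed_calls:
        rules[caller].add(callee)
    return dict(rules)

def is_call_allowed(rules, caller, callee):
    # Anything never observed during learning is denied by default.
    return callee in rules.get(caller, set())
```

A production system would add a learning window, anomaly scoring, and a review step before enforcement, but the deny-by-default shape is the same.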

Trust, but verify

All the major cloud providers now have a serverless offering. Amazon calls its AWS Lambda; Microsoft has Azure Functions; Google and IBM both call theirs Cloud Functions.

However, it’s not always clear what exactly the underlying infrastructure is, how it works, and how it’s secured. To some extent, that’s deliberate. If the public has access to that information, so do the hackers. This also means that enterprises have to take a lot of things on faith.

The serverless functions run in isolated environments, at least in theory, but they still share hardware and computing environments with multiple other customers, says Bo Lane, head of solution architecture at Kudelski Security. In addition, customers can’t install their own security tools in that environment, which creates significant constraints.

“How do you monitor the input and output in a function?” Lane asks. “How are you monitoring the malicious activities going on? A lot of the tools that you would deploy in an on-premises environment or a virtual machine, you don’t have the luxury of having those tools.” Enterprise customers need to get educated on what tools the infrastructure providers make available, and advocate for better ones.
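Lacking host-level agents, teams can instrument functions from within. Below is a minimal, hypothetical Python sketch: a decorator that logs each invocation’s input and output so the records land in whatever log service the platform provides (CloudWatch Logs on AWS, for example). The handler and its logic are invented for illustration.

```python
# Sketch: since you can't install host-level security agents in a
# serverless environment, instrument the function itself. This
# decorator logs every invocation's input and output.
import functools
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("fn-audit")

def audited(handler):
    @functools.wraps(handler)
    def wrapper(event, context=None):
        log.info("input: %s", json.dumps(event, default=str))
        result = handler(event, context)
        log.info("output: %s", json.dumps(result, default=str))
        return result
    return wrapper

@audited
def handler(event, context=None):
    # Illustrative business logic: echo a greeting.
    return {"greeting": f"hello {event.get('name', 'world')}"}
```

In practice the log lines would also carry a request ID and timestamp so they can be correlated with the provider’s own invocation records.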

Another option is to create their own serverless environments, he says, using platforms like Apache OpenWhisk. In fact, that’s the platform IBM’s cloud functions offering, Bluemix OpenWhisk, is based on. Other options include Fission, IronFunctions, and Gestalt.

Companies can also monitor the performance of their functions from the inside out. FairWarning, for example, is a security vendor that focuses on the application layer.

“We deal with applications that have some form of auditing function within them,” says Kurt Long, the company’s founder and CEO. “If it’s a home-grown application, it’s producing some form of audit trail.”

All the basics still apply, he says, no matter how the application is deployed. “With serverless, there’s still data that has to be protected, business functions that have to be fulfilled, and people have to access that service,” he says. “Some things don’t change.”

It’s more important than ever before to build security in right from the start, when applications are first architected. “Security is not something you should add as an afterthought,” says Mark Little, VP of engineering and CTO of JBoss at Red Hat.

In practice, however, there is little concrete information out there yet about infrastructure vulnerabilities or whether application security will get better — or worse — with the transition to serverless apps. “Functions as a service is definitely high on the hype curve at the moment,” says Little. “Developers are very interested in using them but we’ve no real idea yet as to whether this is more about trying new things out or using them in production.”

We’ll probably find out soon. Last fall, a Sumo Logic survey of 1,500 customers running cloud applications showed that use of AWS Lambda doubled, from 12 percent in 2016 to 23 percent in 2017.

The growth rate could be taking off. According to Cloudability, the growth rate for serverless functions on AWS Lambda went from 100 percent in the first quarter of 2017 to 667 percent in the fourth quarter. That’s going to become a very large, target-rich environment very quickly.