FaaS & IaaS
As more and more organizations move toward a serverless, or Function as a Service (FaaS), architecture and framework, understanding how this affects security is essential. From a security perspective, there are both pros and cons to implementing a serverless solution. This course looks at both the benefits and the drawbacks of adopting a FaaS solution and how they affect the safeguarding of your data.
By the end of this course, you will:
- Understand and be able to distinguish between the pros and cons of serverless security
- Understand where to focus additional security controls in a FaaS solution
- Have a general overview of how security differs from that of a typical IaaS solution
The content in this course will be beneficial to:
- Engineers who are focused on delivering secure serverless solutions within an enterprise environment
- Security architects looking to enhance their knowledge of FaaS solutions
- Developers deploying applications within a serverless environment
As a prerequisite of this course, you should have basic knowledge and awareness of the following:
- A general understanding of what Serverless means
- An understanding of what FaaS and IaaS refer to
- A basic awareness of different attack vectors, such as DoS
- AWS Lambda
- Amazon Cognito
- Amazon API Gateway
- Security controls within IAM
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
Hello, and welcome to this lecture. In the previous lecture I focused on how serverless brings a host of benefits when you're looking at the security of your solution. However, it's not always a win-win scenario.
Serverless can also amplify certain security threats and concerns, which need to be highlighted and addressed. Let me explain a few of these.

Let me start by talking about Lambda functions. These functions are created to carry out a very specific task, and they come with permissions, granted through IAM roles, that often allow access to other resources. As such, these functions need to be managed efficiently. As you may or may not know, you are not charged for the creation of an AWS Lambda function, other than a negligible amount for the S3 storage of the code used in the function. Instead, the main charge comes from the number of requests and the duration of those requests, reflecting the resources the function consumes. This gives developers who are creating AWS Lambda functions free rein to create as many functions as they need for different events and use cases without fear of increasing costs. Over time, some of these functions will cease to be required and will no longer be used. As an administrator, you may not fully understand which functions are still being used and which ones are no longer needed, and this can be difficult to monitor with hundreds of functions. So when general housekeeping of functions is not upheld by your teams, it is very likely that your Lambda environment could contain a number of unused and redundant functions sitting idle in your live operational environment.
Another scenario that causes the same problem is functions created by employees who have since left the company; it can sometimes be difficult to determine whether they are still being used or what they are being used for. Although you can check the invocation count over time and the currently active triggers, in most cases zero invocations or zero active triggers could mean the function is unused. However, the key point here is that this is not always 100% foolproof, so these functions are often left alone to ensure that their removal doesn't cause any production issues. If you have a number of these unused and idle functions, the fact remains that they still hold the permissions to access resources that were configured when they were first created. These unused functions now pose a security risk: if someone were able to compromise your serverless environment, they could invoke redundant functions, gaining access to resources that should no longer be allowed.
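The review process described above can be sketched with the AWS SDK for Python. This is a minimal example, not a definitive audit tool: it assumes boto3 credentials are configured, it only checks the CloudWatch `Invocations` metric and stream/queue event source mappings (other trigger types, such as API Gateway or S3 notifications, would need extra checks), and the 90-day window is an illustrative choice.

```python
"""Sketch: flag Lambda functions with no recent invocations and no
event source mappings as candidates for housekeeping review."""
from datetime import datetime, timedelta, timezone


def is_potentially_unused(invocation_count: int, trigger_count: int) -> bool:
    """Zero invocations AND zero triggers marks a review candidate.
    As noted in the lecture, this is not 100% foolproof on its own."""
    return invocation_count == 0 and trigger_count == 0


def find_candidate_functions(days: int = 90) -> list[str]:
    import boto3  # requires AWS credentials to be configured

    lambda_client = boto3.client("lambda")
    cloudwatch = boto3.client("cloudwatch")
    start = datetime.now(timezone.utc) - timedelta(days=days)
    candidates = []
    for page in lambda_client.get_paginator("list_functions").paginate():
        for fn in page["Functions"]:
            name = fn["FunctionName"]
            # Sum the Invocations metric over the review window.
            stats = cloudwatch.get_metric_statistics(
                Namespace="AWS/Lambda",
                MetricName="Invocations",
                Dimensions=[{"Name": "FunctionName", "Value": name}],
                StartTime=start,
                EndTime=datetime.now(timezone.utc),
                Period=days * 86400,
                Statistics=["Sum"],
            )
            invocations = int(sum(p["Sum"] for p in stats["Datapoints"]))
            # Only covers stream/queue triggers (e.g. SQS, Kinesis).
            triggers = lambda_client.list_event_source_mappings(
                FunctionName=name
            )["EventSourceMappings"]
            if is_potentially_unused(invocations, len(triggers)):
                candidates.append(name)
    return candidates
```

Any function this flags should be investigated and confirmed with its owning team before removal, for exactly the reason above: absence of recent activity is evidence, not proof, that a function is dead.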
It's best practice to remove any unused and unwanted AWS Lambda functions; otherwise, the presence of redundant functions in operation increases your attack surface for users with malicious intent. You must keep track of and record which functions are being used for what. Maintaining some kind of organization and management of your functions is key. As a part of this process, permissions must be assigned correctly and effectively to the functions. In all cases you should be assigning permissions based on the least privilege model, meaning that only the exact permissions needed to carry out a specific action should be assigned, and no more. For example, you wouldn't want to assign an IAM role with full access to S3 to a function whose only job is to put an object into a specific S3 bucket. If that function were compromised in some way, then the malicious user could potentially gain full access to your S3 environment. Minimizing the permissions on each function minimizes your attack surface considerably.
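The S3 example above can be made concrete with an IAM policy document, shown here as a Python dictionary. The bucket name is hypothetical; the point is that the policy grants a single action on a single resource rather than `s3:*` on `*`.

```python
"""Sketch: a least-privilege IAM policy for a Lambda execution role
whose only job is to put objects into one specific bucket."""
import json

# Grants s3:PutObject on one illustrative bucket -- and nothing else.
LEAST_PRIVILEGE_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",  # the one action the function needs
            "Resource": "arn:aws:s3:::my-upload-bucket/*",  # one bucket only
        }
    ],
}

# By contrast, "Action": "s3:*" with "Resource": "*" would hand a
# compromised function full control of every bucket in the account.
policy_json = json.dumps(LEAST_PRIVILEGE_POLICY, indent=2)
```

If the function is ever compromised, the blast radius is limited to writing objects into that one bucket, rather than reading, deleting, or reconfiguring your entire S3 environment.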
Another security weak point comes from third-party interaction with your serverless solution. If you have services throughout your deployment that request information from a third party, or send data to a third party for processing before receiving data back, then there are a number of security concerns and elements you must understand:

- By what means is the data sent between your environment and the third party?
- Is the data encrypted?
- Can you trust the third party?
- How is authentication managed?
- Are you using key management, and where are the keys stored?

All of these questions need to be answered to minimize the risk of your serverless environment being compromised. Of course, many infrastructure as a service solutions involve third-party interactions as well.
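One small, concrete piece of the authentication and key-management questions is verifying that data exchanged with a third party hasn't been tampered with. The sketch below uses an HMAC over the payload, with the shared secret assumed to live in a key-management service (such as AWS KMS or Secrets Manager) rather than in code; the function names and payload shape are illustrative.

```python
"""Sketch: HMAC-signing an outbound payload so a third party can
verify its authenticity and integrity, and vice versa."""
import hashlib
import hmac
import json


def sign_payload(payload: dict, secret: bytes) -> tuple[bytes, str]:
    """Serialize deterministically and compute an HMAC-SHA256 tag."""
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return body, signature


def verify_payload(body: bytes, signature: str, secret: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information via timing differences.
    return hmac.compare_digest(expected, signature)
```

A scheme like this addresses integrity and authentication for one hop; it does not replace transport encryption (TLS) or answer the trust and key-storage questions, which still need their own answers.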
The issue isn't unique to serverless solutions. However, you are likely to have more third-party relationships within a serverless environment, so the security concerns surrounding the five points mentioned above are heightened by a larger attack surface: more third parties provide more points of entry.

Lastly, I briefly want to highlight monitoring and logging. Serverless is still relatively new in mainstream production environments. The technology and framework are gaining acceptance and being implemented on a much larger scale across multiple industries, but during this early adoption of serverless into the wider community, the variety of toolsets available to manage, track, monitor, and log serverless environments in great detail is still minimal. As a general rule, when you compare the level of monitoring and logging functionality from a security perspective, there are far more vendors and toolsets that work and interact seamlessly with infrastructure as a service solutions than with function as a service solutions. Over time this will balance out, but at the moment the balance is certainly tipped towards infrastructure as a service.
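While dedicated serverless monitoring tools catch up, one thing you can do today is emit structured logs from your functions so that CloudWatch Logs metric filters and alarms have something consistent to match on. This is a minimal sketch; the field names and the handler's event shape are illustrative assumptions, not a fixed schema.

```python
"""Sketch: structured JSON logging from a Lambda handler, so
security-relevant events can be filtered and alerted on later."""
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def log_event(action: str, **fields) -> str:
    """Emit one JSON log line per security-relevant action."""
    entry = json.dumps({"action": action, **fields}, sort_keys=True)
    logger.info(entry)
    return entry


def handler(event, context=None):
    # Logging the invocation source makes it possible to spot
    # unexpected callers -- e.g. a supposedly idle function firing.
    log_event("invocation", source=event.get("source", "unknown"))
    return {"statusCode": 200}
```

Consistent, machine-parsable log lines are what make it practical to detect anomalies like a redundant function suddenly being invoked, which ties back to the unused-function risk discussed earlier.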
That brings me to the end of this lecture highlighting some of the concerns surrounding FaaS security. Coming up in the next lecture, I will be looking at common security patterns that exist for both infrastructure as a service and function as a service solutions.
About the Author
Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.
To date, Stuart has created 80+ courses relating to cloud computing, reaching over 100,000 students, mostly within the AWS category and with a heavy focus on security and compliance.
Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.