An Overview of AWS Lambda

Contents

AWS Compute Fundamentals
  • What is Compute? (1m 49s)
  • Amazon EC2 (28m 26s)
  • AWS Batch (3m 53s)
  • EC2 Auto Scaling
  • ELB & Auto Scaling Summary
  • Summary (7m 37s)
Difficulty: Beginner
Duration: 2h 55m
Students: 13673
Rating: 4.8/5
Description

Please note that this course has been replaced with a new version that can be found here: https://cloudacademy.com/course/compute-saa-c03/compute-saa-c03-introduction/

 

This section of the Solution Architect Associate learning path introduces you to the core computing concepts and services relevant to the SAA-C02 exam. We start with an introduction to the AWS compute services, understand the options available, and learn how to select and apply AWS compute services to meet specific requirements.


Learning Objectives

  • Learn the fundamentals of AWS compute services such as EC2, ECS, EKS, and AWS Batch
  • Understand how load balancing and auto scaling can be used to optimize your workloads
  • Learn about the AWS serverless compute services and capabilities
Transcript

Supported Event Sources

Hello, and welcome to this lecture on AWS Lambda. Before we can understand how this service can be used to run and scale your code, we first need to understand what the service actually is. AWS Lambda is a serverless compute service designed to allow you to run your application code without having to manage and provision your own EC2 instances. This saves you from having to maintain and administer an additional layer of technical responsibility within your solution; instead, that responsibility is passed over to AWS to manage for you. If you don't need to spend time operating, managing, patching, and securing an EC2 instance, then you have more time to focus on the code of your application and its business logic, while at the same time optimizing costs.

With AWS Lambda, you only ever have to pay for compute power when Lambda is in use, via Lambda functions, and I shall explain more on these later. AWS Lambda charges for compute power per 100 milliseconds of use, only while your code is running, in addition to charging for the number of times your code runs. With subsecond metering, AWS Lambda offers a truly cost-optimized solution for your serverless environment.
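To make that billing model concrete, here is a rough cost sketch. The per-request and per-GB-second rates below are illustrative placeholders rather than current AWS figures, so check the AWS Lambda pricing page before relying on them.

    # A rough sketch of Lambda billing as described above: you pay per
    # invocation plus for compute time, metered in 100 ms increments and
    # scaled by the memory allocated to the function. The rates below
    # are illustrative placeholders, not current AWS prices.

    PRICE_PER_REQUEST = 0.20 / 1_000_000   # assumed: $0.20 per million requests
    PRICE_PER_GB_SECOND = 0.0000166667     # assumed GB-second rate

    def estimate_monthly_cost(invocations, avg_duration_ms, memory_mb):
        """Estimate a function's monthly bill, ignoring any free tier."""
        billed_ms = -(-avg_duration_ms // 100) * 100   # round up to 100 ms
        gb_seconds = invocations * (billed_ms / 1000) * (memory_mb / 1024)
        return invocations * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

    # Example: 3M invocations a month, 120 ms average duration, 512 MB memory.
    print(f"${estimate_monthly_cost(3_000_000, 120, 512):.2f}")   # roughly $5.60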

So how does it work? Well, there are essentially four steps to its operation. Firstly, AWS Lambda needs to be aware of the code that you need to run, so you can either upload this code to AWS Lambda or write it within the code editors that Lambda provides. Currently, AWS Lambda supports Node.js, Python, Java, C#, Go, and also Ruby. It's worth mentioning that the code that you write or upload can also include other libraries.

Once your code is within Lambda, you then need to configure Lambda functions to execute your code upon specific triggers from supported event sources, such as S3. As an example, a Lambda function could be triggered when an S3 event occurs, such as an object being uploaded to an S3 bucket; a minimal handler for this case is sketched below. Once the specific trigger is initiated during your normal operations of AWS, AWS Lambda will run your code, as per your Lambda function, using only the required compute power as defined. Later in this course, I will cover more on when and how this compute power is specified. AWS records the compute time in milliseconds, along with the number of times your Lambda function runs, to ascertain the cost of the service. The Lambda service itself can be found within the AWS Management Console under the Compute category. Remember, Lambda provides the compute capacity for your code to run on.
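The following is a minimal sketch of such an S3-triggered function, using the Python runtime. It assumes the standard S3 event notification structure; the handler simply logs each uploaded object.

    # Minimal sketch of a Lambda handler (Python runtime) invoked by an
    # S3 "object created" event notification. An S3 event can batch
    # several records; each record describes one affected object.

    def lambda_handler(event, context):
        for record in event["Records"]:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"New object uploaded: s3://{bucket}/{key}")
        return {"status": "processed", "records": len(event["Records"])}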

For an AWS Lambda application to operate, it requires a number of different elements, so I just want to take a few minutes to explain what each of these is. The following form the key constructs of a Lambda application.

The Lambda function. The Lambda function is composed of your own code, which you want Lambda to invoke as per the defined triggers. Event sources. Event sources are AWS services that can be used to trigger your Lambda functions. Or, put another way, they produce the events that invoke your Lambda function. For a comprehensive list of these event sources, please see the following link. Downstream resources. These are the resources that are required during the execution of your Lambda function. For example, your function might call upon a specific SNS topic or a particular SQS queue, so they're not used as the source of the trigger; instead, they are the resources used by the code within the function upon invocation.
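As a sketch of what calling a downstream resource might look like, the hypothetical handler below publishes a message to an SNS topic using boto3. The topic ARN is a placeholder, and note that the topic is a resource used during execution, not the event source that triggered the function.

    import json

    import boto3

    sns = boto3.client("sns")

    # Hypothetical topic ARN, used purely for illustration.
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:example-topic"

    def lambda_handler(event, context):
        # The SNS topic is a downstream resource: it is called during the
        # function's execution rather than acting as the trigger.
        sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps({"event": event}))
        return {"status": "published"}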

Log streams. To help you identify and troubleshoot issues with your Lambda function, you can add logging statements to your code; these are written to a log stream, helping you confirm that your code is operating as expected. These log streams are essentially sequences of events that all come from the same function and are recorded in CloudWatch. In addition to log streams, Lambda also sends common metrics of your functions to CloudWatch for monitoring and alerting.
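As a brief sketch of how this looks in practice, in the Python runtime anything written through the standard logging module (or print) ends up in the function's CloudWatch log stream:

    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)   # the Lambda Python runtime routes this output to CloudWatch

    def lambda_handler(event, context):
        # These statements are written to a CloudWatch log stream alongside
        # Lambda's own START/END/REPORT entries for each invocation.
        logger.info("Invoked with event: %s", event)
        logger.info("Processing completed as expected")
        return {"ok": True}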

Now that we have a high-level understanding of what AWS Lambda is, let me dive deeper into each of the components that I've just mentioned, so you can understand how they're linked together to enable AWS Lambda to be used as an event-driven method of executing code within AWS across a serverless architecture. Join me in the next lecture, where I shall be looking at the Lambda function in greater detail.

Lectures

Introduction

Demo: Creating a Lambda Function

Understanding Event Source Mapping

Monitoring and Common Errors

Summary

About the Author
Students: 220305
Labs: 1
Courses: 213
Learning Paths: 174

Stuart has been working within the IT industry for two decades, covering a huge range of topic areas and technologies, from data center and network infrastructure design to cloud architecture and implementation.

To date, Stuart has created 150+ courses relating to cloud computing, reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.

Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.

He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.

In January 2016, Stuart was awarded the ‘Expert of the Year Award 2015’ by Experts Exchange for sharing his knowledge of cloud services with the community.

Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.