Understanding AWS Lambda to Run & Scale Your Code

Summary

This course is part of these learning paths:

  • DevOps Engineer – Professional Certification Preparation for AWS
  • Certified Developer – Associate Certification Preparation for AWS
  • Serverless Computing on AWS for Developers
  • Get Started Building Cloud Solutions
  • Getting Started with Serverless Computing on AWS

Contents

AWS Lambda
  1. Introduction (Free, 4m 33s)
  6. Summary (8m 34s)

Overview

Difficulty: Beginner
Duration: 51m
Students: 731
Rating: 5/5

Description

Learn how to implement solutions and applications using a serverless architecture with this AWS Lambda course from Cloud Academy. By running your enterprise with this feature in mind, you will be able to operate a more efficient environment and reduce overall costs. Gain insight into the key serverless compute services offered by AWS, and understand the configuration and management involved.

This course is made up of six lectures, including a 24-minute demo video on how to configure a function.


For more related training content, try out the Learning Paths listed above.

Learning Objectives

  • Be able to explain what AWS Lambda is and what its uses are
  • Define the components used within Lambda
  • Explain the different elements of a Lambda function through its creation
  • Understand the key differences between policies used within Lambda
  • Recognize how event sources and event mappings are managed for both synchronous and asynchronous invocations
  • Discover how Amazon CloudWatch can monitor metrics and logs to isolate issues with your functions
  • Learn how to check for common errors that might be causing your functions to fail

Intended Audience

  • This course has been designed for those who are new to AWS Lambda but are keen to understand how serverless architectures can help with their solutions.
  • If you are a developer, or are responsible for maintaining and managing your serverless environments, then this course would be advantageous.
  • If you are studying for the AWS Certified Developer – Associate certification, then this course is also recommended, as AWS Lambda is covered heavily.

Prerequisites

  • As a prerequisite of this course, it is recommended that you have an understanding of what is meant by serverless architectures.
  • If you are unfamiliar with this term, then please take our existing course ‘What is Serverless computing’ within our ‘Getting Started with Serverless Computing on AWS’ Learning Path.
  • You should also be familiar with a programming language supported by AWS Lambda; these include:

  • Node.js
  • Python
  • Java
  • C# (.NET Core)
  • Go
  • Ruby

Related Training Content

  • Introduction to AWS Lambda
  • Process Amazon S3 Events with AWS Lambda
  • Configure Amazon DynamoDB Triggers with AWS Lambda
  • Create Scheduled Tasks with AWS Lambda

Transcript

Hello, and welcome to this final lecture, which highlights the key points from the previous lectures within this course. I began by giving an introduction to the service itself, where I explained that AWS Lambda is a serverless compute service designed to run application code without you having to manage and provision your own EC2 instances. You only pay for the compute power used while your Lambda functions are running, billed to the nearest 100 milliseconds. In addition to compute power, you are also charged based on the number of times your code runs. Using Lambda involves four steps. You must either upload your code to Lambda or write it within the code editor that Lambda provides. You then configure your Lambda function to execute upon specific triggers from supported event sources, and once a trigger is initiated, Lambda runs your code as defined by your Lambda function, using only the required compute power.
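The billing model described above (compute time rounded up to the nearest 100 ms at the time this course was recorded, plus a per-request charge) can be sketched as a small cost estimator. The rates below are illustrative placeholders, not current AWS prices:

```python
# Rough sketch of the Lambda pricing model: billed compute time is rounded
# up to the nearest 100 ms, multiplied by allocated memory, plus a flat
# per-request charge. Rates are hypothetical examples, not AWS's prices.
import math

PRICE_PER_GB_SECOND = 0.0000166667  # hypothetical rate, USD
PRICE_PER_REQUEST = 0.0000002       # hypothetical rate, USD

def estimate_cost(invocations: int, avg_duration_ms: float, memory_mb: int) -> float:
    """Estimate the bill for one function over a period of invocations."""
    billed_ms = math.ceil(avg_duration_ms / 100) * 100   # round up to 100 ms
    gb_seconds = invocations * (billed_ms / 1000) * (memory_mb / 1024)
    return gb_seconds * PRICE_PER_GB_SECOND + invocations * PRICE_PER_REQUEST

# 1 million invocations averaging 120 ms with 128 MB of memory:
print(round(estimate_cost(1_000_000, 120, 128), 2))
```

Note how a 120 ms run is billed as 200 ms under this rounding, which is why trimming a function just below a 100 ms boundary could meaningfully reduce cost.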

And lastly, AWS records the compute time in milliseconds, and the number of times your Lambda functions run, to ascertain the cost of the service. Lambda is found within the AWS Management Console under the Compute category. A Lambda function is composed of your own code that you want Lambda to invoke. Event sources are AWS services that can be used to trigger your Lambda functions, downstream resources are resources that are required during the execution of your Lambda function, and log streams help you identify and troubleshoot issues with your Lambda function. Following this lecture, I then focused on Lambda functions themselves, explaining how to create them and what each of the configurable components is. This lecture covered a lot of elements, and in it we learned that Lambda supports the following languages: Node.js, Java, C#, Python, Go, PowerShell, and Ruby. You can import code into Lambda by creating a deployment package, and Lambda will need global read permissions to your deployment package to perform the import. You can upload your code using the Management Console, the AWS CLI, or the SDK, and if you create your code from within Lambda itself, Lambda will create the deployment package for you. There are three different options when creating a function: you can author it from scratch, use a blueprint, or use the Serverless Application Repository.
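A deployment package, as mentioned above, is simply a zip archive of your function code. The following is a minimal sketch of building one by hand; the file names are hypothetical:

```python
# Sketch of building a Lambda deployment package: zip the handler module
# (and any dependencies) so it can be uploaded via the console, CLI, or SDK.
import os
import tempfile
import zipfile

def build_deployment_package(source_files: list[str], zip_path: str) -> str:
    """Zip the given source files into a Lambda deployment package."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in source_files:
            # Store files at the archive root so Lambda can locate the handler.
            zf.write(path, arcname=os.path.basename(path))
    return zip_path

# Example: package a single-module function written to a temp directory.
workdir = tempfile.mkdtemp()
handler_file = os.path.join(workdir, "lambda_function.py")
with open(handler_file, "w") as f:
    f.write("def lambda_handler(event, context):\n    return 'ok'\n")

package = build_deployment_package([handler_file], os.path.join(workdir, "function.zip"))
print(zipfile.ZipFile(package).namelist())  # ['lambda_function.py']
```

Keeping the handler module at the root of the archive matters, because Lambda resolves the handler string (for example `lambda_function.lambda_handler`) relative to the package root.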

You must provide the name of your function, the runtime, and the IAM role to be used when creating your function. The designer window in the function allows you to configure triggers; a trigger is an operation from an event source that causes the function to invoke. Configured triggers are then added to the designer window. To view policy information for the execution policy and the function policy, you can select the key icon in the designer window. The role execution policy determines what resources the function's role has access to when the function is running, while the function policy defines which AWS resources are allowed to invoke your function. The function code window allows you to define, write, and import your code. The handler within your function allows Lambda to invoke it when the service executes the function on your behalf; it is used as the entry point within your code to execute your function.
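The handler described above is just a function with a conventional signature: it receives the triggering event and a runtime context object. A minimal sketch follows, using an event shape that loosely mimics an S3 notification (the structure is illustrative only):

```python
# Minimal sketch of a Lambda handler: the entry point Lambda invokes on
# your behalf, receiving the event payload and a runtime context object.
import json

def lambda_handler(event, context):
    """Entry point: extract the uploaded objects' keys from an S3-style event."""
    records = event.get("Records", [])
    keys = [r["s3"]["object"]["key"] for r in records]
    return {"statusCode": 200, "body": json.dumps({"processed": keys})}

# Invoking it locally with a fake event (context is unused here, so None works):
sample_event = {"Records": [{"s3": {"object": {"key": "uploads/report.csv"}}}]}
print(lambda_handler(sample_event, None))
```

In the console, the handler setting `lambda_function.lambda_handler` would point Lambda at this function: module name, then function name.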

Environment variables are key-value pairs that allow you to incorporate variables into your function without hard-coding them into your code. By default, AWS Lambda encrypts your environment variables after the function has been deployed, using KMS. Basic settings allow you to determine the compute resources used to execute your code, and you can only alter the amount of memory: AWS Lambda then calculates the CPU power itself, based on this selection. The function timeout determines how long the function should run before it is terminated. By default, AWS Lambda can only access resources that are reachable over the internet; accessing resources within your VPC requires additional configuration.
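Inside the function, the environment variables you configure surface as ordinary process environment variables. A short sketch, with hypothetical variable names:

```python
# Sketch of reading Lambda environment variables: the key-value pairs you
# configure on the function appear in os.environ at runtime.
import os

def lambda_handler(event, context):
    # Fall back to defaults so the function also runs locally for testing.
    table_name = os.environ.get("TABLE_NAME", "dev-table")
    log_level = os.environ.get("LOG_LEVEL", "INFO")
    return {"table": table_name, "log_level": log_level}

# Simulate the deployed configuration locally:
os.environ["TABLE_NAME"] = "prod-orders"
print(lambda_handler({}, None))  # {'table': 'prod-orders', 'log_level': 'INFO'}
```

Because the values are injected by configuration rather than baked into the deployment package, the same code can be promoted between environments unchanged.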

The execution role will need permissions to configure ENIs in your VPC. A dead-letter queue is used to receive payloads that were not processed due to a failed execution. Failed asynchronous invocations are automatically retried two more times, whereas synchronous invocations are not automatically retried. Enabling active tracing integrates AWS X-Ray to trace the event sources that invoked your Lambda function, in addition to tracing other resources that were called upon in response to your Lambda function running. Concurrency measures how many instances of your function can be running at the same time, with the default unreserved concurrency set to 1,000. AWS CloudTrail integrates with AWS Lambda, aiding with auditing and compliance.

Throttling sets the reserved concurrency limit of your function to zero and stops all future invocations of the function until you change the concurrency setting. Lambda qualifiers allow you to switch between versions or aliases of your function; when you create a new version of your function, you are no longer able to make any further configuration changes to it, making it immutable. An alias allows you to create a pointer to a specific version of your function. By exporting your function, you can redeploy it at a later stage, perhaps within a different AWS region, and by creating a test event, you can easily perform different tests against your function. Following this lengthy lecture on Lambda functions, I then expanded in greater detail on the use of event sources and mappings. Within this lecture, I explained that an event source is an AWS service that produces the events that your Lambda function responds to by invoking it. Event sources can be either poll- or push-based, and at the time of writing this course, the current poll-based event sources are Amazon Kinesis, Amazon SQS, and Amazon DynamoDB.

Push-based event sources cover all of the remaining supported event sources. An event source mapping is the configuration that links your event source to your Lambda function. With push-based event sources, the mapping is maintained within the event source, whereas poll-based event source mappings are held within your Lambda function. When manually invoking a Lambda function, you can use the invoke option to call it either synchronously or asynchronously. Synchronous invocation enables you to assess the result of the function before moving on to the next operation required, while asynchronous invocation can be used when there is no need to maintain an order of function execution. When event sources are used to invoke your function, the invocation type depends on the service: poll-based event sources always use a synchronous invocation type, but with push-based event sources the invocation type depends on each service. Finally, I explained how you can monitor and troubleshoot issues with your Lambda functions. During this lecture, we learned that statistics related to your Lambda functions are, by default, monitored by Amazon CloudWatch.
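The synchronous/asynchronous choice surfaces in Lambda's Invoke API as the `InvocationType` parameter: `RequestResponse` waits for and returns the result, while `Event` queues the request and returns immediately. The sketch below builds the request arguments without calling AWS; the function name is hypothetical:

```python
# Sketch of choosing an invocation type for Lambda's Invoke API.
# "RequestResponse" = synchronous, "Event" = asynchronous.
import json

def build_invoke_args(function_name: str, payload: dict, synchronous: bool) -> dict:
    """Build keyword arguments suitable for boto3's lambda_client.invoke()."""
    return {
        "FunctionName": function_name,
        "InvocationType": "RequestResponse" if synchronous else "Event",
        "Payload": json.dumps(payload).encode("utf-8"),
    }

args = build_invoke_args("process-orders", {"order_id": 42}, synchronous=False)
print(args["InvocationType"])  # Event
```

With credentials configured, these arguments could be passed straight through, e.g. `boto3.client("lambda").invoke(**args)`.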

And CloudWatch uses the following metrics: invocations, errors, dead-letter errors, duration, throttles, iterator age, concurrent executions, and unreserved concurrent executions. In addition to these metrics, CloudWatch also gathers log data sent by Lambda, and each function relates to a different log group. The log group name is defined as /aws/lambda/ followed by the function name, and it's possible to add custom logging statements to your function code, which are then sent to CloudWatch Logs. Common issues as to why your function might not run relate to permissions: you should check your IAM role execution policy and function policies to ensure the correct access has been granted to run your function.
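Those custom logging statements are just ordinary logging calls: anything written via Python's logging module (or print) is captured by the Lambda runtime and shipped to the function's CloudWatch log group. A minimal sketch:

```python
# Sketch of custom logging inside a Lambda function. In the deployed
# function, these log lines end up in the /aws/lambda/<function-name>
# log group in CloudWatch Logs.
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def lambda_handler(event, context):
    records = event.get("Records", [])
    logger.info("Received event with %d record(s)", len(records))
    try:
        total = sum(r["value"] for r in records)
    except KeyError:
        # An error-level statement like this is what you would look for in
        # CloudWatch Logs when isolating a failing invocation.
        logger.error("Record missing expected 'value' field")
        raise
    logger.info("Computed total: %s", total)
    return {"total": total}

print(lambda_handler({"Records": [{"value": 2}, {"value": 3}]}, None))  # {'total': 5}
```

Logging at distinct levels (INFO for progress, ERROR for failures) makes the resulting CloudWatch log streams far easier to filter when troubleshooting.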

That brings me to the end of this lecture, and to the end of this course. You should now have a greater understanding of AWS Lambda and how the service is configured and can be used within your environment to help create serverless applications using minimal compute resources. As mentioned at the start of this course, I recommend that you take the following labs to put this theoretical knowledge into practice: Introduction to AWS Lambda, Process Amazon S3 Events with AWS Lambda, Configure Amazon DynamoDB Triggers with AWS Lambda, and Create Scheduled Tasks with AWS Lambda. If you have any feedback on this course, positive or negative, please contact us by sending an email to support@cloudacademy.com. Your feedback is greatly appreciated. Thank you for your time, and good luck with your continued learning of cloud computing.

About the Author

Students: 53,415
Labs: 1
Courses: 55
Learning paths: 36

Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data centre and network infrastructure design, to cloud architecture and implementation.

To date, Stuart has created 50+ courses relating to cloud computing, most within the AWS category, with a heavy focus on security and compliance.

He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.

In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.

Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.