
Introduction to AWS Lambda

Contents

Course Intro
1. Introduction (1m 35s, preview)

Decoupled Architecture

This course is part of the learning path: AWS Solutions Architect – Associate (SAA-C02) Certification Preparation for AWS

Overview

Difficulty: Beginner
Duration: 38m
Students: 307
Rating: 5/5
Description

This course explores and introduces you to the concepts of both decoupled and event-driven architectures within AWS. It also provides an introduction to the Amazon Simple Queue Service, Amazon Simple Notification Service, Amazon Kinesis, and AWS Lambda.

For any feedback, comments, or questions related to this course, feel free to reach out to us at support@cloudacademy.com.

Course Objectives

The objectives of this course are:

  • To establish an understanding of what decoupled architecture is
  • To establish an understanding of what event-driven architecture is
  • To learn the foundations of Amazon SQS and how it is used in a decoupled environment
  • To gain an awareness of the Amazon Simple Notification Service, Amazon Kinesis, and AWS Lambda, and to understand when and why you might implement them in an event-driven solution

Intended Audience

This course has been designed for architects who are looking to design and implement best practice solutions by utilizing services in a decoupled and/or event-driven environment.

Prerequisites

To get the most from this course, it would be beneficial to have a basic awareness of what AWS is, in addition to understanding general infrastructure and application architectures, although this is not essential.

 


Resources referenced within this lecture:

AWS Lambda Event Sources

Understanding AWS Lambda to Run and Scale your Code

Lab: Introduction to AWS Lambda

Lab: Process Amazon S3 events with AWS Lambda

Lab: Automating EBS snapshots with AWS Lambda

 

Transcript

Hello and welcome to this lecture, where we shall take an introductory look at AWS Lambda. AWS Lambda is a serverless compute service which has been designed to allow you to run your application code without having to manage and provision your own EC2 instances. This saves you from having to maintain and administer an additional layer of technical responsibility within your solution. Instead, that responsibility is passed over to AWS to manage for you.

Essentially, serverless means that you do not need to worry about provisioning and managing your own compute resources to run your code; instead, this is managed and provisioned by AWS. AWS will start, scale, maintain, and stop the compute resources as required, which can be for as little as a few milliseconds. Although it's named serverless, it does of course require servers, or at least compute power, to carry out your code requests, but because the AWS user does not need to be concerned with what's managing this compute power, or where it's provisioned from, it's considered serverless from the user perspective.

If you don't have to spend time operating, managing, patching, and securing an EC2 instance, then you have more time to focus on the code of your application and its business logic, while at the same time, optimizing costs. With AWS Lambda, you only ever have to pay for the compute power when Lambda is in use via Lambda functions. And I shall explain more on these later. 

AWS Lambda charges for compute power per 100 milliseconds of use, only when your code is running, in addition to charging for the number of times your code runs. With sub-second metering, AWS Lambda offers a truly cost-optimized solution for your serverless environment. So how does it work? Well, there are essentially four steps to its operation.
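To make this pricing model concrete, here is a rough back-of-the-envelope sketch in Python. The per-request and per-GB-second prices are illustrative assumptions (real figures vary by region and change over time, so check the AWS pricing page); the rounding to 100-millisecond increments follows the metering described above.

```python
import math

# Illustrative prices only -- not official AWS figures.
PRICE_PER_REQUEST = 0.20 / 1_000_000   # assumed $0.20 per million invocations
PRICE_PER_GB_SECOND = 0.00001667       # assumed price per GB-second of compute

def estimate_monthly_cost(invocations, avg_duration_ms, memory_mb):
    """Estimate Lambda cost: a per-invocation charge plus compute time,
    with duration billed in 100 ms increments as described above."""
    billed_ms = math.ceil(avg_duration_ms / 100) * 100
    gb_seconds = invocations * (billed_ms / 1000) * (memory_mb / 1024)
    return invocations * PRICE_PER_REQUEST + gb_seconds * PRICE_PER_GB_SECOND

# 3 million invocations averaging 120 ms (billed as 200 ms) at 512 MB:
cost = estimate_monthly_cost(3_000_000, 120, 512)
```

Note how the 120 ms run is billed as 200 ms: under this metering, a function that runs for 101 ms costs the same as one that runs for 200 ms.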

Firstly, AWS Lambda needs to be aware of the code that you need to run, so you can either upload this code to AWS Lambda or write it within the code editor that Lambda provides. Currently, AWS Lambda supports Node.js (JavaScript), Python, Java (Java 8 compatible), C# (.NET Core), Go, and Ruby. It's worth mentioning that the code that you write or upload can also include other libraries. Once your code is within Lambda, you need to configure Lambda functions to execute your code upon specific triggers from supported event sources, such as S3. As an example, a Lambda function can be triggered when an S3 event occurs, such as an object being uploaded to an S3 bucket. Once the specific trigger is initiated during the normal operations of AWS, AWS Lambda will run your code, as per your Lambda function, using only the required compute power as defined. Later in this course I'll cover more on when and how this compute power is specified. AWS records the compute time in milliseconds and the number of times your Lambda functions run to ascertain the cost of the service.
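As a sketch of what the S3 example might look like in practice, the handler below is a minimal Python Lambda function that reacts to an object upload. The bucket and key values are hypothetical; the event shape follows the standard S3 event notification structure.

```python
import urllib.parse

def lambda_handler(event, context):
    """Entry point Lambda invokes when the configured S3 trigger fires.
    'event' carries the S3 notification; 'context' holds runtime metadata."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        processed.append(f"s3://{bucket}/{key}")
    return {"processed": processed}
```

When an object lands in the bucket, Lambda calls `lambda_handler` with the event payload, and you are billed only for the milliseconds the function actually runs.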

For an AWS Lambda application to operate, it requires a number of different elements. The following form the key constructs of a Lambda application. Lambda function. The Lambda function is comprised of your own code that you want Lambda to invoke as per defined triggers. Event source. Event sources are AWS services that can be used to trigger your Lambda functions, or, put another way, they produce the events that your Lambda function essentially responds to by invoking it. For a comprehensive list of these event sources, please see the following link on the screen. Trigger. The trigger is essentially an operation from an event source that causes the function to invoke, so essentially triggering that function. For example, an Amazon S3 put request could be used as a trigger. Downstream resources. These are the resources that are required during the execution of your Lambda function. For example, your function might call upon accessing a specific SNS topic or a particular SQS queue. So they are not used as the source of the trigger, but instead they are the resources used to execute the code within the function upon invocation. Log streams. In an effort to help you identify and troubleshoot issues with your Lambda function, you can add logging statements to a log stream to help you identify if your code is operating as expected. These log streams are essentially a sequence of events that all come from the same function and are recorded in CloudWatch. In addition to log streams, Lambda also sends common metrics of your functions to CloudWatch for monitoring and alerting.

At a high level, the configuration steps for creating a Lambda function via the AWS Management Console consist of selecting a blueprint: AWS Lambda provides a large number of common blueprint templates, which are preconfigured Lambda functions. To save time writing your own code, you can select one of these blueprints and then customize it as necessary. An example of one of these blueprints is the S3 get object blueprint, an Amazon S3 trigger that retrieves metadata for the object that has been updated. You then need to configure your triggers; as I just explained, the trigger is an operation from an event source that causes the function to invoke, and in my previous statement I suggested an S3 put request. Finally, you need to finish configuring your function. This section requires you to either upload your code or edit it in-line, and it also requires you to define the required resources, the maximum execution timeout, the IAM role, and the handler name.
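To tie that configuration step together, the sketch below collects the settings just mentioned (handler name, execution timeout, IAM role, memory) into the keyword arguments that boto3's `create_function` call expects. The function name and role ARN are hypothetical, and the final API call is left commented out since it requires AWS credentials and a packaged deployment zip.

```python
def build_function_config(name, role_arn, handler, timeout_s=30, memory_mb=128):
    """Collect the per-function settings described above, using the
    keyword names boto3's Lambda client expects."""
    return {
        "FunctionName": name,
        "Runtime": "python3.9",
        "Role": role_arn,         # IAM role the function assumes when it runs
        "Handler": handler,       # <file>.<function> entry point
        "Timeout": timeout_s,     # maximum execution timeout, in seconds
        "MemorySize": memory_mb,  # memory allocation (also scales CPU share)
    }

config = build_function_config(
    "s3-object-logger",                            # hypothetical function name
    "arn:aws:iam::123456789012:role/lambda-exec",  # hypothetical role ARN
    "lambda_function.lambda_handler",
)
# import boto3
# boto3.client("lambda").create_function(Code={"ZipFile": zipped_code}, **config)
```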

A key benefit of using AWS Lambda is that it is a highly scalable serverless service, coupled with fantastic cost optimization compared to EC2, as you are only charged for compute power while the code is running and for the number of functions called. More information on AWS Lambda and how to configure it in detail can be found in our following course. For your own hands-on experience with AWS Lambda, please take a look at our labs, which will guide you through how to create your first Lambda function.

That now brings me to the end of this lecture covering AWS Lambda. Coming up next, I shall be discussing AWS Batch.

About the Author

Students: 84,513
Labs: 1
Courses: 78
Learning paths: 51

Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data centre and network infrastructure design, to cloud architecture and implementation.

To date, Stuart has created 60+ courses relating to cloud, most within the AWS category, with a heavy focus on security and compliance.

He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.

In January 2016, Stuart was awarded the ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge sharing within cloud services to the community.

Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.