
AWS Lambda: an introduction and practical walkthrough

Cloud computing has already replaced layer after layer of server-room hardware with virtual servers. So what if you could virtualize the servers themselves out of existence? In a sense, that's what AWS Lambda does.
It’s not uncommon to require your cloud-based apps to wake up and deliver some functionality when triggered by external events, but designing the process can be complicated. For example, I might need my application to respond every time there’s a change to the objects in one of my S3 buckets. Normally, I would configure some kind of service bus (in AWS, that would be SQS) to listen for an S3 change notification, so that my app code, which is listening to SQS, can respond. All that can certainly work well. But managing the code and compute resources carries a significant operating overhead.
To address such challenges, Amazon created AWS Lambda, a service that can run your code in response to events and automatically manage the compute resources for you.

Events that can trigger a Lambda function

You can configure these events to trigger Lambda functions:

  • Table updates in Amazon DynamoDB.
  • Modifications to objects in S3 buckets.
  • Notifications sent from Amazon SNS.
  • Messages arriving in an Amazon Kinesis stream.
  • AWS API call logs created by AWS CloudTrail.
  • Client data synchronization events in Amazon Cognito.
  • Custom events from mobile applications, web applications, or other web services.

AWS Lambda works using one of two event models: a push event model or a pull event model. Lambda functions can be written in either JavaScript (running on Node.js) or Java (Java 8 compatible).

How is AWS Lambda different from Amazon’s Elastic Beanstalk or EC2 Container Service?

When I first read about AWS Lambda I was confused. I wasn't sure whether it was another PaaS (Platform as a Service) or a Docker-like container service such as AWS ECS. In both of those cases, developers push their code and the rest (including compute deployment and application container provisioning) is taken care of by the service. So what's all the fuss about Lambda?
But I eventually became aware of some key differences that set Lambda apart. Look more closely at Amazon's EC2 Container Service. Even though containers are highly scriptable, you are still responsible for maintaining them through their lifecycles. Since ECS only provides runtime execution services, everything else is in your hands. Lambda functions, on the other hand, are far more self-sufficient. So while Lambda has some features in common with EC2 containers, it's clearly much more than that.
OK. If it's not a container service, then perhaps it's a platform like Elastic Beanstalk? Clearly not. Though Lambda does provide a kind of platform for developers, it's much simpler than Beanstalk. Once your Lambda application is deployed, for instance, it can't be accessed from the public network – unlike Beanstalk apps, which can be accessed via their REST endpoints.
So, in short, Lambda inherited some features from the EC2 Container Service and others from Elastic Beanstalk, but it's conceptually distinct from both.

What does AWS Lambda do?

Now that we’ve got a bit more clarity about what AWS Lambda is, we can discuss ways to use it. Here are some common needs:

  • Application developers writing event-driven applications want seamless integration between their AWS-based applications and the services that trigger them.
  • Streaming data from AWS services like Kinesis and DynamoDB needs processing.
  • AWS Lambda can be configured with external event timers to perform scheduled tasks.
  • Logs generated by AWS services like S3, Kinesis, and DynamoDB can be dynamically audited and tracked.

It might be helpful to take these Lambda features into account as you decide if this service is right for your project:

  • AWS Lambda works only within the AWS ecosystem.
  • AWS Lambda can be configured with external event timers, and can therefore be used for scheduling.
  • Lambda functions are stateless, so they can quickly scale.
  • More than one Lambda function can be added to a single event source.
  • AWS Lambda is fast: it will execute your code within milliseconds.
  • AWS Lambda manages all of the compute resources required for your function and also provides built-in logging and monitoring through CloudWatch.

Getting Started with AWS Lambda and DynamoDB

Now let's get our hands dirty with a simple project using AWS Lambda and DynamoDB – AWS's in-house NoSQL database. DynamoDB will be the source of our trigger, and Lambda will respond to those changes. We will use Node.js to write our function.
Here’s how it will work: if there is any change in a specified DynamoDB table, it should trigger a function that will print the event details. Let’s take it step-by-step:

1. Create a Lambda Service

  • Log in to the AWS console
  • Click on Lambda
  • You will be asked to select a blueprint. Blueprints are sample configurations of event sources and Lambda functions. You can bypass this step by clicking Skip.

  • Provide Lambda with some basic details as shown below and paste the Node.js code that you want to be triggered automatically whenever a new item is added to DynamoDB (a minimal example is sketched just after this step). Also make sure the Role you select has all of the required permissions.

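The original code listing isn't reproduced in this text, so here is a minimal sketch of the kind of handler you could paste. It simply prints the details of every record delivered by the DynamoDB stream, using the older callback-free Node.js style that was current when this walkthrough was written; the file name is just a placeholder.

```javascript
// index.js – minimal handler that prints the details of each DynamoDB stream event.
// Sketch only: representative of the code described above, not the article's original listing.
exports.handler = function (event, context) {
    console.log('Received event:', JSON.stringify(event, null, 2));

    event.Records.forEach(function (record) {
        // eventName is INSERT, MODIFY or REMOVE
        console.log(record.eventID, record.eventName);
        console.log('DynamoDB change:', JSON.stringify(record.dynamodb, null, 2));
    });

    context.succeed('Processed ' + event.Records.length + ' records.');
};
```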

Note: The selected role should have the following kind of policy attached to it:

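The screenshot of the policy isn't reproduced here. A representative choice is the AWS managed policy AWSLambdaDynamoDBExecutionRole, which allows the function to read from DynamoDB streams and write its logs to CloudWatch; it looks roughly like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:DescribeStream",
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:ListStreams",
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "*"
    }
  ]
}
```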
  • Verify the details on the next screen and click Create Function.
  • Now, if you select the Lambda service you've created and click the Event Sources tab, there will be no records yet. Once step 2 is complete, this tab should show an entry pointing to the source the Lambda function will respond to – in our case, DynamoDB.

2. Create a DynamoDB table

Follow these steps to create a new DynamoDB table:

  • Log in to the AWS console.
  • Select DynamoDB.
  • Click Create Table and fill out the form that appears.
  • Click Continue and, again, enter the appropriate details into the form. Then click "Add Index to Table".
  • Once your index has been created, you can verify it under Table Indexes.
  • Click Continue to move on to the next screen.

We don't need to make any changes there, so just click Continue. On the next screen, un-check the "Use Basic Alarms" check box (assuming you don't need any notifications).

  • Click Continue once again and you will see a verification screen. Verify that everything looks the way it should and click Create.

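If you would rather script the table creation instead of clicking through the console, here is a rough sketch using the AWS SDK for Node.js (aws-sdk v2). The table name, key attribute, and region are placeholders rather than values from the article, and the StreamSpecification enables the stream that Lambda will read from:

```javascript
// create-table.js – programmatic alternative to the console walkthrough above (sketch).
var AWS = require('aws-sdk');
var dynamodb = new AWS.DynamoDB({ region: 'us-east-1' });    // assumed region

dynamodb.createTable({
    TableName: 'LambdaDemo',                                  // hypothetical table name
    AttributeDefinitions: [{ AttributeName: 'Id', AttributeType: 'N' }],
    KeySchema: [{ AttributeName: 'Id', KeyType: 'HASH' }],
    ProvisionedThroughput: { ReadCapacityUnits: 1, WriteCapacityUnits: 1 },
    StreamSpecification: { StreamEnabled: true, StreamViewType: 'NEW_AND_OLD_IMAGES' }
}, function (err, data) {
    if (err) { console.error(err); return; }
    console.log('Stream ARN:', data.TableDescription.LatestStreamArn);
});
```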

  • Now select your new table. Go to the Streams tab and associate it with the Lambda function that you created in Step 1.

Once your Lambda function is associated, you will see its entry in the Event Sources tab of the Lambda service page.
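The same association can also be made from code. Here is a sketch, again using aws-sdk v2, with placeholder names and ARNs (substitute your own table's stream ARN and your function's name):

```javascript
// map-stream.js – associate the DynamoDB stream with the Lambda function (sketch).
var AWS = require('aws-sdk');
var lambda = new AWS.Lambda({ region: 'us-east-1' });         // assumed region

lambda.createEventSourceMapping({
    EventSourceArn: 'arn:aws:dynamodb:us-east-1:123456789012:table/LambdaDemo/stream/2015-01-01T00:00:00.000', // placeholder
    FunctionName: 'myFirstLambda',                            // hypothetical function name
    StartingPosition: 'TRIM_HORIZON',                         // read from the oldest available record
    BatchSize: 100
}, function (err, data) {
    if (err) { console.error(err); return; }
    console.log('Event source mapping created:', data.UUID);
});
```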

  • Now go to your DynamoDB table and add a new item. In our example, we added an item with the ID "10" and the Name "My First Lambda service is up and running". Once the item is added and saved, the Lambda service should trigger our function. This can be verified by viewing the Lambda logs: select the Lambda service, click the Monitoring tab, and then click View Logs in CloudWatch.
  • Select the Log Group and check the log.

The output will be something like this:

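The original log screenshot isn't reproduced here, but the logged event follows the documented DynamoDB Streams record format. For the item added above, it would look roughly like this (IDs, sequence numbers, sizes, and ARNs are illustrative):

```json
{
  "Records": [
    {
      "eventID": "1",
      "eventName": "INSERT",
      "eventVersion": "1.0",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-east-1",
      "dynamodb": {
        "Keys": { "Id": { "N": "10" } },
        "NewImage": {
          "Id": { "N": "10" },
          "Name": { "S": "My First Lambda service is up and running" }
        },
        "StreamViewType": "NEW_AND_OLD_IMAGES",
        "SequenceNumber": "111",
        "SizeBytes": 59
      },
      "eventSourceARN": "arn:aws:dynamodb:us-east-1:123456789012:table/LambdaDemo/stream/2015-01-01T00:00:00.000"
    }
  ]
}
```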
So you have successfully configured and executed an AWS Lambda function! Now, your homework is to play around with other functions triggered by other sources.
Have any thoughts or comments? Join the discussion.

Written by

I have worked as a cloud professional for the last six years in various organizations, with experience in three of the most popular cloud platforms: AWS (IaaS), Microsoft Azure, and the Pivotal Cloud Foundry PaaS. With around 10 years of IT experience in various roles, I take great interest in learning and sharing my knowledge of newer technologies. I have worn many hats – developer, lead, and architect – on cloud technology implementations. In my leisure time I enjoy good, soothing music, playing table tennis, and sweating it out in the gym. I believe sharing knowledge is my way of making this world a better place.
