Using Serverless Functions
In this course, we explore some common use cases for serverless functions and get started by implementing a serverless function in a simple application.
- [Instructor] Hello, and welcome to course four in our Getting Started with Serverless Computing learning path. In this course, we're going to extend our knowledge of serverless computing by creating a Lambda function to monitor and process files we upload to an AWS S3 bucket. Monitoring applications and files is one area where serverless functions can be really powerful, because it's easy to create triggers based on events or changes in another service. A common use case is converting files from one format to another, which is quite often an involved process to build yourself. Life is much simpler with Lambda: we can create a simple trigger on an S3 bucket to kick off a processing task automatically, and that's what we're going to do with our new function. To keep things simple, when a file is uploaded to our S3 bucket, we're going to compress it and save it as a separate .zip file. To step this out in a little more detail: we're going to watch an AWS S3 bucket, and when a new file is uploaded, we're going to compress it using some code that we'll write, and then upload that compressed file back to the same S3 bucket with a different filename. It's quite simple, but you can hopefully see how you could extend this to more complex functions. Right, so first, let's create an Amazon S3 bucket that we can use for our file repository. We're going to use this as a starting point, so let's go back into Lambda and create a new Lambda function. We want Lambda to monitor our S3 events, so we just need to select the right bucket. We want to monitor for a new file, so let's choose the put action. We want to listen only in this images folder, so we can use a prefix, and let's say we only want to listen for .png images. We enable the trigger, and for our function code, we're going to use Python 2.7. Now, we're going to use our own code block: we want to compress the file and save it under a new filename.
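The transcript doesn't include the actual function code, but a minimal sketch of a handler like the one described might look as follows. This is written for a modern Python 3 Lambda runtime rather than the Python 2.7 shown in the video, and the `zip/` output folder and helper names are assumptions for illustration:

```python
import io
import os
import urllib.parse
import zipfile

def compress_to_zip(key, body):
    """Wrap the raw object body in an in-memory .zip archive."""
    buffer = io.BytesIO()
    with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
        archive.writestr(os.path.basename(key), body)
    return buffer.getvalue()

def zipped_key(key):
    """Map e.g. images/photo.png to zip/photo.png.zip (output folder is an assumption)."""
    return "zip/" + os.path.basename(key) + ".zip"

def lambda_handler(event, context):
    # boto3 ships preinstalled in the AWS Lambda Python runtimes.
    import boto3
    s3 = boto3.client("s3")
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    # Object keys arrive URL-encoded in S3 event notifications.
    key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    s3.put_object(Bucket=bucket, Key=zipped_key(key), Body=compress_to_zip(key, body))
    return {"source": key, "output": zipped_key(key)}
```

Because the prefix and suffix filters on the trigger already restrict invocations to `.png` objects in the images folder, the handler itself can stay small and just do the compression work.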
Here I've just added some code to get us started. We're retrieving the object, reading the body, compressing it, and then pushing a new file into S3. Okay? So let's go down a few lines to where the action starts. What we want is to trigger the conversion when we upload a file into our folder, so we're taking the filename and creating a compressed file from that uploaded file. If the function is working correctly, it will recognize the change and convert the file, creating a compressed .zip file in a new .zip directory. So let's save this and go into the AWS Lambda configuration. It's important to remember that Lambda is running this function for us, and in doing that, it's taking away all of those steps like security groups and network access control lists, and running this function in a highly available way, so it's taking care of subnets, availability zones, and all of that scalability we'd generally have to configure ourselves. With all this work done by Lambda, there are still some things that need to be configured by us, and permissions is one of those things. For this use case, we're going to be both reading from and writing to Amazon S3. If you don't do anything, the console will just use a basic read role. So let's try it out and see what happens. Let's test it out, as that will help us understand how this works. Now, we can configure a test event. You can select a sample event from S3, for example, so we'll use S3 Put. We can very quickly update the bucket and key values so the test event matches our setup. Now you can see, if we save and test, we get an error saying AccessDenied. So let's upgrade the role to one that allows both read and write access to Amazon S3. This is quite a high-privilege role; in other situations, this would probably be something we'd consider carefully before enabling. With Lambda, we have some assurance that the role will only be able to run in this mode for a limited time, i.e.
the runtime of the function, so that provides some assurance in this instance. We are providing broad access to S3, although it would make sense to constrain this access to our specific bucket. Our Lambda function will now be able to get and put objects in S3. Great, when we try it again, it works. If everything is working right, I should be able to go to my bucket, upload a new .png file, and our Lambda function should recognize the change, process and compress the file, and save it out as a new zipped version of that file in our new .zip folder. Okay, now we have a new .zip file, so on the face of it, our code is working. This is just the tip of the iceberg of what you can do with AWS Lambda. The service triggers make integration really easy, and the code support for popular languages such as Python, Java, and C# makes it really easy to get started with managed functions. It's a minor consideration, but it's worth keeping in mind that you can use other code runtimes, such as PHP, Go, et cetera, though you would need to bring in those libraries yourself in that instance, since we're not running them natively. However, the good news is that you can support multiple code bases, so you get all the benefits of a managed service over machines that you have to manage. Okay, so that brings to a close our Amazon S3 processing lecture. Make sure you try the functions out for yourself in the following labs; getting your hands dirty is the best way to learn how to get the most from serverless functions.
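One final note on the earlier point about constraining the role to our specific bucket: instead of a broad read/write role, the execution role's policy could be scoped down to just the get and put actions on one bucket. A sketch of such an IAM policy (the bucket name is a placeholder) might look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::my-example-bucket/*"
    }
  ]
}
```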
About the Author
Andrew is an AWS certified professional who is passionate about helping others learn how to use and gain benefit from AWS technologies. Andrew has worked for AWS and for AWS technology partners Ooyala and Adobe. His favorite Amazon leadership principle is "Customer Obsession" as everything AWS starts with the customer. Passions around work are cycling and surfing, and having a laugh about the lessons learnt trying to launch two daughters and a few start ups.