Creating a Lifecycle Configuration Demo

Contents

  • SAA-C03 Introduction
  • AWS Storage
  • Introduction to Amazon EFS
  • Amazon EC2
  • Amazon Elastic Block Store (EBS)
  • Optimizing Storage
  • AWS Backup
  • Running Operations with the Snow Family
  • Data Transfers with AWS DataSync
  • SAA-C03 Review
  • Storage Summary
Difficulty
Beginner
Duration
3h 30m
Students
3563
Ratings
4.8/5
Description

This section of the Solution Architect Associate learning path introduces you to the core storage concepts and services relevant to the SAA-C03 exam. We start with an introduction to AWS storage services, review the options available, and learn how to select and apply them to meet specific requirements.


Learning Objectives

  • Obtain an in-depth understanding of Amazon S3 - Simple Storage Service
  • Get both a theoretical and practical understanding of EFS
  • Learn how to create an EFS file system, manage EFS security, and import data into EFS
  • Learn about EC2 storage and Elastic Block Store
  • Learn about the services available in AWS to optimize your storage
  • Learn how to use AWS DataSync to move data between storage systems and AWS storage services
Transcript

In this video, I’ll create an S3 Lifecycle configuration using both the AWS Console and the AWS CLI. This lifecycle configuration will transition a subset of the objects in an S3 bucket to a lower-cost storage class, and eventually expire them.

In this S3 bucket, I have a combination of cat photos and beach photos. I currently access my cat photos all of the time. But for my beach photos, I find I only access them frequently for the first month. After that period, I still want to keep them in storage for a year, just in case anyone asks me about my vacation. However, no one ever asks, so I might as well store them in a low-cost storage class after that first 30 days. And after a year, I’ll take a new vacation, so I can just delete the old beach photos. Using this information, I can create a lifecycle configuration based on my known access patterns.

I’ll begin in the S3 console, and I’ll click on my bucket named lifecycle-configuration-demo-console. Inside my bucket, you can see my objects: a wide range of cat photos, and then I have a folder called beach. If I click on this folder, I can then see my beach photos.

To create a lifecycle configuration to move my beach photos, I can click on the management tab, and under lifecycle rules, I can click “create lifecycle rule”.

Here, I’ll enter the rule name; in this case, I’ll call it BeachPhotosRule. Then I can choose to either limit the rule’s scope using filters or apply it to all objects in the bucket. Since I want to leave my cat photos untouched, I’ll limit it based on a filter. And under prefix, I’ll type in my folder name, beach/.

Notice how I can choose to filter based on object tag, or minimum and maximum object size as well. However, for this demo, I’ll only be filtering by prefix. Moving down to the actions section, I can choose where my objects go. 
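For reference, here’s a rough sketch of how a combined filter looks in the JSON that the CLI accepts; the tag and the size thresholds below are made-up examples for illustration, not part of this demo:

    "Filter": {
        "And": {
            "Prefix": "beach/",
            "Tags": [
                { "Key": "photo-type", "Value": "vacation" }
            ],
            "ObjectSizeGreaterThan": 1024,
            "ObjectSizeLessThan": 10485760
        }
    }

When more than one condition is combined, they’re wrapped in an And block, and an object must match all of them for the rule to apply. The size thresholds are given in bytes.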

In this case, I don’t have versioning turned on with this bucket and I’m not using multipart uploads, so I only need to select the option to transition current versions of objects and the option to expire current versions of objects. 

To transition my beach photos, I can choose which storage class I’d like to move them to after 30 days. Since I’ll rarely be accessing my photos, an archival storage class is the best option. For this demo, I’ll choose Glacier Flexible Retrieval. And I’ll specify that this transition should happen 30 days after the object is created.  

It’s giving me a warning saying I’ll incur a per-object fee to transition my objects to Glacier, as well as an additional object metadata storage charge, since the S3 service adds 32 KB of metadata per object that’s used when restoring objects. Even though it adds some cost, this is fine with me, so I’ll check the box and move on.
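To put that metadata charge in perspective, a quick back-of-the-envelope calculation, assuming, say, 1,000 beach photos: 1,000 × 32 KB is about 32 MB of additional metadata, which at archival storage prices is a negligible cost next to the savings from moving the photos themselves out of S3 Standard.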

Now I get to choose when to delete my objects. S3 Glacier Flexible Retrieval has a 90-day minimum storage duration, so as long as my objects sit in that storage class for at least 90 days before they’re deleted, I won’t incur an early deletion fee. Since I take my vacations yearly, I’ll choose to delete my photos 365 days after creation. With the transition happening at day 30, that leaves them in Glacier for 335 days, comfortably past the 90-day minimum.

And that’s all I need to set up. If I weren’t talking through all the options, this would take no time at all. And the best part is that it generates a timeline I can review, to make sure I’ve made all the right choices before clicking “create rule”.

Now, for those of you who don’t enjoy clicking around the AWS console: creating this using the CLI is just as easy. One thing to note about the CLI is that you have to provide a JSON file defining your configuration. If you’re more comfortable or familiar with XML, you can write the configuration in XML and use any of the free XML-to-JSON converter websites to do the conversion for you.

However, I’ve already created a JSON file, and I’ll run through it quickly. I have the ID, or name, of the rule, which is BeachPhotosRule. Then I have the filter, which uses the prefix of my folder, beach/. The status is Enabled. I have a transition to the S3 Glacier Flexible Retrieval storage class after 30 days. And I’m deleting my beach photos after 365 days.
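The file itself is shown on screen; reconstructed from that description, it would look something like this (note that in the API, Glacier Flexible Retrieval still goes by its original storage class name, GLACIER):

    {
        "Rules": [
            {
                "ID": "BeachPhotosRule",
                "Filter": {
                    "Prefix": "beach/"
                },
                "Status": "Enabled",
                "Transitions": [
                    {
                        "Days": 30,
                        "StorageClass": "GLACIER"
                    }
                ],
                "Expiration": {
                    "Days": 365
                }
            }
        ]
    }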

As you can see, it’s exactly the same lifecycle configuration. Going to my terminal, I can then use the following command.

And then I give the file path for my lifecycle configuration, beginning with file:// and then the file name. And then I can specify which bucket to apply this to, using --bucket, and then my bucket name, which in this case is lifecycle-configuration-demo-cli.
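The command in question is s3api put-bucket-lifecycle-configuration; assembled, it looks something like this (the file name lifecycle.json is an assumption for illustration; use whatever you named your JSON file):

    aws s3api put-bucket-lifecycle-configuration \
        --lifecycle-configuration file://lifecycle.json \
        --bucket lifecycle-configuration-demo-cli

A successful call returns no output, which is why the next step is to verify the result.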

And it looks like it was successful. To verify, I can go back to the S3 console, find my demo-cli bucket, click on the management tab, click on the lifecycle rule, and verify that the details are correct.
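If you’d rather verify from the terminal as well, the matching read command is get-bucket-lifecycle-configuration, which returns the rules currently applied to the bucket as JSON:

    aws s3api get-bucket-lifecycle-configuration \
        --bucket lifecycle-configuration-demo-cli

And that’s it for this one - see you next time!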

About the Author
Students
220305
Labs
1
Courses
213
Learning Paths
174

Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.

To date, Stuart has created 150+ courses relating to cloud computing, reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.

Stuart is a member of the AWS Community Builders Program for his contributions to the AWS community.

He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.

In January 2016, Stuart was awarded the ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge sharing around cloud services with the community.

Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.