
Monitor for Underutilized Storage Resources

Contents

  • Cost Management KPIs
  • Cost Management: Tagging (6m 51s)
  • AWS Cost Allocation (9m 6s)

Overview

Difficulty: Advanced
Duration: 43m
Students: 8
Description

This course covers the core learning objectives required to meet the 'Implementing effective cost management solutions in AWS - Level 3' skill.

Learning Objectives:

  • Apply a cost allocation tag strategy that allows AWS resources to map to business units
  • Create a way to plan AWS costs to prevent them from exceeding a budgeted amount
  • Evaluate a mechanism to monitor when underutilized AWS resources are present to optimize costs
Transcript

Right sizing compute typically gets all of the attention in cost optimization. However, it’s just as important to choose the right storage type and provision storage resources appropriately as well. Right sizing storage consists of monitoring for underutilized storage resources and then either modifying or deleting these resources to reduce costs.

The first way to monitor for underutilized or idle storage resources is to use Amazon CloudWatch. CloudWatch provides out-of-the-box metrics for storage services like Amazon EBS and Amazon S3, database services like Amazon DynamoDB, and more.

For example, with Amazon EBS, you can use the VolumeIdleTime metric. This metric reports the number of seconds in a given time period during which no read or write operations were performed against the volume. To interpret it, compare the returned idle time against the period you specified. So if the period is 1 minute and the VolumeIdleTime metric is consistently around 60 seconds, that means the volume was idle for that entire interval.
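The comparison described above can be sketched as a small helper. This is a minimal sketch, assuming the VolumeIdleTime datapoints have already been retrieved from CloudWatch as per-period sums in seconds; the 95% idle threshold is an arbitrary assumption, not an AWS recommendation.

```python
# Sketch: flag a volume as idle when its VolumeIdleTime datapoints show it
# idle for (nearly) the whole period. Datapoints are assumed to be Sum
# values in seconds over a 60-second period, as CloudWatch would return;
# the 95% threshold is an arbitrary choice.

def is_volume_idle(idle_seconds, period_seconds=60, threshold=0.95):
    """Return True if every datapoint shows >= threshold of the period idle."""
    if not idle_seconds:
        return False
    return all(s / period_seconds >= threshold for s in idle_seconds)

# A volume idle for essentially all of each 1-minute period:
print(is_volume_idle([60, 60, 59.5]))  # True -> candidate for snapshot/delete
print(is_volume_idle([60, 12, 60]))    # False -> saw I/O in one period
```

A real implementation would feed this from CloudWatch's GetMetricStatistics results and look at a window of hours or days, not minutes, before acting on a volume.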

If you have an idle EBS volume, you can then choose to snapshot the volume and terminate it if you no longer need it. You can additionally use Amazon Data Lifecycle Manager to manage the lifecycle of your EBS snapshots and delete them. 
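As a sketch of what that lifecycle management looks like, here is the shape of a Data Lifecycle Manager policy that snapshots tagged volumes daily and prunes old snapshots. The tag key/value ("backup": "true"), schedule time, and retention count are all assumptions for illustration.

```python
# Sketch of an Amazon Data Lifecycle Manager policy: snapshot EBS volumes
# carrying an assumed "backup: true" tag every 24 hours, and keep only the
# 7 most recent snapshots. In practice this dict would be passed as the
# PolicyDetails argument to boto3.client("dlm").create_lifecycle_policy().

policy_details = {
    "ResourceTypes": ["VOLUME"],
    "TargetTags": [{"Key": "backup", "Value": "true"}],
    "Schedules": [
        {
            "Name": "DailySnapshots",
            "CreateRule": {"Interval": 24, "IntervalUnit": "HOURS", "Times": ["03:00"]},
            # Snapshots beyond the most recent 7 are deleted automatically
            "RetainRule": {"Count": 7},
        }
    ],
}
```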

You can also use custom metrics in CloudWatch to track EBS utilization. For example, if you wanted to track the amount of provisioned volume capacity for a specific EBS volume, you can create a custom CloudWatch metric to track that information. This way you can find volumes that are underprovisioned, as well as volumes that are overprovisioned.
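The classification step that follows from such a custom metric can be sketched as below. The inputs (provisioned size and a hypothetical used-capacity figure published as a custom metric) and the 20%/80% utilization bands are assumptions, not AWS guidance.

```python
# Sketch: classify a volume as over- or under-provisioned by comparing an
# assumed custom metric of used capacity against the provisioned size.
# The 20%/80% bands are arbitrary illustration values.

def classify_volume(provisioned_gib, used_gib, low=0.20, high=0.80):
    utilization = used_gib / provisioned_gib
    if utilization < low:
        return "over-provisioned"   # paying for capacity that sits unused
    if utilization > high:
        return "under-provisioned"  # nearly full; may need to grow
    return "right-sized"

print(classify_volume(500, 40))   # 8% used -> over-provisioned
print(classify_volume(100, 92))   # 92% used -> under-provisioned
```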

Keep in mind this is just one way. An easier, more automated way to track under-provisioned and over-provisioned EBS volumes is to use AWS Compute Optimizer and AWS Trusted Advisor. AWS Compute Optimizer makes throughput and IOPS recommendations for General Purpose SSD volumes, and IOPS recommendations only for Provisioned IOPS volumes, identifying optimal EBS volume configurations that provide cost savings and potentially better performance. With Trusted Advisor, you can identify a list of underutilized EBS volumes. It also ingests data from AWS Compute Optimizer to identify volumes that may be over-provisioned as well.

For Amazon S3, you can use the BucketSizeBytes metric. This gives you the size of your bucket in bytes. It comes in handy if you have stray S3 buckets that aren't holding much data: using this metric, you can find and clean up those buckets very quickly.
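The cleanup scan might look something like this sketch. It assumes the BucketSizeBytes values have already been pulled from CloudWatch (one datapoint per bucket); the 1 MiB cutoff and the bucket names are made-up examples.

```python
# Sketch: given BucketSizeBytes values (assumed already fetched from
# CloudWatch, one value per bucket), list buckets smaller than a cutoff
# as cleanup candidates. The 1 MiB cutoff is an arbitrary assumption.

def stray_buckets(bucket_sizes, max_bytes=1024 * 1024):
    """Return bucket names whose stored size is below max_bytes."""
    return sorted(name for name, size in bucket_sizes.items() if size < max_bytes)

sizes = {"app-logs": 7_500_000_000, "tmp-export": 12_288, "old-test": 0}
print(stray_buckets(sizes))  # ['old-test', 'tmp-export']
```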

Additionally, with Amazon S3 you can use S3 server access logs. These help you track the requests made to your bucket. Using them, you can find buckets that aren't accessed frequently, and then determine whether you still need the data in those buckets, or whether you can move it to a lower-cost storage tier or delete it. Determining access patterns this way can be a manual process, so you may prefer a service that does it for you. And there is one, called S3 Analytics.
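The manual log analysis might start with something like this sketch, which pulls the bucket name and timestamp out of each access log line to find the most recent request per bucket. The log lines shown are fabricated examples following the documented space-delimited format (bucket owner, bucket, bracketed timestamp, then further fields).

```python
# Sketch: extract bucket name and request timestamp from S3 server access
# log lines, and track the most recent access per bucket. Log lines below
# are fabricated samples; real lines contain many more fields.

import re
from datetime import datetime

# field 1: bucket owner, field 2: bucket, field 3: [dd/Mon/yyyy:HH:MM:SS +0000]
LINE_RE = re.compile(r"^\S+ (\S+) \[([^\]]+)\]")

def last_access(log_lines):
    latest = {}
    for line in log_lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # skip lines that don't match the expected format
        bucket, ts = m.group(1), m.group(2)
        when = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
        if bucket not in latest or when > latest[bucket]:
            latest[bucket] = when
    return latest

logs = [
    "79a5 my-bucket [06/Feb/2019:00:00:38 +0000] 192.0.2.3 ...",
    "79a5 my-bucket [09/Mar/2021:10:15:00 +0000] 192.0.2.4 ...",
]
print(last_access(logs)["my-bucket"].year)  # 2021
```

Buckets whose latest access is older than some cutoff would then be candidates for tiering or deletion.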

You can use S3 Analytics to help you determine when to transition data to a different storage class. Using the analysis this service provides, you can then use S3 Lifecycle configurations to move data to lower-cost storage tiers or delete it, ultimately reducing your spend over time. You can also optionally use the S3 Intelligent-Tiering storage class, which analyzes access patterns and moves your data automatically. This is best for data with unpredictable access patterns.
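A lifecycle configuration informed by those findings might look like this sketch. The prefix and day counts are assumptions for illustration; this dict matches the shape boto3's put_bucket_lifecycle_configuration accepts as its LifecycleConfiguration argument.

```python
# Sketch of an S3 Lifecycle configuration: transition objects under an
# assumed "logs/" prefix to lower-cost tiers as they age, then expire
# them. Day counts are illustrative, not recommendations.

lifecycle_configuration = {
    "Rules": [
        {
            "ID": "age-out-logs",
            "Status": "Enabled",
            "Filter": {"Prefix": "logs/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
                {"Days": 90, "StorageClass": "GLACIER"},      # archival
            ],
            "Expiration": {"Days": 365},  # delete after a year
        }
    ]
}
```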

So, you can use CloudWatch metrics and logs to track storage utilization and access, or dedicated services such as AWS Compute Optimizer and Trusted Advisor. For S3, you can use S3 Analytics or the S3 Intelligent-Tiering storage class to help monitor for data that's infrequently accessed.

About the Author

Alana Layton is an experienced technical trainer, technical content developer, and cloud engineer living out of Seattle, Washington. Her career has included teaching about AWS all over the world, creating AWS content that is fun, and working in consulting. She currently holds six AWS certifications. Outside of Cloud Academy, you can find her testing her knowledge in bar trivia, reading, or training for a marathon.