Course Introduction
Cost Management
Improve Planning and Cost Control with AWS Budgets
AWS Cost Management: Tagging
Understanding & Optimizing Storage Costs with AWS Storage Services
Monitoring for underutilized services in AWS
Using Instance Scheduler to Optimize Resource Cost
This section of the AWS Certified Solutions Architect - Professional learning path introduces you to cost management concepts and services relevant to the SAP-C02 exam. By the end of this section, you will know how to select and apply AWS services to optimize cost in scenarios relevant to the AWS Certified Solutions Architect - Professional exam.
Learning Objectives
- Learn how to improve planning and cost control with AWS Budgets
- Understand how to optimize storage costs
- Discover AWS services that allow you to monitor for underutilized resources
- Learn how the AWS Instance Scheduler may be used to optimize resource costs
Right sizing compute typically gets all of the attention in cost optimization. However, it's just as important to choose the right storage type and provision storage resources appropriately. Right sizing storage consists of monitoring for underutilized storage resources and then either modifying or deleting those resources to reduce costs.
The first way to monitor for underutilized or idle storage resources is to use Amazon CloudWatch. CloudWatch provides out-of-the-box metrics for storage services such as Amazon EBS and Amazon S3, database services such as Amazon DynamoDB, and more.
For example, with Amazon EBS, you can use the VolumeIdleTime metric. This metric reports the number of seconds during a given time period in which no read or write operations were performed against the volume. To use it, compare the returned idle time against the period you specified: if the period is 1 minute and VolumeIdleTime is consistently around 60 seconds, the volume was idle for that entire interval.
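The comparison described above can be sketched as a small helper. This is a hypothetical function (the name and thresholds are assumptions, not part of any AWS SDK) that takes VolumeIdleTime datapoints, as you might retrieve them from CloudWatch, and decides whether the volume was idle across every period:

```python
# Hypothetical helper: decide whether an EBS volume was idle, given the
# per-period Sum values of the VolumeIdleTime metric (in seconds).
# A volume is considered idle when every datapoint is (nearly) equal to
# the full period length, meaning no reads or writes occurred.

def is_volume_idle(datapoints, period_seconds=60, tolerance=1.0):
    """datapoints: list of idle seconds per period.
    Returns True only if every period was fully idle."""
    if not datapoints:
        return False  # no data is not evidence of idleness
    return all(dp >= period_seconds - tolerance for dp in datapoints)

# Three 1-minute periods, all fully idle:
print(is_volume_idle([60.0, 59.8, 60.0]))  # True
# One period with roughly 20 seconds of I/O activity:
print(is_volume_idle([60.0, 40.2, 60.0]))  # False
```

In practice, you would feed this function the datapoints returned by a CloudWatch `GetMetricStatistics` or `GetMetricData` call over a window long enough to rule out temporary lulls in activity.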
If you find an idle EBS volume, you can snapshot it and then delete the volume if you no longer need it. You can additionally use Amazon Data Lifecycle Manager to manage the lifecycle of your EBS snapshots and delete them when they age out.
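To make the retention idea concrete, here is a minimal sketch of the kind of age-based pruning logic that Amazon Data Lifecycle Manager automates for you. The function name, the sample snapshot IDs, and the 30-day retention window are all illustrative assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of age-based snapshot retention: given snapshot
# start times, return the IDs that are older than the retention window
# and are therefore candidates for deletion.

def snapshots_to_delete(snapshots, retention_days, now):
    """snapshots: dict mapping snapshot ID -> start time (datetime).
    Returns a sorted list of IDs older than retention_days."""
    cutoff = now - timedelta(days=retention_days)
    return sorted(sid for sid, started in snapshots.items() if started < cutoff)

now = datetime(2024, 6, 30)
snaps = {
    "snap-old": datetime(2024, 5, 1),     # ~60 days old
    "snap-recent": datetime(2024, 6, 25), # 5 days old
}
print(snapshots_to_delete(snaps, retention_days=30, now=now))  # ['snap-old']
```

With Data Lifecycle Manager you declare this policy (creation schedule plus retention count or age) and AWS applies it, so you don't run code like this yourself.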
You can also use custom metrics in CloudWatch to track EBS utilization. For example, if you wanted to track the amount of provisioned volume capacity for a specific EBS volume, you can create a custom CloudWatch metric to track that information. This way you can find volumes that are underprovisioned, as well as volumes that are overprovisioned.
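One way to act on such a custom metric is to compare used capacity against provisioned capacity and classify each volume. The sketch below is an assumption-laden illustration: the watermark thresholds are arbitrary choices, not AWS-defined values.

```python
# Hypothetical sketch: classify an EBS volume from a custom utilization
# metric (used capacity vs. provisioned capacity). The 20% and 90%
# watermarks are illustrative assumptions.

def classify_volume(provisioned_gib, used_gib,
                    low_watermark=0.2, high_watermark=0.9):
    utilization = used_gib / provisioned_gib
    if utilization < low_watermark:
        return "over-provisioned"   # paying for capacity you barely use
    if utilization > high_watermark:
        return "under-provisioned"  # close to running out of space
    return "right-sized"

print(classify_volume(500, 40))   # over-provisioned (8% used)
print(classify_volume(100, 95))   # under-provisioned (95% used)
print(classify_volume(200, 120))  # right-sized (60% used)
```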
Keep in mind this is just one approach. An easier, more automated way to find under-provisioned and over-provisioned EBS volumes is to use AWS Compute Optimizer and AWS Trusted Advisor. AWS Compute Optimizer makes throughput and IOPS recommendations for General Purpose SSD volumes and IOPS recommendations for Provisioned IOPS volumes. In both cases, it identifies a list of optimal EBS volume configurations that provide cost savings and potentially better performance. With Trusted Advisor, you can identify a list of underutilized EBS volumes. Trusted Advisor also ingests data from AWS Compute Optimizer to identify volumes that may be over-provisioned.
For Amazon S3, you can use the BucketSizeBytes metric. This gives you the size of your bucket in bytes. It comes in handy if you have stray S3 buckets that aren't holding much data: using this metric, you can find and clean up those buckets very quickly.
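Finding stray buckets from this metric might look like the following sketch. The bucket names, sizes, and the 1 MiB threshold are illustrative assumptions; in practice the sizes would come from CloudWatch BucketSizeBytes datapoints.

```python
# Hypothetical sketch: list near-empty "stray" buckets, given per-bucket
# sizes as you might collect from the BucketSizeBytes metric.

def find_stray_buckets(bucket_sizes, max_bytes=1024 * 1024):
    """bucket_sizes: dict mapping bucket name -> size in bytes.
    Returns the names of buckets holding at most max_bytes of data."""
    return sorted(name for name, size in bucket_sizes.items()
                  if size <= max_bytes)

sizes = {
    "prod-data-lake": 50 * 1024**4,  # 50 TiB - keep
    "tmp-experiment": 12 * 1024,     # 12 KiB - likely abandoned
    "old-test-bucket": 0,            # empty
}
print(find_stray_buckets(sizes))  # ['old-test-bucket', 'tmp-experiment']
```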
Additionally, with Amazon S3 you can use S3 server access logs. These help you track the requests made to your bucket. Using them, you can find buckets that aren't accessed frequently, and then determine whether you still need the data in a bucket, or whether you can move it to a lower-cost storage tier or delete it. Determining access patterns this way can be a manual process, so you may decide you're more interested in a service that does it for you. And there is one - called S3 Analytics.
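As a rough illustration of the manual approach, the sketch below tallies requests per bucket from access log lines. In the S3 server access log format, the second space-delimited field is the bucket name; the sample lines here are heavily simplified, not real log records.

```python
from collections import Counter

# Hypothetical sketch: count requests per bucket from S3 server access
# log lines. Field 0 is the bucket owner; field 1 is the bucket name.

def requests_per_bucket(log_lines):
    counts = Counter()
    for line in log_lines:
        fields = line.split()
        if len(fields) >= 2:
            counts[fields[1]] += 1
    return counts

logs = [
    "79a59df900b949e5 my-photos [06/Feb/2024:00:00:38 +0000] 192.0.2.3 REST.GET.OBJECT",
    "79a59df900b949e5 my-photos [06/Feb/2024:00:01:00 +0000] 192.0.2.3 REST.GET.OBJECT",
    "79a59df900b949e5 old-archive [06/Feb/2024:00:02:11 +0000] 192.0.2.4 REST.GET.OBJECT",
]
print(dict(requests_per_bucket(logs)))  # {'my-photos': 2, 'old-archive': 1}
```

A bucket that shows few or no requests over a long window is a candidate for a lower-cost tier or deletion, subject to your retention requirements.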
You can use S3 Analytics to help you determine when to transition data to a different storage class. Using the analytics provided by this service, you can then use S3 lifecycle configurations to move data to lower-cost storage tiers or delete it, ultimately reducing your spend over time. You can also optionally use the S3 Intelligent-Tiering storage class, which analyzes when to move your data and automates the movement for you. This is best for data with unpredictable access patterns.
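To show what a lifecycle configuration looks like, here is one rule expressed as a Python dictionary in the shape the S3 `PutBucketLifecycleConfiguration` API accepts. The rule ID, the `logs/` prefix, and the day counts are illustrative assumptions; your transition schedule should follow what S3 Analytics tells you about your actual access patterns.

```python
# Hypothetical lifecycle rule: transition objects under logs/ to
# lower-cost tiers over time, then expire them after a year.

lifecycle_rule = {
    "ID": "archive-then-expire",
    "Filter": {"Prefix": "logs/"},
    "Status": "Enabled",
    "Transitions": [
        {"Days": 30, "StorageClass": "STANDARD_IA"},  # infrequent access
        {"Days": 90, "StorageClass": "GLACIER"},      # archival
    ],
    "Expiration": {"Days": 365},                      # delete after a year
}

print(lifecycle_rule["Transitions"][0]["StorageClass"])  # STANDARD_IA
```

This dictionary would go inside the `Rules` list of the lifecycle configuration you apply to the bucket.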
So, you can use CloudWatch metrics and logs to track storage utilization and access, or you can use dedicated services such as AWS Compute Optimizer and Trusted Advisor. For S3, you can use S3 Analytics or the S3 Intelligent-Tiering storage class to help monitor for data that's infrequently accessed.
Danny has over 20 years of IT experience as a software developer, cloud engineer, and technical trainer. After attending a conference on cloud computing in 2009, he knew he wanted to build his career around what was still a very new, emerging technology at the time — and share this transformational knowledge with others. He has spoken to IT professional audiences at local, regional, and national user groups and conferences. He has delivered in-person classroom and virtual training, interactive webinars, and authored video training courses covering many different technologies, including Amazon Web Services. He currently has six active AWS certifications, including certifications at the Professional and Specialty level.