Introduction to Amazon CloudWatch
Amazon CloudWatch Operations
Amazon EventBridge
AWS CloudTrail
AWS Config
AWS Logging
Searching Data
AWS Cost Management Tools
This section of the SysOps Administrator - Associate learning path introduces you to the different monitoring and reporting services and tools that are relevant to the SOA-C02 exam. We look at both the monitoring of your infrastructure and the reporting of your bills.
Learning Objectives
- Understand how Amazon CloudWatch is used to monitor the performance of your infrastructure
- Learn how to identify anomalies in your infrastructure using Amazon CloudWatch
- Learn how Amazon EventBridge makes it easier to build event-driven applications at scale
- Learn about the different methods of logging that are available
- Understand how to review your costs and optimize them going forward
The AWS Cost and Usage Report, or CUR for short, is the single most important mechanism for capturing your AWS billing data. It is a fairly complex CSV file that stores the full details of your cost and usage data across all AWS resources.
Enabling the CUR is important because it is the most granular and detailed mechanism for collecting AWS cost and usage data. It offers historical, hour-by-hour data that can clarify trends and lead to more accurate, data-driven insights. And there's no looking back: until the CUR is enabled, you're losing valuable data about any usage that is older than 12 months.
The CUR can get really big: in large corporations, it can easily exceed five gigabytes, with millions upon millions of lines. So let's see how to enable it.
In the AWS Management Console, open the Billing dashboard from the top menu, and you'll find Cost and Usage Reports in the left-hand menu. It's disabled by default, so you need to enable it first and create a report. We give it a name; let's call it 'test'. I would advise you to include resource IDs, because then every resource gets a unique resource ID in the report.
You can enable the automatic refresh. Click Next, and then you need to choose an S3 bucket where the files will be delivered. This can be either an existing S3 bucket or a new one. Then we set a report path prefix, say 'costs'. You can select the time granularity here. Of course, the more granular the data, the more data you're going to produce.
We can create new report versions or overwrite the existing ones, and we can choose what kind of data integration we need.
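If you prefer to script these steps rather than use the console, here is a minimal sketch using boto3 and the Cost and Usage Report API. The report name, bucket, and prefix are placeholder values mirroring the walkthrough above, not required names.

```python
import boto3

# The Cost and Usage Report API is only available in us-east-1.
cur = boto3.client("cur", region_name="us-east-1")

# A minimal report definition mirroring the console options above.
# "my-cur-bucket" and "costs" are hypothetical placeholders.
cur.put_report_definition(
    ReportDefinition={
        "ReportName": "test",
        "TimeUnit": "HOURLY",                       # time granularity
        "Format": "textORcsv",                      # CSV output ("Parquet" for Athena)
        "Compression": "ZIP",
        "AdditionalSchemaElements": ["RESOURCES"],  # include resource IDs
        "S3Bucket": "my-cur-bucket",                # must exist with the required bucket policy
        "S3Prefix": "costs",
        "S3Region": "us-east-1",
        "AdditionalArtifacts": ["REDSHIFT", "QUICKSIGHT"],
        "RefreshClosedReports": True,               # automatic refresh
        "ReportVersioning": "CREATE_NEW_REPORT",    # or "OVERWRITE_REPORT"
    }
)
```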
And a little side note here: you can export the files for Redshift or QuickSight usage. This changes the output format of the file so that it is readable by either Athena, or by Redshift and QuickSight. For Athena it's Parquet, and for Redshift or QuickSight it's a CSV file that comes in a ZIP archive.
Athena is a serverless service that allows you to analyze data stored directly in Amazon S3 using standard SQL, and for that you need, as I just mentioned, the Parquet format. With Redshift and QuickSight, you work with the CSV file much as you would in, for example, Excel.
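To make the Athena option concrete, here is a minimal sketch that runs a standard SQL query against a CUR table via boto3. The database name, table name, and results bucket are hypothetical placeholders; the column names follow the standard CUR schema.

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Sum the unblended cost per service for one month of CUR data.
# "cur_db", "cur_table", and the results bucket are placeholder names.
query = """
SELECT line_item_product_code,
       SUM(line_item_unblended_cost) AS cost
FROM cur_table
WHERE year = '2023' AND month = '6'
GROUP BY line_item_product_code
ORDER BY cost DESC
"""

athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "cur_db"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
```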
Redshift is a so-called data warehouse service, which you would use for querying big data sets of multiple gigabytes up to petabytes. It can help you gain broad insights into your own environment, and for customers operating at very large scale. And QuickSight is a business intelligence service that can combine data from virtually any source into a dashboard. It helps you visualize data for any type of audience, and it is much more visually driven than Athena or Redshift.
After you set up all these options, you can click Next, and that's basically it: you have now configured your Cost and Usage Report. But be aware that it may take up to 24 hours for the first report to be delivered. Also, expect some S3 costs for storing the CUR data in your bucket, but these are very low, perhaps a few dollars per month or per year, depending on how big your files get.
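Since the first delivery can take up to 24 hours, a quick way to check whether a report has landed is to list the delivery prefix in S3. The bucket and prefix below match the placeholders used in the earlier sketch.

```python
import boto3

s3 = boto3.client("s3")

# List objects under the report prefix to see if a delivery has arrived.
resp = s3.list_objects_v2(Bucket="my-cur-bucket", Prefix="costs/")

for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```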
Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.
To date, Stuart has created 150+ courses relating to cloud computing, reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.
Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.