AWS Logging Mechanisms
This course is Part 1 of a two-part series focusing on a number of key AWS services and how they perform logging and monitoring across your environment. Being able to monitor data provides a number of key benefits to your organization, such as compliance, incident detection and resolution, and trend analysis. Collating data and statistics about the solutions running within AWS also gives you the ability to optimize their performance. This series looks at how to implement, configure, and deploy logging and monitoring mechanisms using the following AWS services and features:
- Amazon CloudWatch - CloudWatch Monitoring Agent
- AWS CloudTrail Logs
- Monitoring CloudTrail Logs with CloudWatch Metric Filters
- Amazon S3 Access Logs
- Amazon CloudFront Access Logs
- VPC Flow Logs
- AWS Config Configuration History
- Filtering and searching data using Amazon Athena
Part 2 of this course series can be found here.
By the end of this course series you will be able to:
- Understand why and when you should enable logging of key services
- Configure logging to enhance incident resolution and security analysis
- Understand how to extract specific data from logging data sets
The content of this course is centered around security and compliance. As a result, this course is beneficial to those in the following roles, or their equivalents:
- Cloud Security Engineers
- Cloud Security Architects
- Cloud Administrators
- Cloud Support & Operations
- Compliance Managers
This is an advanced-level course series, so you should be familiar with the following services and understand their individual use cases and feature sets:
- Amazon CloudWatch
- AWS CloudTrail
- Amazon EC2
- AWS Config
- Amazon S3
- EC2 Systems Manager (SSM)
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
Hello, and welcome to this lecture, where I shall be looking at Amazon S3 access logs, what they are, and what they contain. As you may have guessed from the name, Amazon S3 access logs collate data about who has been accessing a particular S3 bucket. These logs record information such as the source bucket that was accessed, a timestamp of the event, the identity requesting access to the object in the bucket, and the action they were performing on the object.
By default, when you create a new bucket, access logging is not enabled. However, should you have a requirement to understand who is accessing your S3 buckets, then this logging can quickly and easily be enabled. The configuration of this process is based upon a source bucket and a target bucket. The source bucket is the bucket for which you want to log access requests. The target bucket is the bucket to which the access logs will be delivered. It's best practice to use different buckets for the source and the target for ease of management. When configuring your buckets for logging, you need to be aware that the source and target buckets must be in the same region.
To allow S3 to write logs to this target bucket, it will of course require specific permissions. These permissions allow write access for the Log Delivery group, a predefined Amazon S3 group used to deliver log files to your target buckets. If access logging is configured using the management console, then the setup process automatically adds the Log Delivery group to the ACL of the target bucket, allowing the relevant access. However, if you were to configure access logging using the command line, then you would need to configure these permissions manually.
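As a rough sketch of what that manual step involves, the snippet below builds the ACL grants that give the Log Delivery group access to a target bucket. The group URI is S3's predefined Log Delivery group; the bucket name in the commented boto3 call is a placeholder, and the exact permissions you grant should be checked against the S3 documentation.

```python
# Sketch: granting the S3 Log Delivery group access to a target bucket,
# as required when enabling access logging outside the console.
# "my-target-bucket" below is an illustrative placeholder.

LOG_DELIVERY_URI = "http://acs.amazonaws.com/groups/s3/LogDelivery"

def log_delivery_grants():
    """Build the ACL grants that give the predefined Log Delivery
    group the WRITE and READ_ACP permissions it needs on the
    target bucket."""
    grantee = {"Type": "Group", "URI": LOG_DELIVERY_URI}
    return [
        {"Grantee": grantee, "Permission": "WRITE"},
        {"Grantee": grantee, "Permission": "READ_ACP"},
    ]

# With boto3, these grants would be appended to the bucket's existing
# ACL rather than replacing it, for example:
#
#   import boto3
#   s3 = boto3.client("s3")
#   acl = s3.get_bucket_acl(Bucket="my-target-bucket")
#   acl["Grants"].extend(log_delivery_grants())
#   s3.put_bucket_acl(
#       Bucket="my-target-bucket",
#       AccessControlPolicy={"Owner": acl["Owner"], "Grants": acl["Grants"]},
#   )
```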
Setting up access logs using the console is a very simple process. Firstly, select the S3 bucket that you would like to capture access logs for, select the Properties tab, select Server access logging, and choose Enable logging. From the dropdown, select your target bucket, which is the bucket the logs will be delivered and saved to. Enter a prefix for your log files if required, and click Save. If you then look at the ACL permissions for that bucket, you'll notice that the Log Delivery group has automatically been given write access. If you want to enable logging on the bucket programmatically, you can do so using the S3 API or the AWS SDKs. When doing so, you need to configure write access for the Log Delivery group on the target bucket as an additional action. More information on how to perform these steps can be found using the following link.
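To illustrate the programmatic route, here is a minimal sketch of the logging configuration S3 expects when you enable server access logging via the API. The bucket names and prefix are illustrative placeholders, and the commented boto3 call assumes the Log Delivery group has already been granted write access on the target bucket.

```python
# Sketch: the BucketLoggingStatus structure passed to S3 when enabling
# server access logging programmatically. Bucket names and prefix are
# illustrative placeholders.

def build_logging_config(target_bucket, target_prefix=""):
    """Return the BucketLoggingStatus structure that S3 expects,
    pointing delivered logs at the given target bucket and prefix."""
    return {
        "LoggingEnabled": {
            "TargetBucket": target_bucket,
            "TargetPrefix": target_prefix,
        }
    }

# Applying it with boto3 (the Log Delivery group must already have
# write access on the target bucket):
#
#   import boto3
#   s3 = boto3.client("s3")
#   s3.put_bucket_logging(
#       Bucket="my-source-bucket",
#       BucketLoggingStatus=build_logging_config("my-target-bucket", "logs/"),
#   )
```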
When viewing your logs within the target bucket, you'll notice that each entry is made up of a number of different parameters. Let me show you an example of an access log file to demonstrate the data that makes up an entry.
Start of demonstration
Okay, so I've just opened up one of the S3 access logs that I have, just a very small snippet. We can see at the top here that we've got a couple of entries, so let's run through this top entry and some of the key bits of information that make up the access log. This first entry here is the canonical ID of the owner of the source bucket. Next we have the S3 bucket itself that was accessed, along with the date and time as well. Next we have some information from the requester, which is their source IP address. This hyphen means that the user was unauthenticated; if it was a user within IAM, then we'd see their user ID there. This set of characters is generated by AWS as a unique ID for this request. Next we have the action that was carried out, which is a GET object request against the object within the bucket, which is this image file here. And again, we can see that here, and we also have this response code of 200. We also have some information here regarding the number of bytes sent and the object size, as well as some timings in milliseconds for that request. We also have the referrer here of cloudacademy.com, where the request initially came from, and then finally some information regarding the requester's application software. So that's a very quick summary and example of what an entry looks like within your AWS S3 access logs.
End of demonstration
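The fields walked through in the demonstration can also be split programmatically. A plain whitespace split breaks the entry apart, because the timestamp is wrapped in square brackets and the request URI, referrer, and user agent are quoted, and all of those contain spaces. The sketch below handles that; the sample entry is synthetic, shaped like the one shown in the demonstration rather than copied from it.

```python
import re

# Sketch: splitting an S3 server access log entry into its fields.
# Entries are space-delimited, but bracketed and quoted fields
# contain spaces, so a plain split() would break them apart.
FIELD_RE = re.compile(r'\[[^\]]*\]|"[^"]*"|\S+')

def parse_log_entry(line):
    """Return the fields of one access log entry, keeping the
    bracketed timestamp and quoted fields intact."""
    return FIELD_RE.findall(line)

# A synthetic entry in the same shape as the demonstration's example:
sample = (
    '79a59df900b949e55d96a1e698fbacedfd6e09d98eacf8f8d5218e7cd47ef2be '
    'my-source-bucket [06/Feb/2019:00:00:38 +0000] 192.0.2.3 - '
    '3E57427F33A59F07 REST.GET.OBJECT photo.jpg '
    '"GET /my-source-bucket/photo.jpg HTTP/1.1" 200 - 1024 1024 12 10 '
    '"https://cloudacademy.com" "Mozilla/5.0" -'
)

fields = parse_log_entry(sample)
# fields[0] is the bucket owner's canonical ID, fields[2] the bracketed
# timestamp, fields[6] the operation, and fields[9] the HTTP status.
```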
Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data center and network infrastructure design, to cloud architecture and implementation.
To date, Stuart has created 150+ courses relating to Cloud reaching over 180,000 students, mostly within the AWS category and with a heavy focus on security and compliance.
Stuart is a member of the AWS Community Builders Program for his contributions towards AWS.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.