Authentication, Authorization & Accounting
Cloud security is a huge topic, mainly because it has so many different areas of focus. This course focuses on three fundamental areas: AWS authentication, authorization, and accounting.
These three topics are closely linked, and understanding the different security controls from an authentication and authorization perspective can help you design the correct level of security for your infrastructure. Once an identity has been authenticated and is authorized to perform specific functions, it's then important that this access can be tracked with regard to usage and resource consumption, so that it can be audited, accounted, and billed for.
The course will define and discuss each area, and iron out any confusion between the various security terms. Some people are unaware of the differences between authentication, authorization, and access control; this course clearly explains those differences, allowing you to use the correct terms to describe your security solutions.
From an AWS authentication perspective, a number of different mechanisms are explained, such as Multi-Factor Authentication (MFA), Federated Identity, Access Keys, and Key Pairs. With the help of demonstrations, you can learn how to apply access keys to your AWS CLI for programmatic access and understand the differences between Linux and Windows authentication methods using AWS Key Pairs.
When we dive into understanding authorization we cover IAM Users, Groups, Roles and Policies, providing examples and demonstrations. Within this section, S3 authorization is also discussed, looking at access control lists (ACLs) and Bucket Policies. Moving on from S3, we look at network and instance level authorization with the help of Network Access Control Lists (NACLs) and Security Groups.
Finally, the Accounting section will guide you through the areas of Billing & Cost Management that you can use to help identify potential security threats. In addition to this, we explain how AWS CloudTrail can be used to track API calls to analyse what users are doing and when. This makes CloudTrail a strong tool in tracking, identifying and monitoring a user's actions within your AWS environment.
Hello, and welcome to this lecture on AWS accounting.
Firstly, I want to clarify what this topic covers, and what I mean by accounting. By accounting, I refer to tracking what actions a user performs during their session while authenticated to the AWS environment. This information can then be used to determine the costs incurred by that identity. This can include information such as resources used, the amount of data transferred in and out of AWS, and any other actions with cost implications. This information can all be used to assist with billing, planning, and ensuring resource capacity and utilization is optimized.
Let's start by understanding the billing process, and how this is calculated for an AWS account. When it comes to monitoring and managing your AWS financial information, you will want to ensure this privilege is restricted to only those authorized to do so. By default, your AWS account owner has full access to all billing information; no other users initially have access. The first time your root account tries to access the billing information, you will need to perform a few steps, as it's not automatically activated and isn't initially visible. Firstly, log in to the AWS Management Console as the AWS account owner, or root. Select your account name in the top right, and select My Account. Select Edit, next to IAM User Access to Billing Information, and select the checkbox to activate it. The root account can now access the Billing and Cost Management page within the console by selecting your user name and account in the top right corner, and selecting Billing and Cost Management. From here, you are able to access information such as your bills, budgets, reports, payment history, and method of payment. Clearly this is very sensitive information, so if you require other users to access billing information, you have to grant specific permissions via an IAM policy.
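To make this concrete, here is a minimal sketch of what an IAM policy granting read-only billing access might look like, built as a Python dictionary. The `aws-portal:ViewBilling` and `aws-portal:ViewUsage` actions are real IAM billing actions, but treat this exact policy as illustrative; verify the action names your account needs against current AWS documentation.

```python
import json

# Illustrative IAM policy document granting read-only billing access.
# The action names are examples; confirm them against AWS's IAM reference.
billing_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowViewBilling",
            "Effect": "Allow",
            "Action": ["aws-portal:ViewBilling", "aws-portal:ViewUsage"],
            "Resource": "*",
        }
    ],
}

# This JSON is what you would paste into the IAM console when creating the policy.
print(json.dumps(billing_policy, indent=2))
```

A policy like this would then be attached to the specific IAM users or groups that need billing visibility, keeping everyone else locked out by default.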
Let's take a high-level look at some of the most common topics within Billing and Cost Management: Bills, Cost Explorer, Budgets, Reports, Cost Allocation Tags, and Consolidated Billing.
If we look at the Bills section, we will see precisely what we'd expect: AWS bills for the account. From here we can see our current monthly expenditure for AWS, converted to your local currency using the exchange rate valid at that time. We also have the ability to drill down into where these costs have been incurred at a service level, and we can navigate back in history to see previous months' bills if we need to. If we want to interact with this billing data, we can download the entire bill as a CSV file, for example to analyze it within Microsoft Excel.
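As a quick sketch of interacting with that downloaded data programmatically rather than in Excel, the snippet below totals cost per service from a hypothetical, heavily trimmed bill CSV. Real AWS bill exports contain many more columns, and the column names here are illustrative only.

```python
import csv
import io
from collections import defaultdict

# Hypothetical extract of a downloaded AWS bill CSV; real exports have many
# more columns, but the service-name/cost shape is similar.
bill_csv = """ProductName,CostBeforeTax
Amazon EC2,120.50
Amazon S3,14.25
Amazon EC2,30.00
"""

# Sum the cost rows per service to see where spend is concentrated.
cost_per_service = defaultdict(float)
for row in csv.DictReader(io.StringIO(bill_csv)):
    cost_per_service[row["ProductName"]] += float(row["CostBeforeTax"])

print(dict(cost_per_service))
```

The same approach scales to a full bill file: point `csv.DictReader` at the downloaded file instead of the inline string.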
Moving on to Cost Explorer, this is a useful and powerful tool within Billing and Cost Management. It allows you to view historical billing information in a graphical format, giving you greater insight into your AWS spend and making it a valuable tool for identifying where you should focus your cost optimization efforts. It also has the ability to forecast your estimated spend up to two months ahead, using existing data as a reference. If you can see that your estimated future bills are becoming too high, you will have time to identify where you can initiate cost-reduction measures to help mitigate the risk.
Cost Explorer comes configured with three pre-defined views, which are commonly used to analyze spending across your account. Monthly Spend by Service View: this covers the current and previous two months, and is grouped by AWS services. Monthly Spend by Linked Accounts View: this covers the current and previous two months, and is grouped by linked accounts. Daily Spend View: this covers the daily spend over the previous 60 days. When viewing these graphs and charts, you can easily notice unexpected peaks of service usage that are out of trend, which could highlight possible misuse of services within your account. This could be caused by a whole host of security concerns, such as incorrect permissions set on users or other identities that have more authorization than they should have, creating large EC2 instances, for example.
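The kind of out-of-trend peak you would spot by eye in the Daily Spend view can also be flagged programmatically. The toy sketch below marks any day whose spend exceeds the mean plus two standard deviations; both the daily figures and the threshold choice are hypothetical, purely for illustration.

```python
from statistics import mean, stdev

# Hypothetical daily spend figures in USD, as might be read from the
# Cost Explorer Daily Spend view. Day 5's figure is a deliberate outlier.
daily_spend = [42.0, 40.5, 44.1, 41.8, 43.0, 118.6, 42.2]

# Flag days more than two standard deviations above the mean; the "2 sigma"
# cut-off is an arbitrary illustrative choice, not an AWS recommendation.
threshold = mean(daily_spend) + 2 * stdev(daily_spend)
anomalies = [(day, cost) for day, cost in enumerate(daily_spend) if cost > threshold]

print(anomalies)
```

A spike flagged this way is only a starting point; the next step is working out which identity or service caused it, which is where CloudTrail, covered later in this lecture, comes in.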
Billing and Cost Management Budgets are, again, self-explanatory. Budgets can be created to help you manage costs and usage across your environment. You can configure budgets to alert you via the use of SNS, Simple Notification Service, when certain thresholds are met, allowing you to take effective action. Data for calculating budgets is taken from Cost Explorer, so when used together, you can create an effective solution for managing your expenditure. One point to make with AWS Budgets is that you can set up a new budget that isn't specifically cost-related. For example, you could set up a budget to measure usage within your services, such as how many gigabytes are used within a particular S3 bucket, or even how often a particular API is being called. This could potentially be used to aid with security. If there is a high restriction on a specific API function being called, then a low threshold could be set on the budget to monitor the number of calls, and once this threshold is met, an SNS message could be sent to the security team to identify where and why this API is being called.
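As a sketch of what such a usage-based budget definition might look like, the dictionary below is shaped roughly like the `Budget` structure accepted by the AWS Budgets API (boto3's `create_budget`). The budget name, account ID, and limit values are hypothetical, and you should check the boto3 documentation for the full set of required and supported fields.

```python
# Illustrative usage-based budget definition; names and figures are
# hypothetical, and the field set is trimmed for clarity.
budget = {
    "BudgetName": "monthly-s3-usage",  # hypothetical budget name
    "BudgetType": "USAGE",             # usage-based rather than cost-based
    "TimeUnit": "MONTHLY",
    "BudgetLimit": {"Amount": "500", "Unit": "GB"},
}

# With boto3 and valid credentials, this could be submitted along the lines of:
#   boto3.client("budgets").create_budget(AccountId="123456789012", Budget=budget)
# with notification settings pointing at an SNS topic for threshold alerts.
print(budget["BudgetName"])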
Similarly to Budgets, Reports allow you to report on cost and usage. Reports can be custom-generated, and can produce extremely detailed information on your services, costs, and usage. These reports are then stored within an S3 bucket for you to review. You must ensure you set the correct level of permissions on this S3 bucket, so that only those requiring access can access it. As discussed previously in this course, this can be achieved with IAM policies, S3 bucket policies, or S3 ACLs. These detailed reports can also be loaded into Amazon Redshift or Amazon QuickSight for further analysis, giving you business intelligence insights into your usage and cost patterns.
A simple but very effective way of grouping resources that share a common characteristic, such as a project name or departmental function, is the use of tags. When you create EC2 instances, for example, you can associate a custom tag value, such as Sales or Engineering. These tags can then be selected under Cost Allocation Tags in Billing and Cost Management to filter all costs associated with the AWS resources that share that tag. Reports can then be generated based on those tags to accurately define the costs associated with any of your custom-configured tags within your environment.
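The filtering that cost allocation tags enable amounts to a group-by over tag values. The sketch below does this over a few hypothetical tagged resources; in practice the figures would come from a Billing and Cost Management report filtered by the activated tag key.

```python
from collections import defaultdict

# Hypothetical resources carrying a "Department" cost-allocation tag.
# Resource IDs, tag values, and costs are all made up for illustration.
resources = [
    {"id": "i-0a1", "tags": {"Department": "Sales"}, "cost": 210.0},
    {"id": "i-0b2", "tags": {"Department": "Engineering"}, "cost": 540.0},
    {"id": "i-0c3", "tags": {"Department": "Sales"}, "cost": 95.0},
]

# Group costs by tag value, bucketing untagged resources separately --
# untagged spend is often the first thing a tagging report exposes.
cost_by_department = defaultdict(float)
for r in resources:
    cost_by_department[r["tags"].get("Department", "untagged")] += r["cost"]

print(dict(cost_by_department))
```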
Lastly within this section on Billing and Cost Management, I want to talk about Consolidated Billing between linked AWS accounts. If you had multiple AWS accounts, possibly for different functions, customers, or even environments, then you can link them together to allow you to securely consolidate all your AWS billing via a single, master paying account.
It's important to point out that by linking accounts, only the billing information is linked. No other information can be accessed between the accounts. Each account still acts completely independently of the others.
Consolidating the billing into one Paying Account simplifies your AWS finances. It provides a simple overview of all your accounts each month. Only the Paying Account is charged, and from here you can see the usage of all other accounts in one place through features we have already discussed, such as Cost Explorer.
If you link your accounts for Consolidated Billing, I strongly suggest that you enforce strong security on the Paying Account and implement Multi-Factor Authentication. So as we have highlighted, you are able to extract a lot of information from your AWS bill using the different features and tools. These tools and features offer you the ability to highlight, and be alerted to, potential security breaches, such as unauthorized EC2 launches. If you find such an anomaly that is costing your account more than expected, how can you find out who, or what, is responsible? One way is to take a look at AWS CloudTrail.
AWS CloudTrail tracks and monitors AWS API calls made within your environment, regardless of how they are initiated: from other AWS services, by a user within the AWS Management Console, via the AWS CLI, or from an SDK. If an API call is made to create, modify, or delete an AWS resource, then AWS CloudTrail will log the request in an event, along with key data associated with the request. When an authenticated user is accessing resources on AWS, there are a lot of API calls being made. Most of these will be authorized and legitimate. However, some of them may be trying to penetrate your network and gain access to unauthorized data. A tool like CloudTrail is very effective in distinguishing valid API calls from malicious ones.
The details recorded within CloudTrail events include the following information about the API call: the identity of the caller, the timestamp of when the request was initiated, the source IP address, the request parameters, and the response elements returned by the AWS service. Having these details is great from a security point of view, as it provides very specific information about where an API call originated and which identity initiated the request. If malicious activity was found, then a number of security blocks could be implemented quickly to prevent the user from causing any more damage. For example, their account could be removed, and that IP address blocked at a NACL level.
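To show those fields in context, here is a trimmed, hypothetical CloudTrail record parsed in Python. The field names (`eventTime`, `eventName`, `sourceIPAddress`, `userIdentity`, `requestParameters`) follow the CloudTrail log format, but the values are invented and real records carry many more attributes.

```python
import json

# A trimmed, hypothetical CloudTrail record illustrating the key fields
# discussed above; real records contain many additional attributes.
record = json.loads("""
{
  "eventTime": "2020-05-01T12:34:56Z",
  "eventName": "RunInstances",
  "sourceIPAddress": "203.0.113.10",
  "userIdentity": {"type": "IAMUser", "userName": "alice"},
  "requestParameters": {"instanceType": "m5.24xlarge"}
}
""")

# Pull out the pieces a security investigation would start from:
# who made the call, what they did, and from where.
caller = record["userIdentity"]["userName"]
print(caller, record["eventName"], record["sourceIPAddress"])
```

In this made-up example, the record would tell you that the user "alice", from IP 203.0.113.10, launched an unusually large instance type, exactly the trail needed to follow up on a billing anomaly.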
Looking at your API call history within CloudTrail, you may need to find a particular event quickly, perhaps in a circumstance where you're trying to mitigate a security breach. To help you with this, you can enable a number of filters, which will help you locate and identify the data you need. These filters are time range, event name, user name, resource name, and resource type. Between the use of events recorded and the filtering possibilities, it's possible to track exactly what actions a user carried out during their session, how long that session lasted, and if they were successful or unsuccessful. This makes the CloudTrail service a great security analysis tool.
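The effect of those console filters can be sketched as a simple predicate over event records. The function below applies event-name and user-name filters to a list of hypothetical, heavily simplified events; it is an illustration of the filtering idea, not the CloudTrail API itself.

```python
# Hypothetical, simplified event records; real CloudTrail events carry far
# more detail and different field nesting.
events = [
    {"eventName": "StopInstances", "userName": "alice", "eventTime": "2020-05-01T09:00:00Z"},
    {"eventName": "DeleteBucket", "userName": "bob", "eventTime": "2020-05-01T10:30:00Z"},
    {"eventName": "DeleteBucket", "userName": "alice", "eventTime": "2020-05-01T11:00:00Z"},
]

def filter_events(events, event_name=None, user_name=None):
    """Return only the events matching every filter value that was supplied,
    mirroring how stacked console filters narrow the event history."""
    return [
        e for e in events
        if (event_name is None or e["eventName"] == event_name)
        and (user_name is None or e["userName"] == user_name)
    ]

# Narrow three events down to the one deletion performed by a specific user.
print(filter_events(events, event_name="DeleteBucket", user_name="alice"))
```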
When events are created by CloudTrail, they are compiled into logs, which are then delivered and stored in an S3 bucket, much like Billing and Cost Management reports are. Again, be sure to apply the correct security level to this S3 bucket. By default, these log files are stored in S3 with Server Side Encryption, SSE, for additional security. The encryption is managed by the S3 service itself, and so when the log file is stored, S3 automatically encrypts it using its own keys, and when the log file is read by someone with authorization, S3 will use its own decryption keys to open the file. Your CloudTrail logs can hold very sensitive information, and as a result, this default SSE encryption is valuable. However, to tighten security even further, you can use SSE with KMS, which is the AWS Key Management Service, allowing you to use your own keys to encrypt the log files.
CloudTrail isn't only integrated with S3, it also has a close relationship with AWS CloudWatch, too. AWS CloudWatch is a monitoring service that allows you to view and track specific metrics from other AWS services. CloudWatch also allows you to create alarms if custom monitoring metric thresholds are reached, allowing your operations team to investigate potential issues. Logs from CloudTrail can be sent to CloudWatch to allow metrics and thresholds to be configured, which in turn can utilize SNS to create notifications of specific events relating to API activity.
CloudWatch allows any event created by CloudTrail to be monitored. This enables a whole host of security monitoring checks to be created. A great example of this is being notified when certain API calls request significant changes to your security groups or NACLs within your VPC.
Other examples of these checks that are common within organizations are API calls relating to starting, stopping, rebooting, and terminating EC2 instances. If instances are being created that shouldn't be, your AWS costs could rise dramatically and quickly. Also, if instances are being rebooted or stopped, this could have a severe impact on your services if they are not configured in a highly available and resilient solution.
Changes to security policies within IAM and S3. If changes are being made to your policies that shouldn't be, access can be inadvertently removed for authorized users, and access given to unauthorized users, having a massive impact on operational services. Even a minor change to a policy can pave the way for an untrusted user to exploit the error.
Failed login attempts to the Management Console. Monitoring failed attempts here can help to prevent unauthorized access at your environment's front door.
API calls that result in failed authorization. Not only does CloudTrail track successful API calls, whereby the correct authorization was met by the authenticated identity, but it also tracks unsuccessful API requests too, which would likely be due to permissions applied. Special attention should be applied to these unsuccessful attempts, as this could be a malicious user trying to gain access. However, it could also be a legitimate user trying to access a resource they should have access to for their role, but the incorrect permissions have been applied within their associated IAM policy.
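One simple way to surface those unsuccessful attempts is to scan events for an error code. CloudTrail records a failed call with an `errorCode` field such as `AccessDenied`; the sketch below counts such events over a few hypothetical records, which is the kind of logic a CloudWatch metric filter would express declaratively.

```python
# Hypothetical, simplified event records. A successful call has no errorCode;
# a failed-authorization call carries one such as "AccessDenied".
events = [
    {"eventName": "GetObject", "userName": "carol", "errorCode": "AccessDenied"},
    {"eventName": "RunInstances", "userName": "dave"},
    {"eventName": "GetObject", "userName": "carol", "errorCode": "AccessDenied"},
]

# Collect the denied calls; repeated denials by the same identity may indicate
# either probing by a malicious user or a misconfigured IAM policy.
denied = [e for e in events if e.get("errorCode") == "AccessDenied"]
print(f"{len(denied)} denied calls")
```

In a real deployment you would express this as a CloudWatch Logs metric filter over the CloudTrail log group, with an alarm and SNS notification once the count crosses a threshold.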
When used effectively, you can clearly see that AWS CloudTrail can be a strong tool for tracking, identifying, and monitoring a user's actions within your AWS environment.
That brings us to the end of this lecture. Coming up next, we'll just have a quick summary on the course.
About the Author
Stuart has been working within the IT industry for two decades covering a huge range of topic areas and technologies, from data centre and network infrastructure design, to cloud architecture and implementation.
To date, Stuart has created 60+ courses relating to Cloud, most within the AWS category, with a heavy focus on security and compliance.
He is AWS certified and accredited in addition to being a published author covering topics across the AWS landscape.
In January 2016 Stuart was awarded ‘Expert of the Year Award 2015’ from Experts Exchange for his knowledge share within cloud services to the community.
Stuart enjoys writing about cloud technologies and you will find many of his articles within our blog pages.