Amazon S3 Security: master S3 bucket policies and ACLs

Learn about Bucket Policies and ways of implementing Access Control Lists (ACLs) to restrict or open up your Amazon S3 buckets and objects to the public and to other AWS users.

Follow along and learn how to ensure the public can only access your S3 bucket origin via a valid CloudFront request.

Welcome to part 8 of my AWS Security Series. This week I shall be looking at some of the security features around the Simple Storage Service (S3), in particular Bucket Policies and how you can implement Access Control Lists (ACLs) to restrict or open up your S3 buckets and objects to the public and to other AWS users. I will also cover how you can ensure the public can only access your S3 bucket origin via a valid CloudFront request, so that CloudFront cannot be bypassed to gain unauthorised access.

S3 Security

If you are looking to implement security on S3 then you are probably already familiar with the service, its reliability as a storage service and the benefits it can bring. With this in mind, you are likely storing a lot of data on it, and as a result you will want to ensure that data is safe and secure. I will run through some of the security elements of S3 that you can choose to deploy, depending on your data’s sensitivity.

Bucket Policies

Bucket Policies are similar to IAM policies in that they grant access to resources via a JSON document. However, Bucket Policies are applied to buckets in S3, whereas IAM policies are assigned to users, groups or roles and are used to govern access to any AWS resource through the IAM service.
When a bucket policy is applied, the permissions assigned apply to all objects within the bucket. The policy specifies which ‘principals’ (users) are allowed to access which resources. The Principal element is where bucket policies differ from IAM policies: an IAM policy has no Principal element, because the principal is implied by the user, group or role the policy is attached to. As bucket policies are attached to buckets rather than identities, the additional Principal element is required to state who the permissions apply to.

Example Policy

[Image: example bucket policy granting user ‘CloudAcademy1’ PutObject and DeleteObject permissions on the ‘cloud-academy’ bucket]
As shown above, and if you read my previous article ‘Creating an AWS IAM Policy’, the syntax is very similar to IAM policies. As mentioned previously, though, there is the addition of the ‘Principal’ section. In the example above the principal is listed as the user ‘CloudAcademy1’ via the user’s ARN, which can be found in IAM.
The example policy allows ‘CloudAcademy1’ to delete objects and put objects within the ‘cloud-academy’ bucket.
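For reference, a policy along those lines would look something like the sketch below. This is illustrative only: the account ID is a placeholder, and applying it with boto3 is simply an alternative to pasting the JSON into the console.

    import json
    import boto3

    s3 = boto3.client("s3")

    # Illustrative policy only: the account ID below is a placeholder.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCloudAcademy1PutAndDelete",
                "Effect": "Allow",
                "Principal": {"AWS": "arn:aws:iam::123456789012:user/CloudAcademy1"},
                "Action": ["s3:PutObject", "s3:DeleteObject"],
                "Resource": "arn:aws:s3:::cloud-academy/*"
            }
        ]
    }

    # Attach the policy to the bucket; the permissions then apply to all objects in it.
    s3.put_bucket_policy(Bucket="cloud-academy", Policy=json.dumps(policy))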

Setting Bucket Policy Conditions

Again, similarly to IAM policies, S3 Bucket Policies allow you to set conditions within the policy, for example allowing only a specific IP subnet to access the bucket, or restricting a specific IP address. The example below shows how to implement such conditions.
[Image: example bucket policy with an IpAddress condition allowing 192.168.1.0/24 and a NotIpAddress condition excluding 192.168.1.10]
In the ‘Condition’ section above you can see that the subnet 192.168.1.0/24 is allowed to access the bucket; however, the address 192.168.1.10 from within that range is excluded via the NotIpAddress condition.
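As a rough sketch of how those conditions could be expressed (the bucket name and principal are placeholders, and boto3 is just one way of applying the policy):

    import json
    import boto3

    s3 = boto3.client("s3")

    # Illustrative only: bucket name and principal are placeholders. Requests from
    # 192.168.1.0/24 are allowed, except for the single address 192.168.1.10.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowSubnetExceptOneAddress",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::cloud-academy/*",
                "Condition": {
                    "IpAddress": {"aws:SourceIp": "192.168.1.0/24"},
                    "NotIpAddress": {"aws:SourceIp": "192.168.1.10/32"}
                }
            }
        ]
    }

    s3.put_bucket_policy(Bucket="cloud-academy", Policy=json.dumps(policy))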
For a full list of conditions and help on creating your S3 Bucket Policies take a look at the great tool that AWS provides here: AWS Policy Generator
An explicit deny within a policy will always take precedence over an ‘allow’, so where conflicting permissions exist across policies, access is resolved in favour of least privilege. This is also the case if you have an IAM user with S3 access to a specific bucket which also happens to have a bucket policy: AWS evaluates both policies, and if one allows an action that the other explicitly denies, the deny wins.
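As a rough illustration of that evaluation (the user, bucket and account ID below are placeholders), the sketch allows a user broad access to a bucket but explicitly denies deletes; even though the allow statement also matches a delete request, the explicit deny takes precedence:

    import json

    # Illustrative only: names and account ID are placeholders. The explicit Deny on
    # s3:DeleteObject overrides the broader Allow, so the user can read and write
    # objects but any delete request is refused.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": "arn:aws:iam::123456789012:user/CloudAcademy1"},
                "Action": "s3:*",
                "Resource": "arn:aws:s3:::cloud-academy/*"
            },
            {
                "Effect": "Deny",
                "Principal": {"AWS": "arn:aws:iam::123456789012:user/CloudAcademy1"},
                "Action": "s3:DeleteObject",
                "Resource": "arn:aws:s3:::cloud-academy/*"
            }
        ]
    }

    print(json.dumps(policy, indent=2))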
For more information on the syntax of policies and how to create and write your own, please visit my previous article, which explains how to create these.

S3 Access Control Lists

In addition to IAM policies and Bucket Policies, S3 has a further method of granting access to specific objects: Access Control Lists (ACLs), which allow a more finely grained approach than a bucket policy. ACLs let you set permissions on each individual object within a bucket. Again, where conflicts exist between ACLs, Bucket Policies and IAM policies, access is resolved in favour of least privilege.
ACLs can be managed and configured from within the S3 Service itself or via APIs.
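As a rough illustration of the API route (the bucket name below is a placeholder), this boto3 sketch reads a bucket's current ACL and appends one extra grant before writing it back; the equivalent calls for individual objects are get_object_acl and put_object_acl:

    import boto3

    s3 = boto3.client("s3")
    bucket = "cloud-academy"  # placeholder bucket name

    # Fetch the existing ACL so the owner and current grants are preserved.
    acl = s3.get_bucket_acl(Bucket=bucket)

    # Append a grant equivalent to ticking 'List' for the 'Everyone' grantee in the console.
    acl["Grants"].append({
        "Grantee": {"Type": "Group", "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
        "Permission": "READ",
    })

    # put_bucket_acl replaces the whole ACL, which is why the owner and all grants are resubmitted.
    s3.put_bucket_acl(
        Bucket=bucket,
        AccessControlPolicy={"Grants": acl["Grants"], "Owner": acl["Owner"]},
    )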

To modify Bucket ACL permissions within the S3 Console

• Open the AWS console and select the S3 Service
• Navigate to the bucket you want to modify permissions on at an ACL level
• Select the ‘Properties’ tab and then ‘Permissions’
[Image: bucket Permissions pane showing the ACL grantees and their permissions]
The permissions set here act as the ACL of the bucket. You will notice a grantee for the bucket, which is the resource owner and will have Full Control; on bucket creation this is typically the AWS account owner.
The other permissions that can be set are List (Read), Upload/Create (Write), View Permissions and Edit Permissions. If all checkboxes are selected, that grantee is considered to have Full Control of the bucket.

  • You can either modify the current ACL displayed for your bucket by selecting/deselecting the tick boxes as required, or you can add additional access by selecting ‘Add more permissions’
  • This will generate an additional line for a new Grantee. The Grantee options are as follows:
    o Everyone – This will allow access to this object by anyone, and that doesn’t just mean any AWS users, but anyone with access to the Internet
    o Authenticated AWS Users – This option will only allow IAM users or other AWS Accounts to access the object via a signed request
    o Log Delivery – This allows logs to be written to the Bucket when it is being used to store server access logs
    o Me – This relates to your current IAM AWS User Account
  • Select the appropriate Grantee and apply the permissions required using the tick boxes and then click ‘Save’
  • Your Bucket ACL has now been updated

It is worth mentioning that an S3 ACL can have up to 100 Grantees.
There are slightly different permission options between a Bucket ACL and an Object ACL as shown below.

To modify Object ACL permissions within the S3 Console

  • Open the AWS console and select the S3 Service
  • Navigate to the object you want to modify permissions on at an ACL level
  • Select the ‘Properties’ tab and then ‘Permissions’

[Image: object Permissions pane showing the object ACL and the object’s URL]
You will notice a small difference between the permissions available at the bucket level and those available at the object level. Here you have the option to Open/Download the object, View Permissions and Edit Permissions.

  • You can either modify the current ACL displayed for your object by selecting/deselecting the tick boxes as required, or you can add additional access by selecting ‘Add more permissions’
  • The example ACL pictured above only allows the grantee, which happens to be the AWS account owner, access to this file. As a result, if I try to view this file via a browser using the URL highlighted within the example above, I get the following “Access Denied” message:
[Image: “Access Denied” error returned when requesting the object URL in a browser]
  • To allow access, I need to modify the ACL to add the grantee ‘Everyone’ and select the Open/Download permission

[Image: object ACL updated with the ‘Everyone’ grantee granted Open/Download]
This new access allows anyone to reach the file via its URL, so when I try again I am able to download the *.rtf file successfully and access the object. Allowing this access enables you to make your objects publicly accessible via the assigned URL.
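If you would rather script that change than use the console, a minimal boto3 sketch would be along these lines; the bucket name and the example.rtf key are placeholders, and the canned ‘public-read’ ACL is the scripted equivalent of granting ‘Everyone’ Open/Download:

    import boto3

    s3 = boto3.client("s3")

    # Placeholder bucket and key names. The canned ACL 'public-read' keeps the owner's
    # FULL_CONTROL and grants READ to everyone, the equivalent of ticking Open/Download
    # for the 'Everyone' grantee in the console.
    s3.put_object_acl(Bucket="cloud-academy", Key="example.rtf", ACL="public-read")

    # The object is then reachable anonymously at its assigned URL, for example:
    # https://cloud-academy.s3.amazonaws.com/example.rtf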

Using S3 as an Origin for CloudFront (Content Delivery Network – CDN)

This section of my article assumes you already have knowledge of CloudFront and its features. However, I want to cover how you can add an extra layer of security for your objects, if you are not already doing so, when using an S3 bucket as your origin.
When you use S3 as your origin for CloudFront, everyone is typically given Read permission for the objects in your bucket, allowing anyone to access the content via the CDN. However, if a person or an application has the unique URLs to the objects, they can bypass the controls offered by CloudFront, such as restrictions on when an object can be accessed and from which IP addresses. This may be a security concern you want to alleviate.
To ensure that no one can access your origin bucket unless they are going via your CDN, you can create an Origin Access Identity (OAI) and associate it with your CloudFront distribution. You can then allow ONLY the OAI access to your bucket, using the methods already discussed in this article, and remove all other user access. This way, only users going through CloudFront will be able to access the origin bucket, and anyone with a direct URL link will be denied, as only the OAI has access.
Creating an Origin Access Identity and assigning it to your Distribution

  • Log onto your AWS Console and select the CloudFront Service
  • Click ‘Create Distribution’
  • Select either your Web or RTMP Distribution and ‘Get Started’
  • In the ‘Origin Settings’ section we are interested in the information within the red box below
[Image: CloudFront ‘Origin Settings’ section with ‘Restrict Bucket Access’ and ‘Origin Access Identity’ highlighted]
  • Under ‘Restrict Bucket Access’ select ‘Yes’. This will then allow you to configure the ‘Origin Access Identity’
  • If you do not already have an OAI then select ‘Create a new identity’
  • If you want the CloudFront distribution to automatically update the permissions on your Origin Domain Name (S3 bucket), then under ‘Grant Read Permissions on Bucket’ select ‘Yes, Update Bucket Policy’. Note that this adds the correct permissions for your OAI to access your origin bucket, but it will not remove any other user access that may already exist; you will need to go into your policies and ACLs to remove access for any other users you do not want to have it. A sketch of the resulting bucket policy appears after these steps
  • Continue to deploy your Distribution as you normally would
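For reference, the bucket policy that ‘Yes, Update Bucket Policy’ writes is broadly of the shape sketched below; the OAI ID and bucket name are placeholders, and you could apply something equivalent yourself if you prefer to manage the policy manually:

    import json
    import boto3

    s3 = boto3.client("s3")

    # Illustrative only: E2EXAMPLEOAIID and the bucket name are placeholders. With this
    # in place (and all other grants removed), only requests made through the CloudFront
    # origin access identity can read objects from the bucket.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCloudFrontOAIReadOnly",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity E2EXAMPLEOAIID"
                },
                "Action": "s3:GetObject",
                "Resource": "arn:aws:s3:::cloud-academy/*"
            }
        ]
    }

    s3.put_bucket_policy(Bucket="cloud-academy", Policy=json.dumps(policy))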

What are the key elements we have learned this week?

  • There are a number of different options available to restrict access to S3 objects, including:
    o IAM Policies
    o Bucket Policies
    o Access Control Lists
  • The difference between the above access options
  • We understood that access is resolved in favour of least privilege when conflicting conditions occur between different policies
  • We looked at how you can use an Origin Access Identity to force users to access your content via CloudFront, ensuring its additional features cannot be bypassed

Next week I shall be covering additional elements of the Simple Storage Service, such as lifecycle policies and versioning, and how they can be used to minimise data loss. I will also cover a feature that helps with accidental deletion. Lastly, I will look at encryption with S3 from both a client-side and a server-side perspective.
There is a recent post that details the pros and cons of using Amazon S3 vs Amazon Glacier: which AWS storage tool should you use?
For a reference on AWS IAM security, Nitheesh Poojary has another strong article published in August; it is well worth a read.
Thank you for taking the time to read my article. If you have any feedback please leave a comment below.

Written by

Stuart is the AWS content lead at Cloud Academy where he has created over 40 courses reaching tens of thousands of students. His content focuses heavily on cloud security and compliance, specifically on how to implement and configure AWS services to protect, monitor and secure customer data and their AWS environment.
