
Setting up Amazon S3

The course is part of these learning paths

AWS Networking & Content Delivery
AWS Advanced Networking – Specialty Certification Preparation
Operations on AWS

Contents

Outlining our plan
Overview
Difficulty: Beginner
Duration: 18m
Students: 2482

Description

While many modern websites require complex server-side technologies to deliver dynamic content, many organizations still need fairly simple static websites that rely only on HTML, CSS, and JavaScript. Nevertheless, even websites serving static content need to scale and provide high availability and low latency as visits grow over time.

In this course, Cloud Academy's Senior DevOps Engineer, Antonio Angelino, will discuss how to set up a static website in the cloud using only Amazon S3 to store the files, Amazon CloudFront for content delivery, Route 53 to associate a custom domain name with our website, and Amazon Glacier to set up an automatic backup strategy for the website's files on S3. It's an effective and low-cost solution that avoids the burden of configuring an EC2 instance with a web server for a task that is very simple, but still needs good skills to be accomplished proficiently.

Who should follow this course

As this is a beginner-to-intermediate course, you are expected to have some experience with the basic concepts of website hosting. You should also have at least a little experience with the AWS services described in the course, namely S3, CloudFront, Route 53, and Glacier. In any case, you should be able to understand the key concepts shown in this course even if you are a newcomer to the Amazon cloud.

If you need to learn more about the AWS services mentioned here, please check our collection of AWS courses. Also, if you want to test your knowledge of the basic topics covered in this course, check out our AWS questions. You will learn more about each of the services mentioned in this course.

If you have thoughts or suggestions for this course, please contact Cloud Academy at support@cloudacademy.com.

Transcript

The service setup will be done using the AWS Management Console, so let's log in using our AWS credentials.

The first service that we'll set up is Amazon Simple Storage Service (Amazon S3), which will store all the static content. Files are stored in Amazon S3 as objects, and objects are stored in containers called buckets. Bucket creation is free; you only pay for storing your objects and transferring data in and out. A simple landing page with CSS, JavaScript, and image files will cost no more than a few cents per month.

Each bucket can be configured to serve static content through standard HTTP requests, at no additional cost.

In order to create our infrastructure, we have to create two different buckets. The first is the main bucket: it will contain all our files and will be bound to the ourdomain.com URL, while the second one will store access and error log files.

Open the Amazon S3 dashboard and click Create New Bucket, then choose an available region and a bucket name. We must use our domain name as the bucket name because we'll associate the domain with that bucket, so we'll use cloudacademylab.com during this course.

The last bucket we need is logs.cloudacademylab.com, which will be used to store our website traffic logs.
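For those who prefer scripting these steps, here is a minimal sketch using the AWS SDK for Python (boto3); the bucket names match this walkthrough, and the eu-west-1 region is only an assumption, since the transcript leaves the region choice open.

    # Sketch: create the main website bucket and the logging bucket with boto3.
    # Bucket names and region are assumptions from this walkthrough.
    import boto3

    REGION = "eu-west-1"  # any available region works; us-east-1 must omit CreateBucketConfiguration
    s3 = boto3.client("s3", region_name=REGION)

    for name in ("cloudacademylab.com", "logs.cloudacademylab.com"):
        s3.create_bucket(
            Bucket=name,
            CreateBucketConfiguration={"LocationConstraint": REGION},
        )
        print(f"Created bucket: {name}")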

Now we're ready to set permissions on the main bucket so our visitors will be able to access the website files. By default, only the bucket owner can access a bucket's objects, so we need to change the access control with a bucket policy. To set access permissions on our main S3 bucket, right-click the main domain bucket, then click Properties.

In the details pane, click Permissions and add a new bucket policy. This JSON policy gives everyone permission to view any file in the main bucket. Click Save in the bucket policy editor pop-up, and then click Save again in the bucket Permissions pane.
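The course does not show the policy document itself, so the snippet below is a common public-read policy of the kind described, applied here with boto3 as a sketch; the equivalent JSON can be pasted directly into the console's bucket policy editor. The bucket name is the one assumed in this walkthrough.

    # Sketch: attach a public-read bucket policy so anyone can view any object
    # in the main bucket (bucket name assumed from this walkthrough).
    import json
    import boto3

    BUCKET = "cloudacademylab.com"

    public_read_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "PublicReadGetObject",
                "Effect": "Allow",
                "Principal": "*",
                "Action": "s3:GetObject",
                "Resource": f"arn:aws:s3:::{BUCKET}/*",
            }
        ],
    }

    s3 = boto3.client("s3")
    s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(public_read_policy))

Note that on newer AWS accounts the bucket's Block Public Access settings also have to be relaxed before a public policy like this one is accepted.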

In order to track the number of visitors accessing the website, we need to enable logging for the main domain bucket; you can skip this step if you don't want to log the website traffic. In the bucket Properties pane, click Logging, select the Enabled check box, and in the Target Bucket box, choose the Amazon S3 bucket created to store the log files.

The Target Prefix box lets us choose where the log data will be stored; we use main/ to store logs in a main directory.
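The same logging configuration can be applied programmatically; a boto3 sketch using the bucket names and the main/ prefix assumed above:

    # Sketch: enable server access logging on the main bucket, delivering logs
    # to logs.cloudacademylab.com under the main/ prefix (names assumed above).
    import boto3

    s3 = boto3.client("s3")

    # The target bucket must allow S3 log delivery to write into it. The console
    # grants this automatically; with the SDK, one common way is the
    # log-delivery-write canned ACL (buckets with ACLs disabled use a bucket
    # policy for the logging service instead).
    s3.put_bucket_acl(Bucket="logs.cloudacademylab.com", ACL="log-delivery-write")

    s3.put_bucket_logging(
        Bucket="cloudacademylab.com",
        BucketLoggingStatus={
            "LoggingEnabled": {
                "TargetBucket": "logs.cloudacademylab.com",
                "TargetPrefix": "main/",
            }
        },
    )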

Please remember not to deliver a bucket's log files to the bucket itself, or opening the log files will generate new log files. Now that we've created and configured all the Amazon S3 buckets, we're ready to upload the website files. To test the S3 configuration so far, we can start by uploading two simple HTML files, index.html and error.html.
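The course doesn't show the contents of the two test pages, so the markup below is only an assumed minimal version, written out with a short Python sketch to keep the examples in one language:

    # Sketch: write two minimal test pages to the current directory.
    # The markup is an assumption; the course only names the two files.
    from pathlib import Path

    Path("index.html").write_text(
        "<html><head><title>Cloud Academy Lab</title></head>"
        "<body><h1>It works!</h1></body></html>"
    )
    Path("error.html").write_text(
        "<html><head><title>Error</title></head>"
        "<body><h1>Page not found</h1></body></html>"
    )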

Now, let's upload them into our main bucket. In the Amazon S3 console, select the main domain bucket, click Actions, and then click Upload.

In the Upload Select Files dialog box, click Add Files. In the file selection dialog box, select both files and then click Open. Back in the Upload Select Files dialog box, click Start Upload. When both files have finished uploading, they'll be listed on the left.
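The upload can also be scripted; a boto3 sketch that mirrors the console steps, setting the Content-Type explicitly so browsers render the pages instead of downloading them (bucket name assumed from this walkthrough):

    # Sketch: upload the two test pages to the main bucket.
    import boto3

    s3 = boto3.client("s3")

    for filename in ("index.html", "error.html"):
        s3.upload_file(
            filename,
            "cloudacademylab.com",
            filename,
            ExtraArgs={"ContentType": "text/html"},
        )
        print(f"Uploaded {filename}")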

Now that the website test files are online, we have to configure Amazon S3 to serve those files as if they were hosted on a web server. In the main bucket's details pane, click Static Website Hosting, then click Enable website hosting.

In the index document box, type index.html. In the error document box, type error.html. Finally, let's save the new configuration.
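The equivalent website configuration in boto3, a sketch assuming the bucket name used earlier and the same index and error documents:

    # Sketch: enable static website hosting on the main bucket with the
    # index and error documents chosen above.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_website(
        Bucket="cloudacademylab.com",
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            "ErrorDocument": {"Key": "error.html"},
        },
    )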

Amazon S3 is now ready to host our static website. Let's check the configuration by navigating to the default endpoint URL assigned by Amazon Web Services.
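The website endpoint follows the pattern http://<bucket-name>.s3-website-<region>.amazonaws.com (some regions use a dot instead of the dash before the region). A quick sketch to confirm the endpoint responds, assuming the bucket name and region used earlier:

    # Sketch: fetch the S3 website endpoint to confirm index.html is served.
    # The endpoint assumes the eu-west-1 region chosen earlier in these sketches.
    from urllib.request import urlopen

    url = "http://cloudacademylab.com.s3-website-eu-west-1.amazonaws.com"
    with urlopen(url) as response:
        print(response.status, response.read(200))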

About the Author

Students: 36776
Labs: 12
Courses: 4

Antonio is an IT manager and a software and infrastructure engineer with 15 years of experience in designing, implementing, and deploying complex web apps.

He has a deep knowledge of the IEEE Software and Systems Engineering Standards and of several programming languages (Python, PHP, Java, Scala, JS).

Antonio has also been using and designing cloud infrastructures for five years, using both public and private cloud services (Amazon Web Services, Google Cloud Platform, Azure, OpenStack, and VMware vSphere).

In his past working experience, he designed and managed large web clusters, and developed a service orchestrator providing automatic scaling, self-healing, and a disaster recovery strategy.

Antonio is currently the Labs Product Manager and a Senior DevOps Engineer at Cloud Academy; his main goal is to provide the best learn-by-doing experience possible by taking care of the Cloud Academy Labs platform.