Outlining our plan
Setting up the infrastructure
In this course, Antonio Angelino, Cloud Academy's Senior DevOps Engineer, will discuss how to set up a static website in the cloud using only Amazon S3 to store the files, Amazon CloudFront for content delivery, Route 53 to associate a custom domain name with the website, and Amazon Glacier to set up an automatic backup strategy for the website's files on S3. It's an effective, low-cost solution that avoids the burden of configuring an EC2 instance with a web server for a task that is simple in principle, yet still requires solid skills to accomplish proficiently.
Who should follow this course
As this is a beginner-to-intermediate course, you are expected to have some experience with the basic concepts of website hosting. You should also have at least some hands-on experience with the AWS services covered in the course, namely S3, CloudFront, Route 53, and Glacier. In any case, you should be able to understand the key concepts shown in this course even if you are new to the Amazon Cloud.
If you need to learn more about the AWS services cited here, please check our collection of AWS courses, where you will learn more about each of the services covered. Also, if you want to test your knowledge of the basic topics covered in this course, check out our AWS questions.
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
Even if you're hosting static files, they can change over time, so making periodic backups can be useful. Amazon S3 allows you to store multiple versions of your files by enabling versioning. If you edit your index.html file and upload it again, you'll only see the latest uploaded object, but the bucket actually stores two different objects with the same key and different version IDs. To enable versioning, select the main bucket, click on the Properties tab, open the Versioning pane, and click Enable Versioning. If you frequently change your files and you don't need to keep every change, you can manage the objects' lifecycles.
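The console steps above map to a single S3 API call. Here is a minimal boto3-style sketch; the bucket name is a placeholder, and the client is passed in as a parameter so the helper can be exercised without AWS credentials:

```python
# Sketch: enabling versioning on an S3 bucket via the S3 API (boto3-style).
# "main-bucket" below is a placeholder name, not one from the course.

def enable_versioning(s3_client, bucket_name):
    """Turn on versioning for the given bucket."""
    return s3_client.put_bucket_versioning(
        Bucket=bucket_name,
        VersioningConfiguration={"Status": "Enabled"},
    )

# Real usage (requires AWS credentials configured locally):
#   import boto3
#   enable_versioning(boto3.client("s3"), "main-bucket")
```

Once versioning is enabled it cannot be fully removed from a bucket, only suspended, which is why S3 also lets you expire old versions with lifecycle rules as shown next in the course.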
By clicking on the Lifecycle pane and then on the Add Rule link, you can start adding new lifecycle rules. You can create global rules or rules that apply only to a specific group of objects.
In this video, we'll create a global rule for the whole bucket, so select the Whole Bucket radio button in step one and click Configure Rule. Lifecycle rules can be applied to the current version of objects or to previous versions. The allowed actions are Archive Only, Permanently Delete Only, Archive and then Permanently Delete, and Do Nothing. Archive Only moves the specified object to Glacier without removing it from S3; Permanently Delete Only deletes the specified object version; and Archive and then Permanently Delete performs both actions. In our case, we are interested in archiving the old versions of our files and then permanently deleting them in order to lower our infrastructure's TCO, or total cost of ownership.
So let's select the Archive and then Permanently Delete action for previous versions from the drop-down select box.
Now we have to decide when old versions should be archived and when they should be permanently deleted from Glacier. We decided to move previous versions to Glacier after 10 days and permanently delete them after 60 days.
Please remember that if you want to recover an archived version from Glacier, you may have to wait up to five hours for the restore to complete, so do not archive object versions too early.
To finish the Add Rule procedure, click the Review button, choose a rule name, and then click the Create and Activate Rule button.
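The same rule built in the console above (archive previous versions to Glacier after 10 days, permanently delete them after 60) can also be expressed as a lifecycle configuration through the S3 API. A minimal boto3-style sketch, where the bucket and rule names are placeholders and the client is passed in so the helpers can be tested without credentials:

```python
# Sketch: the "archive and then permanently delete" rule for previous
# object versions, expressed as an S3 lifecycle configuration.

def build_backup_rule(rule_id, archive_after_days, delete_after_days):
    """Lifecycle rule: move noncurrent versions to Glacier, then delete them."""
    return {
        "ID": rule_id,
        "Status": "Enabled",
        "Filter": {"Prefix": ""},  # empty prefix = apply to the whole bucket
        "NoncurrentVersionTransitions": [
            {"NoncurrentDays": archive_after_days, "StorageClass": "GLACIER"}
        ],
        "NoncurrentVersionExpiration": {"NoncurrentDays": delete_after_days},
    }

def apply_lifecycle(s3_client, bucket_name, rules):
    """Replace the bucket's lifecycle configuration with the given rules."""
    return s3_client.put_bucket_lifecycle_configuration(
        Bucket=bucket_name,
        LifecycleConfiguration={"Rules": rules},
    )

# Real usage (requires AWS credentials configured locally):
#   import boto3
#   rule = build_backup_rule("backup-old-versions", 10, 60)
#   apply_lifecycle(boto3.client("s3"), "main-bucket", [rule])
```

Note that `put_bucket_lifecycle_configuration` overwrites any existing lifecycle rules on the bucket, so in a real setup you would fetch the current configuration first and append to it.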
Now all your bucket's files are versioned, and previous versions are archived using Glacier, the ultra-secure, durable, and extremely low-cost Amazon storage service.
Antonio is an IT Manager and a software and infrastructure engineer with 15 years of experience in designing, implementing, and deploying complex web apps.
He has deep knowledge of the IEEE Software and Systems Engineering Standards and of several programming languages (Python, PHP, Java, Scala, JS).
Antonio has also been using and designing cloud infrastructures for five years, working with both public and private cloud services (Amazon Web Services, Google Cloud Platform, Azure, OpenStack, and VMware vSphere).
In his past working experience, he designed and managed large web clusters, and developed a service orchestrator providing automatic scaling, self-healing, and a disaster recovery strategy.
Antonio is currently the Labs Product Manager and a Senior DevOps Engineer at Cloud Academy; his main goal is to provide the best learn-by-doing experience possible by taking care of the Cloud Academy Labs platform.