Monitoring and compliance
Please note: this course has been removed from our content library. If you want to study for the AWS Certified SysOps Administrator – Associate certification, we recommend our dedicated learning path for that certification.
The AWS Certified SysOps Administrator (associate) certification requires its candidates to be comfortable deploying and managing full production operations on AWS. The certification demands familiarity with the whole range of Amazon cloud services, and the ability to choose from among them the most appropriate and cost-effective combination that best fits a given project.
In this exclusive Cloud Academy course, IT Solutions Specialist Eric Magalhães will guide you through an imaginary but realistic scenario that closely reflects many real-world pressures and demands. You'll learn to leverage Amazon's elasticity to effectively and reliably respond to a quickly changing business environment.
The SysOps certification is built on solid competence in pretty much all AWS services. Therefore, before attempting the SysOps exam, you should make sure that, besides this course, you have also worked through the material covered by our three AWS Solutions Architect Associate level courses.
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
Hello and welcome to our second lecture. In this lecture, I will present to you our fictional company and show you how to build an environment like ours, so you can build your own and follow this course's examples by yourself.
Imagine a company called Cloud Motors. They are a car company like GM or Ford, with a highly secure on-premises IT infrastructure, and they have decided to host an April Fools' Day campaign on AWS.
This will save them money and keep their local environment compliant with the industry standards they need to meet. First of all, this is a Ruby on Rails application, but don't worry: you don't have to know anything about Ruby on Rails to fully understand this course. We will use CloudFormation to deploy our application. I will show you how to do it, and also what CloudFormation is creating for us. The first thing we need in order to build our environment is a key pair, so let's go to EC2 to create one.
Just scroll down to Key Pairs, then click Create Key Pair. Since I've already got one, I won't create a new one right now, but feel free to choose a good name and create yours.
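If you prefer the command line, the same key pair can be created with the AWS CLI. This is a minimal sketch, assuming the CLI is installed and configured with credentials; the key name `cloud-motors-key` is just an example, so use whatever name you chose:

```shell
# Create a new key pair and save the private key locally (example name).
aws ec2 create-key-pair \
    --key-name cloud-motors-key \
    --query 'KeyMaterial' \
    --output text > cloud-motors-key.pem

# Restrict permissions so SSH will accept the key file later.
chmod 400 cloud-motors-key.pem
```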
After ensuring that you have a key pair, go to CloudFormation. CloudFormation works with stacks, so we need to create a new one to get started, and we need to specify a name for our brand new stack.
I will call it CloudMotors. We also need to specify a template: we can use a sample template, upload a new one to S3, or point to a template that's already in Amazon S3. In our case, we need to download the template that is available in our GitHub repository. You can access the template in the cloudformation folder, or scroll down, click this link, and save the page that opens. Be sure to save it with the .template extension; otherwise, you may run into problems when uploading it to Amazon S3.
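For orientation, a CloudFormation template is just a JSON (or YAML) document with a few well-known top-level sections. The snippet below is not the Cloud Motors template itself, only an illustrative skeleton of the structure you'll see when you open the file from the repository; the resource names, AMI ID, and output are placeholders:

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "Illustrative skeleton only -- not the actual Cloud Motors template",
  "Parameters": {
    "KeyName": {
      "Type": "AWS::EC2::KeyPair::KeyName",
      "Description": "Existing EC2 key pair for SSH access"
    }
  },
  "Resources": {
    "WebServer": {
      "Type": "AWS::EC2::Instance",
      "Properties": {
        "ImageId": "ami-12345678",
        "InstanceType": "t2.micro",
        "KeyName": { "Ref": "KeyName" }
      }
    }
  },
  "Outputs": {
    "PublicDNS": {
      "Value": { "Fn::GetAtt": ["WebServer", "PublicDnsName"] }
    }
  }
}
```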
All right, back to CloudFormation. Select the right file and upload it, and we're ready for the next step. On the Parameters page, specify the key pair you created earlier; the default parameters are fine for us. Click Next.
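The console steps above can also be scripted. Here's a hedged sketch with the AWS CLI, assuming a configured CLI; the template file name, stack name, parameter key, and tag are placeholders that would need to match the actual template:

```shell
# Sanity-check the template before creating anything.
aws cloudformation validate-template \
    --template-body file://cloudmotors.template

# Create the stack, passing the pre-created key pair as a parameter
# and tagging every resource the template creates.
aws cloudformation create-stack \
    --stack-name CloudMotors \
    --template-body file://cloudmotors.template \
    --parameters ParameterKey=KeyName,ParameterValue=cloud-motors-key \
    --tags Key=Project,Value=CloudMotors

# Block until the stack reaches CREATE_COMPLETE (or fails).
aws cloudformation wait stack-create-complete --stack-name CloudMotors
```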
Here, we can define tags that AWS will apply to all the resources this CloudFormation template creates.
Click Next again, and here we have the review page. We can click here to check how much the stack will cost us, and also see the URL where our uploaded template lives. Let's create our stack and check whether our template is visible to the world; it's private by default. On the cost page, we can see an estimate of how much this stack will cost us at the end of the month. We can also tell AWS that we're on the free tier by clicking here, and the price will change. I'll pause the recording until the creation of our stack is over.
Okay, it took around 20 minutes to complete. Let's have a look at what's changed. First, let's look at VPC. Looking at the dashboard, we can tell that there is only the default VPC here, so the template isn't creating anything in VPC for us, which is all right for now. In EC2, however, it has created some resources for us.
We have one instance already running, with a public IP and all the needed settings, like security groups. We can see here that it's open to the world on both the SSH and HTTP ports. The template also creates an RDS database for us. If we go to Instances, we can see that we now have a single RDS DB instance, which is enough for now.
Let's now have a look at Amazon S3. We have a new bucket here, but this was not created by the template; it was created by CloudFormation itself when we uploaded our template file. We can also see that this bucket is private and has no bucket policies. In your case, you should have only one file here; in my case, I was using this account to create the CloudFormation template that we're using now. Going under Permissions, we can see that these files have an ACL set to private, like the bucket itself. And last but not least, let me show you our application. I will use the public DNS address provided by AWS. Voilà, here we have our campaign. And if you would like to buy a brand new smashed car, you just need to click here and fill in your order.
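If you'd rather verify all of this without clicking through each console, the same checks can be sketched with the AWS CLI (stack and bucket names below are placeholders; your bucket will have an auto-generated `cf-templates-...` name):

```shell
# List everything the stack created: instance, security group, RDS, etc.
aws cloudformation describe-stack-resources --stack-name CloudMotors

# Confirm the template bucket CloudFormation created is private
# (only the owner should appear in the grants).
aws s3api get-bucket-acl --bucket cf-templates-example-us-east-1
```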
But that's not the case for me. Also, you can take a look at the current orders. That's it for this lecture.
- AWS SysOps Administrator Roles and Responsibilities
- Initial Application Setup
- Managing Instances
- Deploying an Elastic Load Balancer
- Completing the Elastic Infrastructure
- Infrastructure Monitoring with CloudWatch
- Industry Standards
- Application Security
- The AWS Shared Responsibility Model
- Elastic Beanstalk
Eric Magalhães has a strong background as a systems engineer on both Windows and Linux and currently works as a DevOps consultant for Embratel. Lazy by nature, he is passionate about automation and anything that can make his job painless, so his interest in topics like coding, configuration management, containers, CI/CD, and cloud computing went from a hobby to an obsession. He holds multiple AWS certifications and, as a DevOps consultant, helps clients understand and implement DevOps culture in their environments. He also plays a key role in the company developing automation using tools such as Ansible, Chef, Packer, Jenkins, and Docker.