Application security
Difficulty
Advanced
Duration
1h 5m
Students
2442
Ratings
5/5
Description

Please note: this course has now been removed from our content library. If you want to study for the SysOps Administrator Associate Level Certification, we recommend you take our dedicated learning path for that certification, which you can find here.


The AWS Certified SysOps Administrator (associate) certification requires its candidates to be comfortable deploying and managing full production operations on AWS. The certification demands familiarity with the whole range of Amazon cloud services, and the ability to choose from among them the most appropriate and cost-effective combination that best fits a given project.

In this exclusive Cloud Academy course, IT Solutions Specialist Eric Magalhães will guide you through an imaginary but realistic scenario that closely reflects many real-world pressures and demands. You'll learn to leverage Amazon's elasticity to effectively and reliably respond to a quickly changing business environment.

The SysOps certification is built on solid competence in pretty much all AWS services. Therefore, before attempting the SysOps exam, you should make sure that, besides this course, you have also worked through the material covered by our three AWS Solutions Architect Associate level courses.

If you have thoughts or suggestions for this course, please contact Cloud Academy at support@cloudacademy.com.

Transcript

Hi and welcome to our ninth lecture. In this lecture, we will start talking about security. We will enable ELB access logs and take a tour of IAM. First, let's go to EC2 to configure our ELB to log access and send log files to S3. We need to select our ELB, go to access logs, and select edit. Here, we enable the logs, set the logging interval, and specify an S3 bucket to use. We can have AWS create the bucket by checking this box. It's configured. Let's now have a look at S3.
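The same access-log settings can be applied from the AWS CLI by passing an attributes document to `aws elb modify-load-balancer-attributes`. A minimal sketch of that document is below; the bucket name and prefix are hypothetical placeholders, and the emit interval for classic ELBs must be either 5 or 60 minutes:

```json
{
  "AccessLog": {
    "Enabled": true,
    "S3BucketName": "my-elb-logs-bucket",
    "S3BucketPrefix": "production-elb",
    "EmitInterval": 5
  }
}
```

You would save this as a file and reference it with `--load-balancer-attributes file://attributes.json`, naming your load balancer with `--load-balancer-name`.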

Entering the new bucket, we can see that we have a folder, and if we drill down deep enough, we will find the test file. ELB will start sending the logs in around 10 minutes. You could later define a lifecycle policy for the logs if you want to. Now, we'll talk about identity and access management, IAM. On the dashboard, we can see some security hints. We will walk through these. First of all, activate MFA, especially for the root account. This is a must. To activate MFA for an IAM user, you can click on the user, scroll down, and click on manage MFA. In my case, I have one set up that I could deactivate or resynchronize.
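A lifecycle policy for the log bucket can be expressed as an S3 lifecycle configuration and applied with `aws s3api put-bucket-lifecycle-configuration`. This is an illustrative sketch, assuming the hypothetical prefix used above; it archives logs to Glacier after 30 days and deletes them after a year:

```json
{
  "Rules": [
    {
      "ID": "expire-elb-logs",
      "Filter": { "Prefix": "production-elb/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

The exact retention periods are a design choice; access logs are rarely read after a few weeks, so early archival keeps storage costs low.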

Let me create a new user so you can see how to enable a new MFA device for a brand new user. I don't want access keys for this user. Click on the user, then manage MFA device. And we have two options: either a virtual or a hardware MFA device. Since you would have to purchase a hardware MFA device, I will go with virtual. And now, I just have to download Google Authenticator or some other tool compatible with AWS and follow the wizard to configure the device.
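Virtual MFA apps such as Google Authenticator implement the TOTP algorithm (RFC 6238): the code shown every 30 seconds is derived from the shared secret that the AWS wizard displays as a QR code. As an illustration only, here is a minimal sketch of that computation; note that real authenticator seeds are base32-encoded and would be decoded first, whereas this sketch takes raw bytes:

```python
import hmac
import hashlib
import struct
import time


def totp(secret: bytes, t=None, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (RFC 6238, SHA-1 variant)."""
    # The moving factor is the number of whole time steps since the epoch.
    counter = int((t if t is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks an offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)


# Using the RFC 6238 test secret (not a real AWS seed) at t=59 seconds:
print(totp(b"12345678901234567890", t=59))
```

Because both the app and AWS derive the code from the same secret and the current time, AWS can verify the code without any network connection from the phone.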

Very easy. We could manage the user's password here, should the need arise. We could also define custom policies for our users here, or see the ones available to us. By default, there are a few available.

Here, we can define a password policy, which dictates the rules for creating new passwords as well as their lifecycle. I will increase the required password complexity and apply this password policy.
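The complexity rules set in the console map to account-level settings such as minimum length and required character classes. As a hypothetical local sketch of the checks such a policy enforces (the policy values below are illustrative, not AWS defaults):

```python
import string

# Illustrative policy values, mirroring the IAM password-policy options.
POLICY = {
    "MinimumPasswordLength": 12,
    "RequireUppercaseCharacters": True,
    "RequireLowercaseCharacters": True,
    "RequireNumbers": True,
    "RequireSymbols": True,
}


def satisfies_policy(password: str, policy: dict = POLICY) -> bool:
    """Return True if the password meets every rule in the policy."""
    checks = [len(password) >= policy["MinimumPasswordLength"]]
    if policy["RequireUppercaseCharacters"]:
        checks.append(any(c.isupper() for c in password))
    if policy["RequireLowercaseCharacters"]:
        checks.append(any(c.islower() for c in password))
    if policy["RequireNumbers"]:
        checks.append(any(c.isdigit() for c in password))
    if policy["RequireSymbols"]:
        checks.append(any(c in string.punctuation for c in password))
    return all(checks)
```

On AWS itself, the equivalent settings would be applied account-wide with the `aws iam update-account-password-policy` CLI command rather than checked client-side.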

Remember that password policies are not the same as IAM policies. Another way to authenticate users is through access keys.

Access keys are used only through the CLI or an API. By clicking on create access keys, you will generate two things: an access key and a secret key, which work like a username and password, respectively. One of the most interesting elements of IAM is IAM roles. Roles can help us in many ways. For instance, let me create one to use with EC2. We can create a role, give it permissions, and then assign this role to an EC2 instance.
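Behind the console wizard, an EC2 role is just a role whose trust policy allows the EC2 service to assume it. A sketch of that trust policy, as it appears for EC2 roles, looks like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "ec2.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The trust policy controls *who* may assume the role; the permissions we attach next control *what* the role may do.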

And the instance would have exactly the permissions that we defined here. To this one, I will give full S3 access. So any instance with this role would be able to manage all our buckets and add, delete, and modify objects. As with a standard IAM user, this is very useful for automation or scripting. Using this role, we do not have to distribute access keys to the instance in order to make API calls or use the CLI against the resources associated with this role. One of AWS's best practices is to assign permissions using groups. We can create a group here, attach IAM policies to it, and then associate this group with the IAM users with which we want to share this access.
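The "full S3 access" grant attached to the role corresponds to a permissions policy along these lines (a sketch of the effect of the managed AmazonS3FullAccess policy):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}
```

In practice you would usually scope `Resource` down to specific bucket ARNs rather than granting `*`, following the principle of least privilege.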

It also allows you to be much more efficient. There's no need to create custom policies all the time anymore, and it's very easy to manage, add, and remove users. IAM is useful, but it has limits. It has a limit of 5,000 IAM users per account. That might seem like a lot to you, but it's nothing for a large enterprise. So AWS has another way to authenticate users. You could maintain your user database outside AWS and create roles for those external identities. It works mostly the same way, but this time we don't want an EC2 role; we want a role for identity provider access. We could, for example, integrate our AWS environment with Google or Facebook, or any other provider compatible with AWS standards.

For the users, the login credentials are the same as in the source user database, and every time a user authenticates with AWS, he or she assumes the role that we are creating here, with the rights that we specify in its policies, much like an IAM user. It's a much simpler way to manage a large number of users who need to authenticate in more than one place. We can see here that AWS has set a trusted entity for us, along with a set of conditions that must be met, so you can also filter which users are allowed to assume this role. IAM can be an extensive topic, but with this basic knowledge, you're already able to do quite a lot. See you at the next lecture.
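For a web identity provider such as Google, the trusted entity and its conditions live in the role's trust policy. A hedged sketch is below; the `aud` value is a hypothetical application client ID, which is how the condition filters which authenticated users may assume the role:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Federated": "accounts.google.com" },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "accounts.google.com:aud": "your-app-client-id.apps.googleusercontent.com"
        }
      }
    }
  ]
}
```

Note that federated users call `sts:AssumeRoleWithWebIdentity` rather than `sts:AssumeRole`, exchanging the provider's token for temporary AWS credentials.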


About the Author
Students
29301
Labs
7
Courses
1

Eric Magalhães has a strong background as a Systems Engineer for both Windows and Linux systems and currently works as a DevOps Consultant for Embratel. Lazy by nature, he is passionate about automation and anything that can make his job painless; thus his interest in topics like coding, configuration management, containers, CI/CD, and cloud computing went from a hobby to an obsession. He holds multiple AWS certifications and, as a DevOps Consultant, helps clients understand and implement the DevOps culture in their environments. Besides that, he plays a key role in the company developing automation using tools such as Ansible, Chef, Packer, Jenkins, and Docker.