Designing Highly Available, Cost-Efficient Cloud Solutions
Developing Cloud Solutions
An introduction to the AWS components that help us develop highly available, cost-efficient solutions.
- Understand the core AWS services, uses, and basic architecture best practices
- Identify and recognize cloud architecture considerations, such as fundamental components and effective designs
Elasticity and Scalability
Regions and AZs
Amazon Elastic Load Balancing
Amazon Simple Queue Service
Amazon Elastic IP Addresses
Amazon Auto Scaling
Identify the appropriate techniques to code a cloud solution
Recognize and implement secure procedures for optimum cloud deployment and maintenance
Using Amazon SQS
Using Amazon SNS
Using Amazon SWF
Using Cross-Origin Resource Sharing (CORS)
If you have thoughts or suggestions for this course, please contact Cloud Academy at firstname.lastname@example.org.
Welcome back. Over the next few minutes I will walk you through some exam topics related to APIs, SDKs, and HTTP responses. API stands for Application Programming Interface, and AWS exposes a REST API. You will probably never need to use the API directly, so don't worry about it too much. The AWS API communicates over the Hypertext Transfer Protocol, also known as HTTP. The other topics of this lecture are all based, in one way or another, on the behavior of this API. SDK stands for Software Development Kit. The job of an SDK is to simplify using AWS services in your applications, with an API tailored to your programming language or platform. In other words, all the SDKs communicate with AWS through that same REST API. For the exam you don't need to know how to use any of the SDKs; you simply need to know what the official SDKs are and the most common AWS API calls for each service, and that last detail can be painful.
These are the official SDKs available, and for the certification exam that's all you need to know about them. If you're a developer, you're probably familiar with these HTTP responses, but since this is a course for everyone, I'll quickly summarize them. The 100-level codes are used only for informational purposes, 200 means the request was successful, 300 refers to redirections, 400 represents client-side errors, and 500 represents server-side errors. Since the AWS API uses HTTP to communicate, those are the only kinds of responses that interest us. In this table I've put some examples of S3 error responses with their associated HTTP status codes. You should be aware that the full list is actually much bigger. Let's now get some hands-on experience with an AWS SDK and see some HTTP responses. The best SDK for that, in my opinion, is the Python SDK, which is called boto. I've set up an instance on AWS with an IAM role to perform a few actions with it; my IAM role has full access to S3. This is an Amazon Linux instance, and it already has the Python version two SDK installed.
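The status-code classes above follow directly from the first digit of the code. As a minimal illustration (this helper is my own, not part of any AWS SDK), you could classify any HTTP status like this:

```python
def describe_status(code):
    """Return the response class for an HTTP status code.

    The class is determined by the hundreds digit:
    1xx informational, 2xx success, 3xx redirection,
    4xx client error, 5xx server error.
    """
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return classes.get(code // 100, "unknown")


# A few examples from the lecture:
print(describe_status(200))  # success
print(describe_status(409))  # client error
print(describe_status(500))  # server error
```

This is the same grouping the AWS API relies on: anything in the 400 range means your request was at fault, anything in the 500 range means the service side failed.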
Beware that the boto SDK is currently up to version three, but version two still shows the HTTP response code, and that's why we're going to use it. Let's open the S3 console to keep track of what's going on.
I only have the bucket that I created for the course's pre-signed URL lecture. To use the Python SDK, let's open the terminal and type python. Now we need to import the SDK; we just need to type import boto and we're good to go. To work with S3 we need a connection object, which I will assign to a variable called con. If we were not using an IAM role, we would have to insert our access keys with this command. Here is our connection object; good so far. Now let's use this object to create a bucket.
Just type the command and specify the bucket name. I'm pretty sure that a bucket with this name already exists somewhere in the Amazon system, so I expect to receive an error. As expected, here is the error, and we can also see the HTTP response code associated with it, which is 409 Conflict.
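The 409 we just saw is one entry in the S3 error table mentioned earlier. As a hedged sketch (this mapping is a small sample I've assembled, not an exhaustive or official list), the pairing of S3 error codes with HTTP status codes looks like this:

```python
# A few well-known S3 error codes and the HTTP status each maps to.
# The real list published by AWS is much longer than this sample.
S3_ERROR_STATUS = {
    "BucketAlreadyExists": 409,  # the conflict we hit when creating the bucket
    "NoSuchBucket": 404,
    "NoSuchKey": 404,
    "AccessDenied": 403,
    "InternalError": 500,
    "SlowDown": 503,
}

# Look up the error from the demo:
print(S3_ERROR_STATUS["BucketAlreadyExists"])  # 409
```

Notice how every entry obeys the classes from before: naming collisions and missing resources are 4xx (our fault), while InternalError and SlowDown are 5xx (the service's side).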
So let me now try to create a bucket with a more original name and see what happens. The response is a bucket object, which means our bucket was successfully created; we can check it over here. Now let's do something else: let's delete this same bucket.
Since there was no error, our bucket was deleted as expected. Let me create the bucket again, and then I'll put something inside it using the Python SDK. This could be a little trickier, but it's not too complicated if you're familiar with Python. First, create an object for our bucket and assign it to a variable; I will call it Max.
The object is created. Now we need to import a class into our code. Done. Now we can create objects with the Key class; keys are the file objects in a bucket.
I'll now assign a name to our file; I will call it secret.txt. Now we need to put some content in this file. I'll do it with a method that inserts a given string into the file.
That's enough for us; done. Now we can check our file in the S3 console to see if everything is okay. So far so good. But this file is private, because that is S3's default behavior. You know that, right? Access denied. Remember the lecture on pre-signed URLs, where I mentioned that you need an SDK to generate them? Let's have a look at how that's done. I'm creating a variable to hold the URL. Now I'll call our con object, and here I can specify the time to live for this URL in seconds; let's say one minute. And the type of permission that I'm granting, which in this case is GET, because, remember, it's talking to the AWS API. Now I only have to define the bucket name and the name of the file, and we're all set.
Done. Now we can copy the URL and paste it into the browser to see if it's working. Nice, it works. Here's our top-secret text file that we created with the Python SDK.
I hope you enjoyed this lecture.
Andrew is fanatical about helping business teams gain the maximum ROI possible from adopting, using, and optimizing Public Cloud Services. Having built 70+ Cloud Academy courses, Andrew has helped over 50,000 students master cloud computing by sharing the skills and experiences he gained during 20+ years leading digital teams in code and consulting. Before joining Cloud Academy, Andrew worked for AWS and for AWS technology partners Ooyala and Adobe.