In this course, we will examine various options for operating programmatically with AWS, including Application Programming Interfaces, or APIs; Software Development Kits, or SDKs; and the AWS Command Line Interface, or CLI.
- A high-level introduction to AWS APIs and SDKs; the AWS CLI; and the concept of infrastructure as code
- System administrators
- Anyone else who is looking to learn different ways to automate or scale their interactions with AWS
- You should have a basic understanding of AWS and its services
- You should also have some exposure to interacting with a command-line shell or terminal using an operating system such as Windows, Linux, or macOS
Hello, and welcome to this lecture, where I will be examining the various options we have when provisioning and operating programmatically with AWS, including:
Application Programming Interfaces, or APIs;
Software Development Kits, or SDKs; the
AWS Command Line Interface, or CLI; and
Infrastructure as Code, or IaC.
Each of these options has different use cases, which I will cover throughout this lecture. Now you’re probably already familiar with the browser-based AWS console and how it can be used to provision and manage resources. And while the console may be useful for performing simple, one-time tasks like launching a single EC2 instance or creating an S3 bucket, more robust options exist for developers and system administrators who need to automate repetitive tasks or perform actions at scale, like launching twenty EC2 instances or creating fifty buckets in S3. In these cases, it makes more sense to interact with AWS programmatically using code and scripts rather than just pointing and clicking through the console, where completing such actions would require a lengthy series of manual, error-prone steps that must be completed sequentially, one at a time rather than in a single batch. So let’s begin our survey of options for operating programmatically with AWS by taking a look at APIs and SDKs.
APIs enable different software systems and applications to communicate with one another through published, agreed-upon interfaces. Most actions you perform within the AWS console ultimately invoke one or more AWS APIs under the covers. So when you click a button in the AWS console to create an S3 bucket, an S3 API action called CreateBucket is actually being invoked. And because you’re already signed in to the console with a user who is properly authenticated and authorized, that API call to the CreateBucket action will succeed. But because of the open nature of APIs, it’s also possible for developers to write their own code that calls the same CreateBucket API action directly from within a custom application and results in the same outcome: the creation of a new S3 bucket.
Keep in mind, however, that when developers write custom applications that invoke AWS APIs, they are responsible for properly signing each API request that requires authorization with an AWS access key ID and secret access key. This process requires developers to include either a special HTTP authorization header or a unique signature query string value as part of each API request. This allows AWS to verify the identity of the requester and ensure they are authorized to perform the requested operation. API requests may also need to include special hashed values or timestamps to ensure the integrity of the request data being sent in transit, as well as to protect against any replay attacks. And it’s worth noting that all API requests are logged in AWS CloudTrail, regardless of whether the request originated from the console, from custom code, or from any of the AWS SDKs we’re about to discuss.
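To make that signing process a bit more concrete, here is a minimal sketch of one piece of it: the key-derivation step of AWS Signature Version 4, written with only the Python standard library. The credential and scope values shown are placeholders, not real keys, and a full implementation would also build a canonical request and a string to sign — all of which the SDKs discussed next handle for you automatically.

```python
import hashlib
import hmac

def derive_signing_key(secret_key: str, date_stamp: str, region: str, service: str) -> bytes:
    """Derive a SigV4 signing key via the documented chain of HMAC-SHA256 steps.

    date_stamp is the request date in YYYYMMDD form. The resulting key is
    scoped to that date, region, and service, which limits the damage if a
    signature leaks and helps protect against replay attacks.
    """
    k_date = hmac.new(("AWS4" + secret_key).encode(), date_stamp.encode(), hashlib.sha256).digest()
    k_region = hmac.new(k_date, region.encode(), hashlib.sha256).digest()
    k_service = hmac.new(k_region, service.encode(), hashlib.sha256).digest()
    return hmac.new(k_service, b"aws4_request", hashlib.sha256).digest()

# Placeholder secret key for illustration only -- never hard-code real credentials
signing_key = derive_signing_key("EXAMPLE-SECRET-KEY", "20240101", "us-east-1", "s3")
```

The signing key is then used to sign the string to sign, and the resulting signature travels in the HTTP authorization header or query string mentioned above.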
Because of the additional complexity I just mentioned that is required when executing AWS API calls directly from custom code, AWS has also developed a number of language-specific APIs for AWS services known as Software Development Kits, or SDKs. These SDKs integrate with many common development languages and platforms and automatically handle this process of crafting valid API requests that are properly signed. AWS SDKs are available for programming languages including:
Python, JavaScript, Java, Go, Ruby, PHP, C++, and .NET, among many others.
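As a sketch of how an SDK streamlines this, here is roughly what the fifty-bucket scenario from earlier might look like using boto3, the AWS SDK for Python. The client is passed in as a parameter — in real use you would supply `boto3.client("s3")` — and the bucket names are hypothetical.

```python
def create_buckets(s3, bucket_names):
    """Create each bucket in turn.

    `s3` is expected to behave like a boto3 S3 client (in real use:
    import boto3; s3 = boto3.client("s3")). The SDK constructs and
    signs every underlying CreateBucket API request for you.
    """
    created = []
    for name in bucket_names:
        s3.create_bucket(Bucket=name)  # each call maps to the CreateBucket API action
        created.append(name)
    return created
```

Because the client is injected rather than created inside the function, the loop can be exercised with a stub in tests, while the SDK handles credentials and request signing in production.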
So things like APIs and SDKs are useful for software developers, but what about system administrators, or anyone else looking to automate or streamline their interactions with AWS using command-line tools and scripts? For these users, the AWS CLI may be a more appropriate solution. Once you’ve installed and configured the AWS CLI, you can begin making calls to AWS services directly from the command line. And because you specify a set of credentials to use with the CLI as part of the initial configuration process, all calls made from the CLI will be automatically authenticated using those credentials. Keep in mind, however, that for a call to succeed, the user must also be authorized to perform that action. Now in addition to the overarching AWS CLI, AWS also offers specialized command-line tools, including the AWS Tools for PowerShell, along with dedicated CLIs for specific services such as:
The Amazon Elastic Container Service, or ECS, and the
AWS Serverless Application Model, or SAM.
All of these command-line tools support the use of scripts to enable automation and can be used as part of a continuous integration and continuous deployment, or CI/CD, pipeline in a DevOps environment. And these scripts can be version controlled using a tool like Git to centrally track and manage changes to them over time. But perhaps more importantly, using scripts to automate common tasks reduces the opportunity for human error and supports more robust, reliable, and repeatable processes across your enterprise. And just like I mentioned with the APIs and SDKs, every interaction with AWS via the command line ultimately maps to an AWS API call that can be captured in AWS CloudTrail for auditing purposes.
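As a minimal sketch of that kind of scripted automation, the following Python function builds one AWS CLI command per bucket and, with `dry_run=False`, would execute them with subprocess. `aws s3 mb` is the real CLI command for making a bucket; the bucket names themselves are hypothetical.

```python
import subprocess

def make_buckets(bucket_names, dry_run=True):
    """Build (and optionally run) one `aws s3 mb` command per bucket.

    With dry_run=True the commands are only returned for inspection,
    which keeps the batch reviewable before anything is provisioned.
    """
    commands = [["aws", "s3", "mb", f"s3://{name}"] for name in bucket_names]
    if not dry_run:
        for cmd in commands:
            # check=True stops the batch on the first failure
            subprocess.run(cmd, check=True)
    return commands
```

A script like this can live in version control alongside your application code and be invoked from a CI/CD pipeline, which is exactly the repeatability benefit described above.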
To learn more about AWS developer services for CI/CD and AWS CloudTrail, I invite you to check out our existing content here:
Now, one final aspect of operating programmatically with AWS that I’d like to mention is infrastructure as code. So up to this point, we’ve discussed how we can use code and scripts to automate our interactions with AWS for things like provisioning resources. But it’s also possible for us to leverage these same principles when it comes to representing the infrastructure itself as well as its configuration. So let’s say for instance that we’ve gone through the trouble of setting up a highly available application with load balancers, EC2 instances, RDS databases, and S3 buckets all within a single AWS region. And maybe we configured all of this manually through the console when we first set it up.
But now let’s say we need to stand up this exact same infrastructure in another AWS region for disaster recovery purposes. What is the likelihood that, using just the console, we could replicate our entire application’s infrastructure, along with all of its configuration settings in another region such that the setup in both regions is completely identical? Well, it’s possible, of course, but wouldn’t it be easier if we could declaratively represent all of our infrastructure as code so that we could reproduce our entire environment on demand, as many times as we needed and across as many regions and accounts as we needed? That’s where CloudFormation comes into play.
With CloudFormation, you create templates that declaratively describe your infrastructure; when you deploy a template, CloudFormation provisions the resources it describes as a single unit called a stack. And to get started, AWS makes many sample templates available for you to download and customize here. You can then use these templates to quickly provision all of the resources in a stack in a repeatable, consistent way. Now, if you’re interested in exploring CloudFormation more in-depth, I encourage you to check out our AWS CloudFormation course here.
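To give a flavor of what such a template looks like, here is a minimal sketch of a CloudFormation template in YAML that declares a single S3 bucket. The bucket name is hypothetical and would need to be globally unique in practice.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal example stack containing one S3 bucket
Resources:
  ExampleBucket:            # logical ID used to reference the resource within the stack
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-example-bucket-20240101   # hypothetical; S3 bucket names must be globally unique
```

Deploying this template creates a stack containing the bucket, and deploying the same template in another region or account reproduces the same infrastructure, which is the disaster-recovery scenario described above.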
Danny has over 20 years of IT experience as a software developer, cloud engineer, and technical trainer. After attending a conference on cloud computing in 2009, he knew he wanted to build his career around what was still a very new, emerging technology at the time — and share this transformational knowledge with others. He has spoken to IT professional audiences at local, regional, and national user groups and conferences. He has delivered in-person classroom and virtual training, interactive webinars, and authored video training courses covering many different technologies, including Amazon Web Services. He currently has six active AWS certifications, including certifications at the Professional and Specialty level.