With the ever-expanding service catalog being developed by the engineers at AWS, it’s easy to get confused when it comes to understanding which AWS Compute service you need and which service you should be using for your deployments. “Which service offers the quickest deployment?”, “Which service offers the best managed solution?”, and “Which AWS Compute service do I need?” are some of the most frequently asked questions.
Whether you are looking for the right compute, storage, database, or networking service, there is an array to choose from, each with a unique list of benefits, use cases, methodologies, and mechanisms to suit your specific need. However, if you don’t know what’s available, you’re likely to work less efficiently, consume more resources, and consequently incur greater costs as a result.
In this post, we’ll explore the range of compute services available for AWS to help you choose the one that’s right for you.
Getting clear on “compute”
Before going any further, let’s clarify what ‘compute’ actually refers to so that we have an understanding of the services that fall into this category.
Compute resources can be considered the brains and processing power required by applications and systems to carry out computational tasks via a series of instructions.
Essentially, compute is closely related to common server components that many of you will be familiar with such as central processing units (CPUs) and random access memory (RAM). With that in mind, a physical server within a data center would be considered a compute resource as it may have multiple CPUs and many gigabytes of RAM to process instructions given by the operating system and applications.
Within AWS, compute resources can be consumed in different quantities, for different periods of time, and across a range of use cases offering a wide scope of performance options. Choosing the right AWS compute resource will really depend on your requirements, so understanding this is key.
With that in mind, you must first define the requirements for your solution: What are you trying to achieve? For example, you may just want to deploy a couple of instances to act as bastion hosts within the public subnet of your VPC, or provision a number of servers to act as a web tier receiving HTTP requests for your website. Or, you may need to deploy new applications using Docker within your AWS environment.
These scenarios all require compute resources in some form to implement the solution. However, each would be best implemented using a different compute service. Knowing this can save you time, money, and effort across your deployments.
AWS Compute options
The range of compute services available is growing all the time, with two of the most recent, Amazon Lightsail and AWS Batch, released at the end of 2016. The current AWS Compute category (at the time of this post) consists of six different services. Here is a high-level overview of each:
Amazon Elastic Compute Cloud (EC2)
EC2 is the most common compute service that AWS offers. It allows you to deploy a selection of on-demand instances offering a wide array of different performance benefits within your AWS environment. These can be scaled up and down as necessary.
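As a quick illustration, here is a minimal sketch of launching a single on-demand instance with boto3, the AWS SDK for Python. The AMI ID, key pair, and subnet below are hypothetical placeholders; substitute values from your own account.

```python
# Sketch of launching one EC2 instance (e.g. a bastion host) with boto3.
# The AMI ID, key pair name, and subnet ID are hypothetical placeholders.

def bastion_launch_params(ami_id, key_name, subnet_id):
    """Build the keyword arguments for ec2.run_instances."""
    return {
        "ImageId": ami_id,
        "InstanceType": "t2.micro",  # a small general-purpose instance type
        "MinCount": 1,
        "MaxCount": 1,
        "KeyName": key_name,
        "SubnetId": subnet_id,
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured locally

    ec2 = boto3.client("ec2")
    ec2.run_instances(**bastion_launch_params(
        "ami-0123456789abcdef0", "my-key-pair", "subnet-0abc123"))
```

Scaling up is then a matter of raising `MaxCount`, or attaching instances like this to an Auto Scaling group.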
EC2 Container Service (ECS)
This service allows you to run Docker-enabled applications packaged as containers across a cluster of EC2 instances without requiring you to manage a complex and administratively heavy cluster management system.
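To make that concrete, here is a hedged sketch of a minimal ECS task definition for a single web container, built as the arguments you would pass to boto3’s `register_task_definition` call. The family name, image, and resource sizes are illustrative only.

```python
# Sketch of a minimal ECS task definition for one web container.
# The family name, image, and resource sizes are illustrative only.

def web_task_definition(image):
    """Build the keyword arguments for ecs.register_task_definition."""
    return {
        "family": "web-app",
        "containerDefinitions": [{
            "name": "web",
            "image": image,   # e.g. an image pushed to a Docker registry
            "cpu": 128,       # CPU units (1024 units = one vCPU)
            "memory": 256,    # hard memory limit in MiB
            "essential": True,
            "portMappings": [{"containerPort": 80, "hostPort": 80}],
        }],
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured locally

    ecs = boto3.client("ecs")
    ecs.register_task_definition(**web_task_definition("nginx:latest"))
```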
AWS Elastic Beanstalk
AWS Elastic Beanstalk is a managed service that will take your uploaded web application code and automatically provision and deploy the appropriate and necessary resources within AWS to make the web application operational. These resources can include other AWS services and features such as EC2 instances, Auto Scaling, application health monitoring, and Elastic Load Balancing.
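For a sense of how little code the service needs from you: on Elastic Beanstalk’s Python platform, my understanding is that the service looks for a WSGI callable named `application` in your uploaded code. A minimal app you could upload might look like this (a sketch, not a production application):

```python
# application.py: a minimal WSGI app of the kind Elastic Beanstalk's
# Python platform can serve. The callable name "application" follows
# that platform's convention; everything else here is illustrative.

def application(environ, start_response):
    """Respond to every request with a plain-text greeting."""
    body = b"Hello from Elastic Beanstalk!"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```

Upload this, and Elastic Beanstalk handles the servers, load balancing, and scaling around it.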
AWS Lambda
AWS Lambda is a service that allows you to run your own code in response to event-driven triggers in a highly available and scalable serverless environment, consuming compute resources only for the milliseconds during which your code runs. This makes it easy to build applications that respond quickly to new information.
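A Lambda function is just a handler that AWS invokes with the triggering event and a context object. The sketch below shows a minimal handler returning an API Gateway-style response; the function name and event shape are illustrative assumptions.

```python
import json

# Sketch of a minimal Lambda handler. AWS invokes it with the triggering
# event (a dict) and a context object; the event shape here is illustrative.

def handler(event, context):
    """Echo a greeting back in an API Gateway-style response."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

You pay only while the handler executes, and AWS scales invocations for you as events arrive.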
AWS Batch
This service is used to manage and run batch computing workloads. Batch computing often requires a vast amount of compute power across a cluster of compute resources, completing its processing by executing a series of jobs or tasks.
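Submitting that series of jobs is a single API call per job. Here is a sketch using boto3’s `submit_job`; the queue name, job definition, and rendering workload are hypothetical examples.

```python
# Sketch of submitting a series of jobs to an AWS Batch queue.
# The queue name, job definition, and workload are hypothetical examples.

def render_job_request(frame):
    """Build the keyword arguments for batch.submit_job for one unit of work."""
    return {
        "jobName": f"render-frame-{frame}",
        "jobQueue": "my-job-queue",
        "jobDefinition": "render-frame:1",
        "containerOverrides": {
            "command": ["python", "render.py", str(frame)],
        },
    }

if __name__ == "__main__":
    import boto3  # requires AWS credentials to be configured locally

    batch = boto3.client("batch")
    for frame in range(10):  # submit a small series of jobs
        batch.submit_job(**render_job_request(frame))
```

AWS Batch then schedules the jobs across its managed compute environment, scaling the underlying cluster to match the queue.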
Amazon Lightsail
Amazon Lightsail is essentially a virtual private server (VPS) backed by AWS infrastructure, much like an EC2 instance but without as many configurable steps throughout its creation. It has been designed to be simple, quick, and easy to use for small-scale use cases by small businesses or single users.
Other compute services
Some compute services have been created in response to the requirements and requests of the community. As the consumers of cloud resources, we want to be able to provision these quickly, reliably, and with minimal manual input to help reduce errors along the way.
AWS recognizes that not every implementation or solution that requires a compute resource fits the parameters and restrictions of existing services such as Amazon EC2 or AWS Elastic Beanstalk. New services are born and developed to meet and serve a specific compute request.
As other technologies such as Docker and serverless computing become more prevalent within the cloud computing environment, the need to develop compute resources optimized for these technologies becomes a must. In fact, some of these technologies are only practical because such services exist. For example, AWS Lambda allows customers to take advantage of serverless computing. This continual evolution of services ensures that customers can take advantage of the latest technologies within AWS.
Other services, such as AWS Elastic Beanstalk, have been designed for different purposes, such as enhanced deployment management, bringing convenience and simplicity to the customer.
The solutions and resources provisioned by AWS Elastic Beanstalk could instead be created manually using other services, with your application code imported by hand. Using Elastic Beanstalk, that manual provisioning and configuration is taken care of by the service itself.
This is perfect for engineers who may not have the familiarity or skills with AWS that they need to deploy, monitor, and scale the correct environment to run their developed applications themselves. Instead, this responsibility is passed on to AWS Elastic Beanstalk to deploy the correct infrastructure to run the uploaded code. This provides a simple, effective, and quick solution to the application deployment rollout.
As I mentioned previously, it’s important to understand which service options are available to you. Selecting the most appropriate service for your needs can help you save money by reducing the amount of internal effort required, from a personnel perspective alone. Using the AWS Elastic Beanstalk example above, if you moved from manual deployments to using this service, then time, efficiency, resources, and cost all benefit by passing additional responsibility, specifically around provisioning, on to the AWS service.
This ultimately allows you to spend more time developing great new applications and less time planning your deployment strategies.
Which AWS Compute service do I need?
So, when you find yourself asking “Which AWS Compute service do I need?”, here are some questions that you’ll need to answer:
- What is the end goal for your deployment? Which aspect is most important to the solution: Is it deployment time, simplicity, management, security, responsibility, cost, or something else? Knowing this will help you select the features that best meet your requirements.
- What are your compute requirements from a performance perspective? How much CPU, memory, and network bandwidth do you need? Although some services do not require all of this information, it’s still recommended that you know the minimum specifications for your application or service deployment.
- Do you know which AWS Compute options are available to you that are suitable for your deployment? If the answer is no, I recommend that you invest time and effort into gaining this knowledge as it will ultimately help you deploy a robust and cost-effective solution.
If you would like to know more about the AWS Compute services in greater detail, I highly recommend our “Compute Fundamentals for AWS” course.
On completion of the 90+ minute course, you will:
- Understand compute resources
- Be able to explain each of the compute resources used within AWS
- Be able to select the most appropriate compute resource based on your requirements
- Understand the benefits of Elastic Load Balancing and Auto Scaling and how they can work together to manage resource demand
The topics covered within this course include:
- What is Compute?
- Amazon Elastic Compute Cloud (EC2)
- Elastic Load Balancing & Auto Scaling
- Amazon ECS
- AWS Elastic Beanstalk
- AWS Lambda
- AWS Batch
- Amazon Lightsail