From Monolith to Serverless – The Evolving Cloudscape of Compute
Containers can help fragment monoliths into logical, easier-to-manage workloads.
With the ever-expanding service catalog being developed by the engineers at AWS, it's easy to get confused when it comes to understanding which AWS Compute service you need and which service you should be using for your deployments. "Which service offers me the quickest deployment?", "Which service offers the best managed solution?", and "Which AWS Compute service do I need?" are some of the most frequently asked questions.
Whether you are looking for the right compute, storage, database, or networking service, there is an array to choose from, each with a unique list of benefits, use cases, methodologies, and mechanisms to suit your specific need. However, if you don't know what's available, you're likely to introduce inefficiencies, consume more resources than necessary, and consequently incur greater costs as a result.
In this post, we’ll explore the range of compute services available for AWS to help you choose the one that’s right for you.
Before going any further, let’s clarify what ‘compute’ actually refers to so that we have an understanding of the services that fall into this category.
Compute resources can be considered the brains and processing power required by applications and systems to carry out computational tasks via a series of instructions.
Essentially, compute is closely related to common server components that many of you will be familiar with such as central processing units (CPUs) and random access memory (RAM). With that in mind, a physical server within a data center would be considered a compute resource as it may have multiple CPUs and many gigabytes of RAM to process instructions given by the operating system and applications.
Within AWS, compute resources can be consumed in different quantities, for different periods of time, and across a range of use cases offering a wide scope of performance options. Choosing the right AWS compute resource will really depend on your requirements, so understanding this is key.
With that in mind, you must first define the requirements for your solution: What are you trying to achieve? For example, you may just want to deploy a couple of instances to act as Bastion Hosts within the public subnet of your VPC, or provision a number of servers to act as a web tier receiving HTTP requests for your website. Or, you may need to deploy new applications using Docker within your AWS environment.
These scenarios all require compute resources in some form to implement the solution. However, each would be best implemented using a different compute service. Knowing this can save you time, money, and effort across your deployments.
The range of compute services available is growing all the time, with two of the most recent, Amazon Lightsail and AWS Batch, released at the end of 2016. The current AWS Compute category (at the time of this post) consists of six different services. Here is a high-level overview of each:
Amazon EC2 (Elastic Compute Cloud) is the most common compute service that AWS offers. It allows you to deploy a selection of on-demand instances within your AWS environment, offering a wide range of performance options, and these can be scaled up and down as necessary.
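As a rough sketch of what launching an on-demand instance looks like programmatically (the AMI ID, helper names, and values below are illustrative placeholders, not anything specified in this post), the parameters map directly onto boto3's EC2 run_instances call:

```python
# Sketch: building the parameters for an on-demand EC2 launch with boto3.
# The AMI ID and instance type are placeholders -- substitute values from
# your own account and region.

def build_run_instances_params(ami_id, instance_type="t2.micro", count=1):
    """Build the keyword arguments for boto3's EC2 run_instances call."""
    return {
        "ImageId": ami_id,            # which machine image to boot
        "InstanceType": instance_type,
        "MinCount": count,
        "MaxCount": count,            # raise MaxCount to launch more copies
    }

def launch(params):
    # Requires AWS credentials; not executed in this local sketch.
    import boto3
    return boto3.client("ec2").run_instances(**params)

params = build_run_instances_params("ami-12345678")
print(params["InstanceType"])  # → t2.micro
```

Scaling up and down is then a matter of launching more instances with these parameters, or terminating ones you no longer need.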
Amazon ECS (EC2 Container Service) allows you to run Docker-enabled applications packaged as containers across a cluster of EC2 instances, without requiring you to build and operate a complex and administratively heavy cluster management system yourself.
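To make "packaged as containers" concrete, a minimal task definition for a single container might look like the sketch below (the family name, image, and limits are illustrative placeholders); in practice this dict would be passed to boto3's ecs.register_task_definition:

```python
# Sketch of an ECS task definition for one Docker container.
# Names, image, and limits are illustrative placeholders.

task_def = {
    "family": "web-app",                  # logical name for this definition
    "containerDefinitions": [{
        "name": "web",
        "image": "nginx:latest",          # any Docker image
        "memory": 256,                    # hard memory limit in MiB
        "portMappings": [{"containerPort": 80, "hostPort": 80}],
    }],
}
```

ECS then handles placing and running tasks based on this definition across the instances in your cluster.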
AWS Elastic Beanstalk is a managed service that will take your uploaded web application code and automatically provision and deploy the appropriate and necessary resources within AWS to make the web application operational. These resources can include other AWS services and features such as EC2 instances, Auto Scaling, application health monitoring, and Elastic Load Balancing.
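Although Elastic Beanstalk provisions these resources for you, you can still steer it. As a hedged sketch (the values are illustrative, not recommendations), the option settings below show how you might size the auto-scaled tier it creates; they map onto the OptionSettings parameter of boto3's elasticbeanstalk.create_environment:

```python
# Sketch: Elastic Beanstalk option settings constraining the resources the
# service provisions on your behalf. Values are illustrative placeholders.

option_settings = [
    {"Namespace": "aws:autoscaling:asg",
     "OptionName": "MinSize", "Value": "2"},
    {"Namespace": "aws:autoscaling:asg",
     "OptionName": "MaxSize", "Value": "4"},
    {"Namespace": "aws:autoscaling:launchconfiguration",
     "OptionName": "InstanceType", "Value": "t2.micro"},
]
```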
AWS Lambda is a service that allows you to run your own code in response to event-driven triggers, billed in increments of mere milliseconds of compute time, in a highly available and scalable serverless environment. This makes it easy to build applications that respond quickly to new information.
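A minimal example of such event-driven code: a handler that responds to an S3 "object created" notification by returning the bucket and key that triggered it. The bucket and key names are placeholders, and the sample event is a trimmed-down version of the documented S3 notification shape:

```python
# Sketch of a Lambda handler triggered by an S3 object-created event.
# The event shape follows the S3 notification record structure.

def handler(event, context):
    record = event["Records"][0]
    return {
        "bucket": record["s3"]["bucket"]["name"],
        "key": record["s3"]["object"]["key"],
    }

# Local invocation with a trimmed sample event (context is unused here):
sample_event = {
    "Records": [{"s3": {"bucket": {"name": "my-bucket"},
                        "object": {"key": "uploads/report.csv"}}}]
}
print(handler(sample_event, None))
```

In Lambda itself, no servers are provisioned or managed by you: the service runs the handler on demand whenever the trigger fires.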
AWS Batch is used to manage and run batch computing workloads. Batch computing requires a vast amount of compute power across a cluster of compute resources to complete batch processing by executing a series of jobs or tasks.
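The two pieces AWS Batch asks for are a job definition (what to run) and a job submission (which queue to run it on). The names, image, and sizes below are illustrative placeholders; the dicts mirror the parameters of boto3's batch.register_job_definition and batch.submit_job:

```python
# Sketch: an AWS Batch containerized job definition plus a submission.
# All names and values are illustrative placeholders.

job_definition = {
    "jobDefinitionName": "nightly-transcode",
    "type": "container",
    "containerProperties": {
        "image": "my-org/transcoder:latest",  # placeholder Docker image
        "vcpus": 4,
        "memory": 2048,                       # MiB
        "command": ["transcode", "--input", "input.mp4"],
    },
}

submit_params = {
    "jobName": "transcode-run-001",
    "jobQueue": "default-queue",
    "jobDefinition": "nightly-transcode",
}
```

Batch then schedules the jobs across a managed compute environment, scaling the underlying cluster to match the queued work.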
Amazon Lightsail is essentially a virtual private server (VPS) backed by AWS infrastructure, much like an EC2 instance but without as many configuration steps during its creation. It has been designed to be simple, quick, and easy to use for small-scale use cases, such as small businesses or single users.
Some compute services have been created in response to the requirements and requests of the community. As the consumers of cloud resources, we want to be able to provision these quickly, reliably, and with minimal manual input to help reduce errors along the way.
AWS recognizes that not every implementation or solution that requires a compute resource fits the parameters and restrictions of existing services such as Amazon EC2 or AWS Elastic Beanstalk. New services are born and developed to meet and serve a specific compute request.
As other technologies such as Docker and serverless computing become more prevalent within the cloud computing environment, developing compute resources optimized for these technologies becomes a must. In fact, some of these technologies are only practical because such purpose-built services exist; AWS Lambda, for example, is what allows customers to take advantage of serverless computing. This continual evolution of services ensures that customers can take advantage of the latest technologies within AWS.
Other services have been designed for different purposes, such as enhanced deployment management, which brings convenience and simplicity to the customer; AWS Elastic Beanstalk is a prime example.
The resources that AWS Elastic Beanstalk provisions could all be created manually using the individual services, with your application code deployed by hand. With Elastic Beanstalk, that manual provisioning and configuration are taken care of by the service itself.
This is perfect for engineers who may not have the familiarity or skills with AWS that they need to deploy, monitor, and scale the correct environment to run their developed applications themselves. Instead, this responsibility is passed on to AWS Elastic Beanstalk to deploy the correct infrastructure to run the uploaded code. This provides a simple, effective, and quick solution to the application deployment rollout.
As I mentioned previously, it's important to understand which service options are available to you. Selecting the most appropriate service for your needs can save you money by reducing the amount of internal effort required, from a personnel perspective alone. Using the AWS Elastic Beanstalk example above, moving from manual deployments to this service saves time, effort, resources, and cost by passing the provisioning responsibility onto the AWS service.
This ultimately allows you to spend more time in developing great new applications and less time on planning your deployment strategies.
So, when you find yourself asking "Which AWS Compute service do I need?", start by defining your requirements and what you are trying to achieve.
If you would like to know more about the AWS Compute services in greater detail, I highly recommend our “Compute Fundamentals for AWS” course.
The 90+ minute course covers each of the AWS Compute services discussed here in greater detail, along with their use cases.