Amazon EC2 Container Service and Elastic Beanstalk: Docker on AWS

Technology keeps moving. In just a few years, we’ve gone from servers running on dedicated hardware, through virtualization, to cloud computing. And now we’ve reached the container age. As we will see, the Amazon EC2 Container Service (ECS) has made containers a major element of the AWS deployment family. We’ll soon discuss three ways to run Docker containers on AWS, but first a few words about why you would want to.

Traditional servers living in data centers could do only one thing at a time. There was often one server for Active Directory, another for Exchange, and a third running a database, and it was common for any or all of these servers to have excess processing capacity going to waste. Virtualization was developed to make use of that extra capacity: a single physical server could host multiple virtual servers, allowing for far greater scalability and more efficient resource utilization.

But even virtualization required significant up-front hardware (and software license) investments and couldn’t completely remove the risks of excess capacity. That’s where cloud computing – with its multi-tenancy models, on-demand pricing, and dynamic scalability – proved so powerful.

But modern DevOps practices demand the ability to quickly build servers and ship code to run in different environments. Welcome to the world of containers: extremely lightweight, abstracted user-space instances that can be easily launched on any compatible server and reliably provide a predictable experience. Containers – of which the best-known example is Docker – achieve all this by sharing the Linux kernel of their host rather than carrying a full operating system of their own.

Docker

The DevOps development process depends, in large part, on portability. Even a slight difference between your development, test, and production environments can completely break your application. Traditional development models follow a change management process to solve these kinds of problems, but that process won’t fit into today’s rapid build-and-deploy cycles. Docker streamlines and automates the software development process: you simply package an application, along with all its dependencies, into standardized containers (defined by plain-text Dockerfiles) that include everything needed to run your application wherever you end up shipping it.

Docker uses a client-server architecture. The Docker client talks to the Docker daemon running on the host, which builds, runs, and manages your containers.
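
You can see this split on any host where Docker is installed – the docker version command reports on the client and the server (the daemon) separately:

    # The client and the daemon are separate programs; "docker version"
    # prints a Client section and a Server section.
    docker version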

In this post, we won’t talk all that much about running Docker natively. Instead, we will concentrate on AWS deployments. We’ll explore three ways to do that:

  1. Deploying Docker containers directly on an EC2 instance.
  2. Using Docker containers on Elastic Beanstalk.
  3. Docker cluster management using the AWS EC2 Container Service.

Deploying Docker containers on an EC2 instance

Docker can run almost anywhere: on a racked server, an old laptop, and perhaps, if you worked at it hard enough, even on a smartphone. And since it requires only a Linux kernel, there’s no reason it shouldn’t work on a Linux-powered Amazon EC2 instance. Let’s see how we can install and run Docker on Amazon Linux.

    1. Launch an Amazon Linux Instance.
    2. SSH into your EC2 instance and install the Docker engine.
      sudo yum install -y docker
    3. Start the Docker service.
      sudo service docker start
    4. Sign up for a Docker Hub Account.
    5. Search for the Ubuntu image on Docker Hub.
      sudo docker search ubuntu
    6. Download the Ubuntu image.
      sudo docker pull ubuntu
    7. Start the container. The -t and -i flags allocate a pseudo-TTY and keep stdin open so you get an interactive shell.
      sudo docker run -t -i ubuntu /bin/bash
    8. At the container's bash prompt (you are now inside the Ubuntu container, not on the host), install the Apache web server.
      apt-get update && apt-get install apache2
    9. Save the container as a new image. From the host (exit the container or open a second SSH session), commit the changes using the container ID and a repository name.
      sudo docker commit a2d424f5655ea nitheesh86/apache
    10. Push the image to Docker Hub (log in first with sudo docker login if you haven't already).
      sudo docker push nitheesh86/apache

At this point, sharing the image with others is easy. Just run this command on any other Docker-enabled server, anywhere on the Internet (the -d flag runs the container in the background, and -p 80:80 maps port 80 on the host to port 80 in the container):

docker run -d -p 80:80 nitheesh86/apache /usr/sbin/apache2ctl -D FOREGROUND

Using Docker containers on Elastic Beanstalk

Elastic Beanstalk is AWS’s Platform as a Service (PaaS) offering. All you have to do is upload your application code, and Elastic Beanstalk takes care of deployment, load balancing, and capacity provisioning. With a single click, you can start all the necessary application servers running. There is no charge for the Elastic Beanstalk service itself, just for the AWS resources you actually use.

Since AWS integrated Docker into Elastic Beanstalk, you can build and test your application on your local workstation and then deploy it directly to Elastic Beanstalk. As always, you install all the required software and dependencies through the Dockerfile. This file specifies the base image to use (Ubuntu, CentOS, etc.), the volumes to map, and the commands to run. Any new EC2 instance that Elastic Beanstalk launches to run your application will be configured according to the instructions in your Dockerfile.
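
To make that concrete, here is a minimal sketch of what such a Dockerfile might look like. The Ubuntu base image and the Apache package are illustrative assumptions (they match the image we built in the previous section), not requirements of Elastic Beanstalk:

    # Base image to build on (illustrative choice)
    FROM ubuntu:16.04

    # Install the web server that will serve the application
    RUN apt-get update && apt-get install -y apache2

    # Port the container listens on; Elastic Beanstalk maps it to the host
    EXPOSE 80

    # Keep Apache in the foreground so the container stays running
    CMD ["/usr/sbin/apache2ctl", "-D", "FOREGROUND"]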

Let’s deploy a sample application in a Docker container:

  1. Create a Dockerrun.aws.json file to deploy an existing Docker image. This file describes how to deploy a Docker container as an Elastic Beanstalk application. Your Dockerrun.aws.json file should look something like the sample shown just after these steps.

    Let me explain the file, line by line.

    • AWSEBDockerrunVersion – Specifies the version number as the value “1” for Single Container Docker environments.
    • Image – Specifies the Docker image, in an existing Docker repository, from which you’re building your Docker container.
    • Ports – Lists the ports to expose on the Docker container.
    • Volumes – Maps volumes from an EC2 instance to your Docker container.
    • Logging – Maps the log directory inside the container.
  2. Create a zip file containing your Dockerfile or Dockerrun.aws.json and any application files, then upload it to Elastic Beanstalk.
  3. From there, you can follow this guide to deploy your application.
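
Here is a minimal sketch of a single-container Dockerrun.aws.json using the keys described above. The image name, port, and directory paths are illustrative values (the image is the one we pushed to Docker Hub earlier); substitute your own:

    {
      "AWSEBDockerrunVersion": "1",
      "Image": {
        "Name": "nitheesh86/apache",
        "Update": "true"
      },
      "Ports": [
        {
          "ContainerPort": "80"
        }
      ],
      "Volumes": [
        {
          "HostDirectory": "/var/app/current",
          "ContainerDirectory": "/var/www/html"
        }
      ],
      "Logging": "/var/log/apache2"
    }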

Docker cluster management using the EC2 Container Service (ECS)

When your containers grow from one to many, maintenance can become a significant burden, and managing containers at scale requires proper cluster management software. Since “Amazon” and “scale” are like two sides of the same coin, it wasn’t much of a surprise when AWS created the EC2 Container Service to handle installation, operation, scaling, and general cluster management for you.

Container instances communicate with the ECS service through the ECS agent. When you add instances to a cluster, they are launched from an Amazon ECS-enabled AMI that already includes the agent.

EC2 Container Service Benefits

  • Simplified Cluster Management
    ECS creates and manages clusters of Docker containers. It can scale from one to thousands of containers spread across multiple Availability Zones.
  • Easy Scheduling
    ECS includes its own scheduler, which spreads containers across your cluster to balance availability and utilization.
  • Portability
    ECS uses the standard Docker daemon so that you can easily move your on-premises application to the AWS cloud and vice versa.
  • Resource Utilization
    A containerized application can make very efficient use of resources. You can choose to run multiple containers on the same EC2 instance.
  • Integrated with AWS services
    You can access most AWS features, such as Elastic IP addresses, VPC, ELB, and EBS.

Defining Amazon EC2 Container Service Building Blocks

  • Cluster
    A group of EC2 instances managed by ECS that lives inside your VPC. One cluster can contain multiple instance types and sizes and can be stretched across multiple Availability Zones.
  • Scheduler
    An integral part of each cluster. The scheduler is responsible for assigning containers to instances.
  • Container
    The lightweight, isolated runtime environments (not full virtual machines) that execute your application. You can run any number of containers in a single cluster.
  • Task Definition
    Blueprints for your application that define the way your containers will work; a sample task definition sketch appears after this list.
  • ECS AMI
    An Amazon Machine Image that includes the ECS Agent.
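
As promised above, here is a minimal sketch of a task definition. The family name, image, and CPU/memory values are illustrative assumptions; adjust them for your own application:

    {
      "family": "sample-webapp",
      "containerDefinitions": [
        {
          "name": "apache",
          "image": "nitheesh86/apache",
          "cpu": 10,
          "memory": 300,
          "portMappings": [
            {
              "containerPort": 80,
              "hostPort": 80
            }
          ],
          "essential": true
        }
      ]
    }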

Launch a simple EC2 Container Service container

Create a cluster.
Create a task definition. To get used to the process the first time through, I would suggest you stick with the sample task definition that’s provided. You have the option of modifying its parameters (CPU resources, port mappings, and so on) before you register it.

You can schedule a task to run once (ideal for batch jobs) or create a service to launch and maintain a specified number of copies of the task definition in your cluster. Since we want the sample application to run continuously, let’s choose “Create a Service.”
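
If you prefer the command line to the console, the equivalent calls look roughly like this. This is a sketch that assumes the AWS CLI is installed and configured; the cluster, service, and task definition names are illustrative:

    # Run the task once (batch-style)
    aws ecs run-task --cluster default --task-definition sample-webapp --count 1

    # Or create a service that keeps two copies of the task running
    aws ecs create-service --cluster default --service-name sample-webapp-service \
        --task-definition sample-webapp --desired-count 2
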
Here’s the final step to launch our containers. The more instances you have in your cluster, the more tasks you can run on it. Click Review and Launch.
Click on your cluster, and at the bottom you’ll see an external link for your application.
When you click that link, you should see your sample application running in the browser.


Written by

Nitheesh Poojary

My professional IT career began nine years ago, when I was just out of college. I worked with a great team as an infrastructure management engineer, managing hundreds of enterprise application servers. I found my passion when I got the opportunity to work with cloud technologies: I'm addicted to AWS Cloud Services, DevOps engineering, and all the cloud tools and technologies that make engineers' lives easier. Currently, I am working as a Solution Architect at SixNines IT. We are an experienced team of engineers who have helped hundreds of customers move to the cloud responsibly. I have earned five AWS certifications and happily help fellow engineers across the globe through my blogs and by answering questions in various forums.
