
Amazon EC2 Container Service and Elastic Beanstalk: Docker on AWS

Technology keeps moving. In just a few years, we’ve gone from servers running on dedicated hardware, through virtualization, to cloud computing. And now we’ve reached the container age. As we will see, AWS has made containers a major element of its deployment family with the Amazon EC2 Container Service (ECS). We’ll soon discuss three ways to run Docker containers on AWS, but first a few words about why you would want to.

Traditional servers living in data centers could do one thing at a time. There was often one server for Active Directory, another for Exchange, and a third running a database. But it was common for any or all of these servers to have excess processing capacity going to waste. Virtualization was developed to make use of that extra capacity. Through virtualization, a single physical server could host multiple virtual servers, allowing for far greater scalability and more efficient resource utilization.

But even virtualization required significant up-front hardware (and software license) investments and couldn’t completely remove the risks of excess capacity. That’s where cloud computing – with its multi-tenancy models, on-demand pricing, and dynamic scalability – proved so powerful.

But modern DevOps practices demand the ability to quickly build servers and ship code to be run in different environments. Welcome to the world of containers: extremely lightweight, abstracted user-space instances that can be easily launched on any compatible server and reliably provide a predictable experience. Containers – of which the best-known example is Docker – are able to achieve all this by capitalizing on the ability to share the Linux kernel of any host.

Docker

The DevOps development process depends, in large part, on portability. Even a slight difference between your development, test, and production environments may completely break your application. Traditional development models follow a change management process to solve these kinds of problems. But this process won’t fit in today’s rapid build and deploy cycles. Docker streamlines and automates the software development process. You can simply package an application with references to all its dependencies into standardized containers (defined by plain-text Dockerfiles) that include everything needed to run your application wherever you end up shipping it.
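As a concrete illustration, a Dockerfile for the kind of packaged application described above might look like this minimal sketch (the base image, package, and paths are illustrative assumptions, not taken from the article):

```dockerfile
# Start from a minimal official base image
FROM ubuntu:16.04

# Install the application's dependencies
RUN apt-get update && apt-get install -y apache2

# Copy the application code into the image (illustrative path)
COPY ./site /var/www/html

# Expose the web port and run Apache in the foreground
EXPOSE 80
CMD ["/usr/sbin/apache2ctl", "-D", "FOREGROUND"]
```

Anyone with Docker installed can build and run this image and get the same environment, which is exactly the portability DevOps workflows depend on.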

Docker uses a client-server architecture: the Docker client talks to a host daemon, which both builds and runs the containers.

In this post, we won’t talk all that much about running Docker natively. Instead, we will concentrate on AWS deployments. We’ll explore three ways to do that:

  1. Deploying Docker containers directly from an EC2 instance.
  2. Using Docker containers on Elastic Beanstalk.
  3. Docker cluster management using the AWS EC2 Container Service.

Deploying Docker containers on an EC2 instance

Docker can run anywhere: on a racked server, an old laptop, and perhaps, if you worked at it hard enough, even on a smartphone. So, since it requires only a Linux kernel to run, there’s definitely no reason why it shouldn’t work on a Linux-powered Amazon EC2 instance. Let’s see how we can install and run Docker on Amazon Linux.

    1. Launch an Amazon Linux instance.
    2. SSH into your EC2 instance and install the Docker engine.
      sudo yum install -y docker
    3. Start the Docker service.
      sudo service docker start
    4. Sign up for a Docker Hub account.
    5. Search for the Ubuntu image on Docker Hub.
      sudo docker search ubuntu
    6. Download the Ubuntu image.
      sudo docker pull ubuntu
    7. Start the container. The -t and -i flags allocate a pseudo-TTY and keep stdin open.
      sudo docker run -t -i ubuntu /bin/bash
    8. At the container’s Bash prompt (not the host shell), install the Apache web server.
      apt-get update && apt-get install -y apache2
    9. Save the container. From a separate host shell, commit the changes using the container ID and an image name.
      docker commit a2d424f5655ea nitheesh86/apache
    10. Push the image to Docker Hub.
      docker push nitheesh86/apache

At this point, sharing the container with others is easy. Just run this command on any other server, anywhere on the Internet:

docker run -d -p 80:80 nitheesh86/apache /usr/sbin/apache2ctl -D FOREGROUND

Using Docker containers on Elastic Beanstalk

Elastic Beanstalk is the AWS PaaS offering. All you have to do is upload your application code, and Elastic Beanstalk takes care of deployment, load balancing, and capacity provisioning. With a single click, you can start all the necessary application servers. There is no charge for the Elastic Beanstalk service itself, just for the AWS resources you actually use.


Since AWS integrated Docker into Elastic Beanstalk, you can build and test your application on your local workstation and then deploy it directly to Elastic Beanstalk. As always, you install all the required software and dependencies through the Dockerfile. This file specifies the base image to be used (Ubuntu, CentOS, etc.) and the volumes to be mapped. Any new EC2 instance that Elastic Beanstalk launches to run your application will be configured according to the commands in your Dockerfile.

Let’s deploy a sample application in a Docker container.

  1. Create a Dockerrun.aws.json file to deploy an existing Docker image. This file describes how to deploy a Docker container as an Elastic Beanstalk application. Your Dockerrun.aws.json file should look something like this:

    Let me explain this file, line by line.

    • AWSEBDockerrunVersion – Specifies the version number as the value “1” for Single Container Docker environments.
    • Image – Specifies the Docker base image on an existing Docker repository from which you’re building your Docker container.
    • Ports – Lists the ports to expose on the Docker container.
    • Volumes – Maps volumes from an EC2 instance to your Docker container.
    • Logging – Maps the log directory inside the container.
  2. Create a zip file containing your Dockerfile or Dockerrun.aws.json along with any application files, then upload it to Elastic Beanstalk.
  3. From there you can follow this guide to deploy your application.
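Putting the fields described above together, a minimal single-container Dockerrun.aws.json might look like this sketch (the host and container directories are illustrative assumptions; the image name reuses the one built earlier):

```json
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "nitheesh86/apache",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": "80"
    }
  ],
  "Volumes": [
    {
      "HostDirectory": "/var/app",
      "ContainerDirectory": "/var/www/html"
    }
  ],
  "Logging": "/var/log/apache2"
}
```

With "Update" set to "true", Elastic Beanstalk pulls the latest version of the image from the repository each time it deploys.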

Docker cluster management using the EC2 Container Service (ECS)

When your containers grow from one to many, maintenance can become a significant burden. Managing containers at scale requires proper cluster management software. Since “Amazon” and “scale” are like two sides of the same coin, it wasn’t much of a surprise when they created the AWS EC2 Container Service to handle the installation, operation, scaling, and general cluster management for you.

Container instances communicate with the ECS service through the ECS agent. When you launch instances from an Amazon ECS-optimized AMI, the preinstalled agent registers them with your cluster.
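On instances launched from the ECS-optimized AMI, the agent reads its settings from /etc/ecs/ecs.config. Pointing the agent at a particular cluster takes one line (the cluster name here is an illustrative assumption):

```
# /etc/ecs/ecs.config
ECS_CLUSTER=my-sample-cluster
```

If this variable is left unset, the agent registers the instance with the default cluster.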

EC2 Container Service Benefits

  • Simplified Cluster Management
    ECS creates and manages clusters of Docker Containers. It can scale from one to thousands of containers, spread across multiple Availability Zones.
  • Easy Scheduling
    ECS supports its own scheduler to spread containers out across your cluster, to balance availability and utilization.
  • Portability
    ECS uses the standard Docker daemon so that you can easily move your on-premises application to the AWS cloud and vice versa.
  • Resource Utilization
    A containerized application can make very efficient use of resources. You can choose to run multiple containers on the same EC2 instance.
  • Integrated with AWS services
    You can access most AWS features, such as Elastic IP addresses, VPC, ELB, and EBS.

Defining Amazon EC2 Container Service Building Blocks

Diagram Showing Amazon EC2 Container Service Building Blocks 
  • Cluster
    A group of EC2 instances managed by ECS that lives inside your VPC. One cluster can contain multiple instance types and sizes and can be stretched across multiple Availability Zones.
  • Scheduler
    An integral part of each cluster. The scheduler is responsible for assigning containers to instances.
  • Container
    The lightweight, isolated runtime units (not full virtual machines) that execute your application. You can run any number of containers in a single cluster.
  • Task Definition
    Blueprints for your application that define the way your containers will work.
  • ECS AMI
    An Amazon Machine Image that includes the ECS Agent.
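To make the “blueprint” idea concrete, a minimal task definition expressed as JSON might look like this sketch (the family name, container name, and resource values are illustrative assumptions; the image reuses the one built earlier):

```json
{
  "family": "sample-webapp",
  "containerDefinitions": [
    {
      "name": "apache",
      "image": "nitheesh86/apache",
      "cpu": 128,
      "memory": 256,
      "essential": true,
      "portMappings": [
        {
          "containerPort": 80,
          "hostPort": 80
        }
      ]
    }
  ]
}
```

The scheduler uses the cpu and memory values to decide which instance in the cluster has room to run each copy of the task.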

Launch a simple EC2 Container Service container

  1. Create a cluster.
  2. Create a task definition from the sample. To get used to the process the first time through, I would suggest you stick with the sample task definition that’s provided. You have the option of modifying the parameters (CPU resources or port mappings) in the task definition.

EC2 Container Service - Defining Parameters
EC2 Container Service Naming Containers

You can schedule a task to run once (which is ideal for batch jobs) or create a service to launch and maintain a specified number of copies of the task definition in your cluster. Since we want our sample application to run continuously, let’s choose “Create a Service.”
EC2 Container Service Scheduling Tasks
Here’s the final step to launch our containers. The more instances you have in your cluster, the more tasks you can run. Click Review and Launch.
EC2 Container Service Configuring Clusters
Click on your cluster and at the bottom, you’ll see an external link for your application.
EC2 Container Service Cluster External Link
When you click on that link you should be able to see your sample application running in the browser.
EC2 Container Service Sample App Successfully Created

Written by

Nitheesh Poojary

My professional IT career began nine years back when I was just out of my college. I worked with a great team as an infrastructure management engineer, managing hundreds of enterprise application servers. I found my passion when I got the opportunity to work with Cloud technologies: I'm addicted to AWS Cloud Services, DevOps engineering, and all the cloud tools and technologies that make engineers' lives easier. Currently, I am working as a Solution Architect in SixNines IT. We are an experienced team of engineers that have helped hundreds of customers move to the cloud responsibly. I have achieved 5 AWS certifications, happily helping fellow engineers across the globe through my blogs and answering questions in various forums.
