DevOps in AWS: CloudFormation, CodeCommit, CodePipeline, and CodeDeploy

We are entering a new era of technology that is causing a cultural shift in the way software projects are built. The waterfall model paved the way for agile development a few years back. Today, a need exists for better collaboration between developers, IT operations, and IT infrastructure engineers. This need brought about the new culture that we call DevOps.

DevOps is fundamentally about improved collaboration, integration, automation, and reliable, consistent IT operations. It is an efficient journey from development to production, and it requires overlapping and intersecting responsibilities between developers and IT operations.

DevOps means:

  • Infrastructure as code
  • Continuous Deployment
  • Continuous Integration
  • Version Control Integration
  • Test Automation
  • Monitoring

The goal of this post is to introduce you to CloudFormation, CodeCommit, CodePipeline, and CodeDeploy, each of which fits into a different part of the DevOps pipeline. I also mention other important tools and technologies such as OpsWorks, CloudWatch, and Elastic Beanstalk, because without them the discussion would be incomplete.

Infrastructure as code with CloudFormation:

Treating infrastructure as code offers many benefits. It means creating, deploying, and maintaining infrastructure in a programmatic, descriptive, and declarative way. Keeping the infrastructure definition versioned, in a defined format with consistent syntax and semantics, helps IT operations engineers in the long run. When infrastructure is maintained and deployed according to this practice, the expected result is a consistent and reliable environment.

Maintaining infrastructure as code is nothing new for engineers who have worked with Opscode Chef, Puppet, Ansible, or Salt, among others. A good example of this is a cookbook in Chef. This is what we like to call infrastructure as code. See below:

# Install the nginx package
package 'nginx' do
  action :install
end

# Enable nginx at boot and start the service
service 'nginx' do
  action [ :enable, :start ]
end

# Deploy the homepage shipped with the cookbook
cookbook_file "/usr/share/nginx/www/index.html" do
  source "index.html"
  mode "0644"
end

Notice how easily this recipe installs an nginx server, starts the service, and serves the homepage, all in a defined code format. This is a cookbook written in the Ruby language and maintained as if it were regular code: you can store it in version control software like Git or SVN and improve it over time.
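
As a quick sketch of that workflow (the directory name and commit message below are only illustrative), putting the cookbook under Git looks like this:

cd nginx-cookbook          # directory containing the recipe above (name is illustrative)
git init                   # start tracking the cookbook as code
git add .
git commit -m "Initial nginx cookbook"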

Those who are not new to AWS will know that, from very early in cloud computing, AWS has provided infrastructure as code in the form of CloudFormation. Using a CloudFormation template (in JSON format), you can define and model AWS resources. Templates follow a defined syntax and are maintained just the way the DevOps principle demands. Take a look:

{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Description" : "AWS CloudFormation Sample Template Sample template EIP_With_Association: This template shows how to associate an Elastic IP address with an Amazon EC2 instance",
  "Parameters" : {
    "InstanceType" : {
      "Description" : "WebServer EC2 instance type",
      "Type" : "String",
      "Default" : " m2.xlarge",
      "AllowedValues" : ["m2.xlarge", "m2.2xlarge", "m3.xlarge", "m3.2xlarge"]
,
      "ConstraintDescription" : "must be a valid EC2 instance type."
    },
    "KeyName" : {
      "Description" : "Name of an existing EC2 KeyPair to enable SSH access to the instances",
      "Type" : "AWS::EC2::KeyPair::KeyName",
      "ConstraintDescription" : "must be the name of an existing EC2 KeyPair."
    },
    "SSHLocation" : {
      "Description" : "The IP address range that can be used to SSH to the EC2 instances",
      "Type": "String",
      "MinLength": "9",
      "MaxLength": "18",
      "Default": "0.0.0.0/0",
      "AllowedPattern": "(\\d{1,3})\\.(\\d{1,3})\\.(\\d{1,3})\\.(\\d{1,3})/(\\d{1,2})",
      "ConstraintDescription": "must be a valid IP CIDR range of the form x.x.x.x/x."
    }
  },
  "Mappings" : {
    "AWSInstanceType2Arch" : {
      "m2.xlarge"   : { "Arch" : "PV64"   },
      "m2.2xlarge"  : { "Arch" : "PV64"   },
      "m3.large"    : { "Arch" : "HVM64"  },
      "m3.2xlarge"  : { "Arch" : "HVM64"  }
    },
    "AWSInstanceType2NATArch" : {
      "m2.xlarge"   : { "Arch" : "NATPV64"   },
      "m2.2xlarge"  : { "Arch" : "NATPV64"   },
      "m3.xlarge"   : { "Arch" : "NATHVM64"  },
      "m3.2xlarge"  : { "Arch" : "NATHVM64"  }
    },
    "AWSRegionArch2AMI" : {
      "us-east-1"        : {"PV64" : "ami-5fb8c835", "HVM64" : "ami-60b6c60a", "HVMG2" : "ami-e998ea83"},
      "us-west-1"        : {"PV64" : "ami-56ea8636", "HVM64" : "ami-d5ea86b5", "HVMG2" : "ami-943956f4"}
    }
  },
  "Resources" : {
    "EC2Instance" : {
      "Type" : "AWS::EC2::Instance",
      "Properties" : {
        "UserData" : { "Fn::Base64" : { "Fn::Join" : [ "", [ "IPAddress=", {"Ref" : "IPAddress"}]]}},
        "InstanceType" : { "Ref" : "InstanceType" },
        "SecurityGroups" : [ { "Ref" : "InstanceSecurityGroup" } ],
        "KeyName" : { "Ref" : "KeyName" },
        "ImageId" : { "Fn::FindInMap" : [ "AWSRegionArch2AMI", { "Ref" : "AWS::Region" },
                          { "Fn::FindInMap" : [ "AWSInstanceType2Arch", { "Ref" : "InstanceType" }, "Arch" ] } ] }
      }
    },
    "InstanceSecurityGroup" : {
      "Type" : "AWS::EC2::SecurityGroup",
      "Properties" : {
        "GroupDescription" : "Enable SSH access",
        "SecurityGroupIngress" :
          [ { "IpProtocol" : "tcp", "FromPort" : "22", "ToPort" : "22", "CidrIp" : { "Ref" : "SSHLocation"} }]
      }
    },
    "IPAddress" : {
      "Type" : "AWS::EC2::EIP"
    },
    "IPAssoc" : {
      "Type" : "AWS::EC2::EIPAssociation",
      "Properties" : {
        "InstanceId" : { "Ref" : "EC2Instance" },
        "EIP" : { "Ref" : "IPAddress" }
      }
    }
  },
  "Outputs" : {
    "InstanceId" : {
      "Description" : "InstanceId of the newly created EC2 instance",
      "Value" : { "Ref" : "EC2Instance" }
    },
    "InstanceIPAddress" : {
      "Description" : "IP address of the newly created EC2 instance",
      "Value" : { "Ref" : "IPAddress" }
    }
  }
}

Here, all the details required to launch an EC2 instance are coded and maintained in one place. At first glance, it may not appear very useful, but upon closer inspection the template captures every detail: the size of the instance, the architecture we are looking for, the security group, the region we want to deploy to, and so on. It is a very basic template, but when you build out a complete environment and want to maintain consistency and reliability, this kind of template will help you immensely later on. No wonder a PaaS platform like PCF on AWS can be deployed entirely from a CloudFormation template.
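
Deploying the template is then a single CLI call. The following is a minimal sketch, assuming the template above is saved as eip-with-association.json and that a key pair named my-keypair already exists; the stack, file, and key names are placeholders:

# Check the template for syntax errors before launching anything
aws cloudformation validate-template --template-body file://eip-with-association.json

# Create the stack, supplying values for the template parameters
aws cloudformation create-stack \
    --stack-name demo-eip-stack \
    --template-body file://eip-with-association.json \
    --parameters ParameterKey=KeyName,ParameterValue=my-keypair \
                 ParameterKey=InstanceType,ParameterValue=m3.xlarge

# Inspect the stack outputs (InstanceId, InstanceIPAddress) once creation completes
aws cloudformation describe-stacks --stack-name demo-eip-stack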

You can also check the Cloud Academy blogs on:
1. Cloud Formation Deployment Automation
2. Cloud Formation Deployment Tool
3. Writing your first cloud formation template
4. Understanding nesting cloud formation stacks

for a better and more detailed understanding of the topic.

Version Control Integration with CodeCommit:

AWS CodeCommit is AWS's Git-as-a-service offering. We have an introductory blog post on CodeCommit which will be a useful starting point. It is a cloud-based, fully managed source control system that is highly available, secure, and able to store anything. It offers virtually limitless storage with no repository size limit, access is secured with AWS IAM, and data is replicated across Availability Zones for durability.
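
As a minimal sketch (the repository name and region below are assumptions, and Git credentials for CodeCommit are assumed to be configured already), creating a repository and pushing to it looks like any other Git workflow:

# Create a new CodeCommit repository
aws codecommit create-repository --repository-name my-demo-repo \
    --repository-description "Demo repository for the pipeline"

# Clone it over HTTPS, then commit and push as usual
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-demo-repo
cd my-demo-repo
git add .
git commit -m "Initial commit"
git push origin master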

Continuous Deployment with CodeDeploy:

Continuous Deployment is another DevOps principle: production-ready code is deployed automatically from the version control repository. AWS CodeDeploy helps you deploy code to AWS EC2 instances with minimal downtime, with centralized deployment control and monitoring. Using CodeDeploy is an easy, few-step process:

  1. Create an AppSpec file and package your application. The AppSpec file describes the series of steps that CodeDeploy will use to deploy (see the sketch after this list).
  2. Set up your deployment environment. In this step, you define your deployment group (for example, an Auto Scaling group) and install the CodeDeploy agent on the EC2 instances.
  3. Deploy.
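
For reference, a minimal appspec.yml might look like the sketch below; the destination path and the script names under hooks are illustrative assumptions, while the overall structure (version, os, files, hooks) is what CodeDeploy expects for EC2 deployments:

version: 0.0
os: linux
files:
  - source: /                       # copy the whole revision bundle...
    destination: /var/www/myapp     # ...to this path (path is illustrative)
hooks:
  BeforeInstall:
    - location: scripts/install_dependencies.sh   # script names are assumptions
      timeout: 300
      runas: root
  ApplicationStart:
    - location: scripts/start_server.sh
      timeout: 300
      runas: root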

A sample CLI command is as follows:

$ aws deploy create-deployment --application-name test-app \
            --deployment-group-name test-dg \
            --s3-location bucket=test-bucket,bundleType=zip,key=myapp.zip

You can read our blog posts on CodeDeploy here.

Continuous Delivery and automation with CodePipeline:

AWS CodePipeline is Amazon's continuous delivery and release automation service. It automates the release process: building the code, deploying to pre-production environments, testing your application, and releasing it to production. Every time there is a code change, AWS CodePipeline builds, tests, and deploys the application according to the defined workflow.

AWS CodePipeline workflow (Image courtesy: Amazon)

CodePipeline is flexible: it allows you to integrate partner tools and your own custom tools into any stage of the release process. Partner tools include GitHub for source control; Solano Labs, Jenkins, and CloudBees for build, continuous delivery, and integration; Apica, BlazeMeter, and Ghost Inspector for testing; and XebiaLabs for deployment.
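
As a sketch of how this looks from the CLI (the pipeline name and the JSON definition file are assumptions; the stage definitions themselves would live in that file), a pipeline can be created, inspected, and triggered like this:

# Create the pipeline from a JSON definition of its source, build, and deploy stages
aws codepipeline create-pipeline --cli-input-json file://pipeline.json

# Inspect the current state of each stage and action
aws codepipeline get-pipeline-state --name my-demo-pipeline

# Trigger a run manually (normally a change in the source repository triggers it)
aws codepipeline start-pipeline-execution --name my-demo-pipeline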

Conclusion

AWS has always put its best effort into providing its users with the best tools and technologies, and into making them simple and effortless to use without a steep learning curve. With every new announcement, we see that AWS makes the cloud journey smoother, more agile, and more efficient by embracing DevOps technologies and principles across its platform.

Written by

Chandan Patra

Cloud Computing and Big Data professional with 10 years of experience in pre-sales, architecture, design, build and troubleshooting with best engineering practices. Specialities: Cloud Computing - AWS, DevOps(Chef), Hadoop Ecosystem, Storm & Kafka, ELK Stack, NoSQL, Java, Spring, Hibernate, Web Service
