DevOps in AWS: CloudFormation, CodeCommit, CodePipeline, and CodeDeploy

We are entering a new era of technology that is causing a cultural shift in the way software projects are built. The waterfall model paved the way for agile development a few years back. Today there is a need for better collaboration between IT operations, developers, and IT infrastructure engineers. This need brought about the new culture that we call DevOps.

DevOps is fundamentally about improved collaboration, integration, and automation, and about reliable, consistent IT operations. DevOps is an efficient journey from development to production, and it requires the overlapping and intersecting responsibilities of developers and IT operations.

DevOps means:

  • Infrastructure as code
  • Continuous Deployment
  • Continuous Integration
  • Version Control Integration
  • Test Automation
  • Monitoring

The goal of this post is to introduce you to CloudFormation, CodeCommit, CodePipeline, and CodeDeploy, each of which fits into a different part of the DevOps pipeline. I also mention other important tools and technologies such as OpsWorks, CloudWatch, and Elastic Beanstalk, because without them the discussion would be incomplete.

Infrastructure as Code with CloudFormation:

Treating infrastructure as code offers many benefits. It means creating, deploying, and maintaining infrastructure in a programmatic, descriptive, and declarative way. Versioning the infrastructure, keeping it in a defined format, and maintaining consistent syntax and semantics pay off in the long run for an IT operations engineer. When infrastructure code is maintained and deployed according to these practices, the expected result is a consistent and reliable environment.

Maintaining infrastructure as code is nothing new for engineers who have worked with Opscode Chef, Puppet, Ansible, or Salt, among others. A good example of this is a cookbook in Chef. This is what we like to call infrastructure as code. See below:

# Install the nginx package.
package 'nginx' do
  action :install
end

# Enable nginx at boot and start the service.
service 'nginx' do
  action [:enable, :start]
end

# Copy the homepage from the cookbook's files/ directory onto the node.
cookbook_file '/usr/share/nginx/www/index.html' do
  source 'index.html'
  mode '0644'
end

Take a look at how easily an nginx server is installed, started, and set up to display the homepage, all in a defined code format. This is a Chef recipe written in Ruby and maintained as if it were regular code. You can use version control software like Git or SVN to store it and track improvements.
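If you want to try the recipe quickly, Chef ships with chef-apply, which runs a single recipe file without a full Chef run. A minimal sketch, assuming the recipe above is saved as nginx.rb (the file name is just an example):

# Apply the single recipe file; requires the Chef client installed on the node.
sudo chef-apply nginx.rb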

Those who are not new to AWS know that, from very early in cloud computing, AWS has provided infrastructure as code in the form of CloudFormation. Using a CloudFormation template (in JSON format), you can define and model AWS resources. Templates follow a defined syntax and are maintained just the way the DevOps principle demands. Take a look:

{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Description" : "AWS CloudFormation Sample Template Sample template EIP_With_Association: This template shows how to associate an Elastic IP address with an Amazon EC2 instance",
  "Parameters" : {
    "InstanceType" : {
      "Description" : "WebServer EC2 instance type",
      "Type" : "String",
      "Default" : " m2.xlarge",
      "AllowedValues" : ["m2.xlarge", "m2.2xlarge", "m3.xlarge", "m3.2xlarge"]
,
      "ConstraintDescription" : "must be a valid EC2 instance type."
    },
    "KeyName" : {
      "Description" : "Name of an existing EC2 KeyPair to enable SSH access to the instances",
      "Type" : "AWS::EC2::KeyPair::KeyName",
      "ConstraintDescription" : "must be the name of an existing EC2 KeyPair."
    },
    "SSHLocation" : {
      "Description" : "The IP address range that can be used to SSH to the EC2 instances",
      "Type": "String",
      "MinLength": "9",
      "MaxLength": "18",
      "Default": "0.0.0.0/0",
      "AllowedPattern": "(\\d{1,3})\\.(\\d{1,3})\\.(\\d{1,3})\\.(\\d{1,3})/(\\d{1,2})",
      "ConstraintDescription": "must be a valid IP CIDR range of the form x.x.x.x/x."
    }
  },
  "Mappings" : {
    "AWSInstanceType2Arch" : {
      "m2.xlarge"   : { "Arch" : "PV64"   },
      "m2.2xlarge"  : { "Arch" : "PV64"   },
      "m3.large"    : { "Arch" : "HVM64"  },
      "m3.2xlarge"  : { "Arch" : "HVM64"  }
    },
    "AWSInstanceType2NATArch" : {
      "m2.xlarge"   : { "Arch" : "NATPV64"   },
      "m2.2xlarge"  : { "Arch" : "NATPV64"   },
      "m3.xlarge"   : { "Arch" : "NATHVM64"  },
      "m3.2xlarge"  : { "Arch" : "NATHVM64"  }
    },
    "AWSRegionArch2AMI" : {
      "us-east-1"        : {"PV64" : "ami-5fb8c835", "HVM64" : "ami-60b6c60a", "HVMG2" : "ami-e998ea83"},
      "us-west-1"        : {"PV64" : "ami-56ea8636", "HVM64" : "ami-d5ea86b5", "HVMG2" : "ami-943956f4"}
    }
  },
  "Resources" : {
    "EC2Instance" : {
      "Type" : "AWS::EC2::Instance",
      "Properties" : {
        "UserData" : { "Fn::Base64" : { "Fn::Join" : [ "", [ "IPAddress=", {"Ref" : "IPAddress"}]]}},
        "InstanceType" : { "Ref" : "InstanceType" },
        "SecurityGroups" : [ { "Ref" : "InstanceSecurityGroup" } ],
        "KeyName" : { "Ref" : "KeyName" },
        "ImageId" : { "Fn::FindInMap" : [ "AWSRegionArch2AMI", { "Ref" : "AWS::Region" },
                          { "Fn::FindInMap" : [ "AWSInstanceType2Arch", { "Ref" : "InstanceType" }, "Arch" ] } ] }
      }
    },
    "InstanceSecurityGroup" : {
      "Type" : "AWS::EC2::SecurityGroup",
      "Properties" : {
        "GroupDescription" : "Enable SSH access",
        "SecurityGroupIngress" :
          [ { "IpProtocol" : "tcp", "FromPort" : "22", "ToPort" : "22", "CidrIp" : { "Ref" : "SSHLocation"} }]
      }
    },
    "IPAddress" : {
      "Type" : "AWS::EC2::EIP"
    },
    "IPAssoc" : {
      "Type" : "AWS::EC2::EIPAssociation",
      "Properties" : {
        "InstanceId" : { "Ref" : "EC2Instance" },
        "EIP" : { "Ref" : "IPAddress" }
      }
    }
  },
  "Outputs" : {
    "InstanceId" : {
      "Description" : "InstanceId of the newly created EC2 instance",
      "Value" : { "Ref" : "EC2Instance" }
    },
    "InstanceIPAddress" : {
      "Description" : "IP address of the newly created EC2 instance",
      "Value" : { "Ref" : "IPAddress" }
    }
  }
}

Here, all the details required to launch an EC2 instance are coded and maintained. At first glance it may not appear useful, but upon closer inspection the code captures every detail: the size of the instance, the architecture we are looking for, the security group, the region we want to deploy to, and so on. It is a very basic template, but when you create a complete environment and want to maintain consistency and reliability, this kind of template helps immensely at a later point in time. No wonder a PaaS platform like Pivotal Cloud Foundry (PCF) on AWS can be deployed entirely from a CloudFormation template.
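Once the template is saved, you can turn it into a running stack with the AWS CLI. The following is a minimal sketch; the stack name, the file name eip-sample.json, and the key pair name my-key are placeholders:

# Create a stack from the template; parameter values are illustrative.
aws cloudformation create-stack \
    --stack-name eip-sample-stack \
    --template-body file://eip-sample.json \
    --parameters ParameterKey=KeyName,ParameterValue=my-key \
                 ParameterKey=InstanceType,ParameterValue=m3.xlarge

# Wait for creation to finish, then read the stack outputs (InstanceId, InstanceIPAddress).
aws cloudformation wait stack-create-complete --stack-name eip-sample-stack
aws cloudformation describe-stacks --stack-name eip-sample-stack --query 'Stacks[0].Outputs'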

For a better and more detailed understanding of the topic, you can also check the Cloud Academy blog posts on:
1. CloudFormation Deployment Automation
2. CloudFormation Deployment Tool
3. Writing your first CloudFormation template
4. Understanding nested CloudFormation stacks

Version Control Integration with CodeCommit:

AWS CodeCommit is Git-as-a-Service on AWS. We have an introductory blog post on CodeCommit that is a useful starting point. It is a fully managed, highly available, cloud-based source control system that can store anything securely. It offers virtually unlimited storage with no repository size limit, is secured with AWS IAM, and replicates data across Availability Zones for durability.
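As a quick illustration (the repository name my-app and the region us-east-1 are placeholders), creating and cloning a CodeCommit repository looks like any other Git workflow once the AWS CLI credential helper is configured:

# Create a repository in CodeCommit.
aws codecommit create-repository --repository-name my-app \
    --repository-description "Sample application repository"

# Let Git obtain credentials from the AWS CLI, then clone over HTTPS.
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-app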

Continuous Deployment with CodeDeploy:

Continuous Deployment is another DevOps principle, where production-ready code is automatically deployed from the version control repository. AWS CodeDeploy helps you deploy code to AWS EC2 instances with minimal downtime and with centralized deployment control and monitoring. Using CodeDeploy is a simple, few-step process:

  1. Create an AppSpec file and package your application. The AppSpec file describes the series of steps that CodeDeploy will use to deploy (a minimal sketch appears after this list).
  2. Set up your deployment environment. In this step you define your deployment group, such as an Auto Scaling group, and install the CodeDeploy agent on the EC2 instances.
  3. Deploy.
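For reference, here is a minimal sketch of an appspec.yml for an EC2/on-premises deployment; the destination path and the restart_server.sh hook script are purely illustrative:

version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html
hooks:
  AfterInstall:
    # Hypothetical script bundled with the application revision.
    - location: scripts/restart_server.sh
      timeout: 60
      runas: root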

A sample CLI command is as follows:

$ aws deploy create-deployment --application-name test-app \
            --deployment-group-name test-dg \
            --s3-location bucket=test-bucket,key=myapp.zip,bundleType=zip

You can read our blog posts on CodeDeploy here.

Continuous Delivery and automation with CodePipeline:

AWS CodePipeline is Amazon's continuous delivery and release automation service. It automates the release process: building the code, deploying to pre-production environments, testing your application, and releasing it to production. Every time there is a code change, AWS CodePipeline builds, tests, and deploys the application according to the defined workflow.
AWS CodePipeline workflow (image courtesy: Amazon)

CodePipeline is flexible, allowing you to integrate partner tools and your own custom tools into any stage of the release process. Partner tools include GitHub for source control; Solano Labs, Jenkins, and CloudBees for build, continuous delivery, and integration; Apica, BlazeMeter, and Ghost Inspector for testing; and XebiaLabs for deployment.
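Pipelines can also be inspected and driven from the AWS CLI. A small sketch follows; the pipeline name my-pipeline is a placeholder:

# Show the state of each stage and action in the pipeline.
aws codepipeline get-pipeline-state --name my-pipeline

# Manually trigger a new run of the pipeline.
aws codepipeline start-pipeline-execution --name my-pipeline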

Conclusion

AWS has always put its best effort into providing users with the best tools and technologies, and into keeping them simple, approachable, and easy to learn. With every new announcement, we see AWS making the cloud journey smoother, more agile, and more efficient by embracing DevOps tools and principles across its platform.

Written by

Chandan Patra

Cloud Computing and Big Data professional with 10 years of experience in pre-sales, architecture, design, build and troubleshooting with best engineering practices. Specialities: Cloud Computing - AWS, DevOps(Chef), Hadoop Ecosystem, Storm & Kafka, ELK Stack, NoSQL, Java, Spring, Hibernate, Web Service