DevOps in AWS: CloudFormation, CodeCommit, CodePipeline, and CodeDeploy

We are entering a new era of technology, one that is causing a cultural shift in the way software projects are built. The waterfall model gave way to agile development a few years back. Today, there is a need for better collaboration between developers, IT operations, and IT infrastructure engineers. This need brought about the new culture that we call DevOps.

DevOps is fundamentally about improved collaboration, integration, and automation, and about reliable, consistent IT operations. DevOps is an efficient journey from development to production, and it requires the overlapping and intersecting responsibilities of developers and IT operations.

DevOps means:

  • Infrastructure as code
  • Continuous Deployment
  • Continuous Integration
  • Version Control Integration
  • Test Automation
  • Monitoring

The goal of this post is to introduce you to CloudFormation, CodeCommit, CodePipeline, and CodeDeploy, each of which fits into a different stage of the DevOps pipeline. I also mention other important tools and technologies such as OpsWorks, CloudWatch, and Elastic Beanstalk, because without them the discussion would be incomplete.

Infrastructure as code with CloudFormation:

Treating infrastructure as code offers many benefits. It means creating, deploying, and maintaining infrastructure in a programmatic, descriptive, and declarative way. Keeping the infrastructure under version control, expressing it in a defined format, and maintaining its syntax and semantics pays off for an IT operations engineer in the long run. When infrastructure code is maintained and deployed according to these practices, the expected result is a consistent and reliable environment.

Maintaining infrastructure as code is nothing new for engineers who have worked with Opscode Chef, Puppet, Ansible, or Salt, among others. A good example is a cookbook in Chef. This is what we like to call infrastructure as code. See below:

package 'nginx' do
  action :install                      # install the nginx package
end

service 'nginx' do
  action [ :enable, :start ]           # enable at boot and start the service
end

cookbook_file "/usr/share/nginx/www/index.html" do
  source "index.html"                  # shipped in the cookbook's files/ directory
  mode "0644"
end

Notice how easily this recipe installs an nginx server, starts the service, and serves the homepage, all in a defined code format. It is written in Chef's Ruby-based DSL and maintained as if it were regular application code. You can use version control software like Git or SVN to store it and make improvements over time.

Those who are not new to AWS know that, from the early days of cloud computing, AWS has provided infrastructure as code in the form of CloudFormation. Using a CloudFormation template (written in JSON) you can define and model AWS resources. Templates follow a defined syntax and are maintained exactly the way the DevOps principles demand. Take a look:

{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Description" : "AWS CloudFormation Sample Template Sample template EIP_With_Association: This template shows how to associate an Elastic IP address with an Amazon EC2 instance",
  "Parameters" : {
    "InstanceType" : {
      "Description" : "WebServer EC2 instance type",
      "Type" : "String",
      "Default" : " m2.xlarge",
      "AllowedValues" : ["m2.xlarge", "m2.2xlarge", "m3.xlarge", "m3.2xlarge"]
,
      "ConstraintDescription" : "must be a valid EC2 instance type."
    },
    "KeyName" : {
      "Description" : "Name of an existing EC2 KeyPair to enable SSH access to the instances",
      "Type" : "AWS::EC2::KeyPair::KeyName",
      "ConstraintDescription" : "must be the name of an existing EC2 KeyPair."
    },
    "SSHLocation" : {
      "Description" : "The IP address range that can be used to SSH to the EC2 instances",
      "Type": "String",
      "MinLength": "9",
      "MaxLength": "18",
      "Default": "0.0.0.0/0",
      "AllowedPattern": "(\\d{1,3})\\.(\\d{1,3})\\.(\\d{1,3})\\.(\\d{1,3})/(\\d{1,2})",
      "ConstraintDescription": "must be a valid IP CIDR range of the form x.x.x.x/x."
    }
  },
  "Mappings" : {
    "AWSInstanceType2Arch" : {
      "m2.xlarge"   : { "Arch" : "PV64"   },
      "m2.2xlarge"  : { "Arch" : "PV64"   },
      "m3.large"    : { "Arch" : "HVM64"  },
      "m3.2xlarge"  : { "Arch" : "HVM64"  }
    },
    "AWSInstanceType2NATArch" : {
      "m2.xlarge"   : { "Arch" : "NATPV64"   },
      "m2.2xlarge"  : { "Arch" : "NATPV64"   },
      "m3.xlarge"   : { "Arch" : "NATHVM64"  },
      "m3.2xlarge"  : { "Arch" : "NATHVM64"  }
    },
    "AWSRegionArch2AMI" : {
      "us-east-1"        : {"PV64" : "ami-5fb8c835", "HVM64" : "ami-60b6c60a", "HVMG2" : "ami-e998ea83"},
      "us-west-1"        : {"PV64" : "ami-56ea8636", "HVM64" : "ami-d5ea86b5", "HVMG2" : "ami-943956f4"}
    }
  },
  "Resources" : {
    "EC2Instance" : {
      "Type" : "AWS::EC2::Instance",
      "Properties" : {
        "UserData" : { "Fn::Base64" : { "Fn::Join" : [ "", [ "IPAddress=", {"Ref" : "IPAddress"}]]}},
        "InstanceType" : { "Ref" : "InstanceType" },
        "SecurityGroups" : [ { "Ref" : "InstanceSecurityGroup" } ],
        "KeyName" : { "Ref" : "KeyName" },
        "ImageId" : { "Fn::FindInMap" : [ "AWSRegionArch2AMI", { "Ref" : "AWS::Region" },
                          { "Fn::FindInMap" : [ "AWSInstanceType2Arch", { "Ref" : "InstanceType" }, "Arch" ] } ] }
      }
    },
    "InstanceSecurityGroup" : {
      "Type" : "AWS::EC2::SecurityGroup",
      "Properties" : {
        "GroupDescription" : "Enable SSH access",
        "SecurityGroupIngress" :
          [ { "IpProtocol" : "tcp", "FromPort" : "22", "ToPort" : "22", "CidrIp" : { "Ref" : "SSHLocation"} }]
      }
    },
    "IPAddress" : {
      "Type" : "AWS::EC2::EIP"
    },
    "IPAssoc" : {
      "Type" : "AWS::EC2::EIPAssociation",
      "Properties" : {
        "InstanceId" : { "Ref" : "EC2Instance" },
        "EIP" : { "Ref" : "IPAddress" }
      }
    }
  },
  "Outputs" : {
    "InstanceId" : {
      "Description" : "InstanceId of the newly created EC2 instance",
      "Value" : { "Ref" : "EC2Instance" }
    },
    "InstanceIPAddress" : {
      "Description" : "IP address of the newly created EC2 instance",
      "Value" : { "Ref" : "IPAddress" }
    }
  }
}

Here, all of the details required to launch an EC2 instance are coded and maintained. At first glance it may not appear useful, but upon closer inspection the template captures every detail: the size of the instance, the architecture we are looking for, the security group, the region we want to deploy to, and so on. It is a very basic template, but imagine building out a complete environment where you need to maintain consistency and reliability; this kind of template will help you immensely later on. No wonder a PaaS platform like Pivotal Cloud Foundry (PCF) on AWS can be deployed entirely from CloudFormation templates.
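
To see how such a template turns into a running environment, here is a minimal sketch of deploying it with the AWS CLI. The file name (eip-sample.json), stack name, and key pair name (my-keypair) are placeholders; substitute your own values.

# Validate the template before launching anything
aws cloudformation validate-template --template-body file://eip-sample.json

# Create the stack; parameter values here are examples only
aws cloudformation create-stack \
    --stack-name eip-sample-stack \
    --template-body file://eip-sample.json \
    --parameters ParameterKey=KeyName,ParameterValue=my-keypair \
                 ParameterKey=InstanceType,ParameterValue=m3.xlarge

# Follow the stack events until the status reaches CREATE_COMPLETE
aws cloudformation describe-stack-events --stack-name eip-sample-stack

Because every resource belongs to the stack, deleting the stack (aws cloudformation delete-stack) removes everything the template created, which is exactly the repeatability that DevOps asks for.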

For a better and more detailed understanding of the topic, you can also check the Cloud Academy blogs on:
1. CloudFormation Deployment Automation
2. CloudFormation Deployment Tool
3. Writing your first CloudFormation template
4. Understanding nested CloudFormation stacks

Version Control Integration with CodeCommit:

AWS CodeCommit is Git-as-a-Service on AWS. We have an introductory blog post on CodeCommit which is a useful starting point. It is a fully managed, highly available, cloud-based source control system that can store anything and keeps your data secure. It offers virtually limitless storage with no repository size limit, access is secured with AWS IAM, and data is replicated across Availability Zones for durability.
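
As a quick illustration of the workflow, here is a minimal sketch, assuming the AWS CLI and Git credentials for your IAM user are already configured; the repository name (my-demo-app) and the us-east-1 region are placeholders.

# Create a new CodeCommit repository (name is an example)
aws codecommit create-repository \
    --repository-name my-demo-app \
    --repository-description "Demo application repository"

# Clone it over HTTPS; from here it behaves like any other Git remote
git clone https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-demo-app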

Continuous Deployment with CodeDeploy:

Continuous Deployment is another DevOps principle, in which production-ready code is automatically deployed from a version-controlled repository. AWS CodeDeploy helps you deploy code to AWS EC2 instances with minimal downtime and with centralized deployment control and monitoring. Using CodeDeploy is a simple, three-step process:

  1. Create an AppSpec file and package your application. The AppSpec file describes the series of steps that CodeDeploy will use to deploy the application (see the sample appspec.yml after this list).
  2. Set up your deployment environment. In this step you define your deployment group, for example an Auto Scaling group, and install the CodeDeploy agent on the EC2 instances.
  3. Deploy.
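
The following is a minimal sketch of what such an appspec.yml might look like for an EC2/on-premises deployment; the file paths and hook scripts are hypothetical and would be replaced with your own.

version: 0.0
os: linux
files:
  # Copy the packaged homepage into the web server's document root
  - source: /index.html
    destination: /usr/share/nginx/www
hooks:
  BeforeInstall:
    # Hypothetical script that installs and configures nginx
    - location: scripts/install_nginx.sh
      timeout: 300
      runas: root
  ApplicationStart:
    # Hypothetical script that starts (or restarts) nginx
    - location: scripts/start_nginx.sh
      timeout: 60
      runas: root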

A sample CLI command is as follows:

$ aws deploy create-deployment --application-name test-app \
      --deployment-group-name test-dg \
      --s3-location bucket=test-bucket,key=myapp.zip,bundleType=zip
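
The command returns a deployment ID, which you can use to track progress. A minimal sketch, assuming the AWS CLI is configured (the deployment ID below is a placeholder):

# Check the status of a deployment (replace the ID with the one returned above)
aws deploy get-deployment \
    --deployment-id d-EXAMPLE1234 \
    --query "deploymentInfo.status"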

You can read our blog posts on CodeDeploy here.

Continuous Delivery and automation with CodePipeline:

AWS CodePipeline is Amazon’s Continuous Delivery and release automation service. AWS CodePipeline automates the release process: building the code, deploying it to pre-production environments, testing the application, and releasing it to production. Every time there is a code change, AWS CodePipeline builds, tests, and deploys the application according to the defined workflow.
[Image: AWS CodePipeline release workflow (Image Courtesy: Amazon)]

CodePipeline is flexible: it allows you to integrate partner tools and your own custom tools into any stage of the release process. Partner tools include GitHub for source control; Solano Labs, Jenkins, and CloudBees for build, continuous delivery, and integration; Apica, BlazeMeter, and Ghost Inspector for testing; and XebiaLabs for deployment.
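
To make the workflow concrete, here is a minimal sketch of a two-stage pipeline definition that takes source from the CodeCommit repository above and hands it to the CodeDeploy application from the previous section. All names, the account ID, the IAM role, and the artifact bucket are placeholders.

{
  "pipeline": {
    "name": "demo-pipeline",
    "roleArn": "arn:aws:iam::111111111111:role/CodePipelineServiceRole",
    "artifactStore": { "type": "S3", "location": "codepipeline-artifact-bucket" },
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "CheckoutFromCodeCommit",
          "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "CodeCommit", "version": "1" },
          "configuration": { "RepositoryName": "my-demo-app", "BranchName": "master" },
          "outputArtifacts": [{ "name": "SourceOutput" }]
        }]
      },
      {
        "name": "Deploy",
        "actions": [{
          "name": "DeployWithCodeDeploy",
          "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "CodeDeploy", "version": "1" },
          "configuration": { "ApplicationName": "test-app", "DeploymentGroupName": "test-dg" },
          "inputArtifacts": [{ "name": "SourceOutput" }]
        }]
      }
    ]
  }
}

Saved as pipeline.json, this definition could be created with: aws codepipeline create-pipeline --cli-input-json file://pipeline.json. A Build or Test stage can be added between Source and Deploy in exactly the same way.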

Conclusion

AWS has always put great effort into providing its users with the best tools and technologies, and into making them simple to use without a steep learning curve. With every new announcement, we see AWS making the cloud journey smoother, more agile, and more efficient by embracing DevOps technologies and principles in its platform.

Written by

Chandan Patra

Cloud Computing and Big Data professional with 10 years of experience in pre-sales, architecture, design, build, and troubleshooting with best engineering practices. Specialities: Cloud Computing - AWS, DevOps (Chef), Hadoop Ecosystem, Storm & Kafka, ELK Stack, NoSQL, Java, Spring, Hibernate, Web Services.
