How do you bind together tools like Jenkins, JUnit, Maven, GitHub, and S3 – along with Docker and Dockerfiles – to make the continuous integration process actually work?
In an earlier post on continuous integration using Docker, I wrote about the ingredients you’ll need for effective Docker/Jenkins deployments. Now I’ll discuss how the continuous integration process actually works.
Continuous integration: the deployment process
The process starts when a developer checks out code from GitHub and adds a new feature. After a successful build on their local dev system, they run feature and performance tests using JUnit.
Once the JUnit tests pass, the developer pushes the new code back up to GitHub. A post-receive hook (configured on GitHub) notifies the Jenkins server. Each check-in is then verified by an automated build, allowing the team to detect problems early. By integrating regularly, you can detect errors quickly and locate them more easily.
Using the Jenkins GitHub Plugin, you can automatically trigger build jobs when pushes are made to GitHub.
The Jenkins server uses Maven to build its Java projects. Using Maven only requires that you create a pom.xml file and place your source code in the default directory; Maven automatically takes care of the rest.
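For illustration, here is a minimal pom.xml for a Java web app packaged as a WAR. The group and artifact coordinates below are placeholders, not values from the original project:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <!-- Placeholder coordinates: replace with your project's own -->
  <groupId>com.example</groupId>
  <artifactId>demo-webapp</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>war</packaging>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
</project>
```

With sources under the default src/main/java directory, running `mvn package` builds the WAR into the target/ directory with no further configuration.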
Jenkins’ S3 plugin copies the build artifacts to an S3 bucket. The Jenkins Remote SSH plugin then logs in to our EC2 instance (where the Docker daemon is installed) and executes a shell script.
This shell script kicks off a Dockerfile build – Docker can build server images automatically by reading the instructions in a Dockerfile (a Dockerfile is a text document that contains all the commands you would normally execute manually in order to build a server image).
The Dockerfile updates the OS, installs Java and Tomcat, pulls the code from S3, deploys it to our webapps directory, and, finally, restarts Tomcat.
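A sketch of such a Dockerfile, assuming an Ubuntu base image and the application WAR already fetched into the build context by the calling shell script (the base image, package names, and paths are illustrative assumptions, not taken from the original post):

```dockerfile
# Illustrative base image; the original post does not name one
FROM ubuntu:16.04

# Update the OS and install Java and Tomcat
RUN apt-get update && \
    apt-get install -y default-jdk tomcat8 && \
    rm -rf /var/lib/apt/lists/*

# Deploy the application WAR (pulled from S3 by the calling
# shell script) into Tomcat's webapps directory
COPY app.war /var/lib/tomcat8/webapps/

ENV CATALINA_HOME=/usr/share/tomcat8 \
    CATALINA_BASE=/var/lib/tomcat8

EXPOSE 8080

# Run Tomcat in the foreground so the container keeps running
CMD ["/usr/share/tomcat8/bin/catalina.sh", "run"]
```

Building and running the image then amounts to `docker build -t myapp .` followed by `docker run -d -p 8080:8080 myapp` (image and container names here are hypothetical).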
Once QA approves the build, the DevOps team uses a Chef cookbook to deploy the code onto the production EC2 servers.
Why introduce Docker into the continuous integration workflow?
- In developer environments, we want to remain as close as possible to production. We also want the development environment to be as fast as possible for interactive use.
- Running multiple applications on the developer machine can slow it down. With Docker, you can easily launch a fresh server for every build.
- Maintain one dedicated Docker container for each application.
- Before VMs, bringing up a new hardware resource took days. Virtualization brought this down to minutes. Docker, by creating just a container for the process rather than booting up a full OS, brings it down to seconds.
- Ops benefits: only one standardized Dev environment to support.
- Management benefits: faster time to market; happier engineers.
- Developer benefits: detect problems early. Regular integration lets you detect and locate errors quickly.