Docker is a relatively new open platform for building, shipping, and running distributed applications. Initially, it was mainly used to create development environments, allowing applications to be easily tested in controlled, reproducible environments. More recently, as people have gotten a better feel for what it can do, it is also being used for continuous integration, Platform as a Service (PaaS), and production deployments.
In this blog post I will discuss the ingredients needed for effective continuous integration and deployment using Jenkins and Docker. In a later post, we’ll talk about the process itself.
CI is an organizational practice that aims to improve software quality and development speed by running regular, automated unit tests against new code. Using a version control system, many development teams regularly merge new code from a project’s branches back into the main branch, allowing tested code to be quickly integrated into the project and verified as deployable. A popular unit testing framework for Java code is JUnit.
Continuous deployment is an automated process that ensures your application is always ready to deploy to production or development environments. By using both continuous integration and continuous deployment, development teams can always be ready to quickly deploy reliable builds and patches.
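To make these two practices concrete, a Jenkins “Execute shell” build step that combines them might look something like the following sketch. The project name, test command, and image tag are illustrative assumptions, not part of any specific setup, and the commands assume a build agent with Maven and a Docker daemon available.

```shell
#!/bin/sh
# Hypothetical Jenkins "Execute shell" build step; project name,
# test command, and image tag are assumptions for illustration.
set -e                                   # abort the build on the first failure

mvn test                                 # continuous integration: run the JUnit suite
docker build -t myapp:"$BUILD_NUMBER" .  # continuous deployment: bake tested code into an image
docker save myapp:"$BUILD_NUMBER" | gzip > "myapp-$BUILD_NUMBER.tar.gz"
                                         # archive the image as a build artifact (e.g., for S3)
```

If any test fails, `set -e` stops the step and Jenkins marks the build as failed, so only tested code ever becomes a deployable image.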
- Jenkins is an open source continuous integration server.
- Docker containers allow developers and system administrators to quickly and easily port applications with all of their dependencies and get them running across systems and machines.
- Dockerfiles are scripts of instructions that Docker runs to build and configure new container images, which are then used to launch containers.
- Amazon EC2 Instances are used to host multiple Docker containers.
- Amazon S3 buckets are used to store build artifacts.
- The Git Source Code Management plugin for Jenkins enables the use of Git as a build SCM tool.
- Post-build steps in Jenkins send files to, or execute commands on, a remote server over SSH.
- The Publish Artifacts to S3 plugin copies build artifacts to a specified S3 bucket.
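To make the Dockerfile ingredient above concrete, here is a minimal, hypothetical Dockerfile for a simple Java service. The base image, installed package, and jar path are assumptions chosen for illustration:

```dockerfile
# Hypothetical Dockerfile: builds an image for a simple Java service.
FROM ubuntu:14.04                      # base image (assumption)
RUN apt-get update && \
    apt-get install -y openjdk-7-jre   # install a Java runtime
COPY target/app.jar /opt/app.jar       # copy the build artifact into the image
CMD ["java", "-jar", "/opt/app.jar"]   # command run when the container starts
```

A typical workflow would then be `docker build -t myapp .` to build the image, followed by `docker run -d myapp` to launch a container from it.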
How does Docker work?
- Docker was designed around Linux containers (LXC). LXC is an operating-system-level virtualization tool for running multiple isolated server installs (containers) on a single control host. The main difference between KVM virtualization and Linux containers is that virtual machines each require a separate kernel instance to run on, while containers share the host operating system’s kernel. A container is similar to a chroot but offers much more isolation. In practice, Docker works much like a virtual machine, wrapping everything (file system, process management, environment variables, etc.) into a container, but without the overhead of a guest kernel. Docker really does let you “Build once, configure once, and run anywhere.”
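A quick way to observe the shared kernel for yourself, assuming a Linux host with a running Docker daemon (the `ubuntu` image name is just an example):

```shell
# Both commands should print the same kernel release, because the
# container runs on the host's kernel rather than booting its own.
uname -r                         # kernel version on the host
docker run --rm ubuntu uname -r  # kernel version seen inside a container
```

A KVM virtual machine, by contrast, would report whatever kernel its own guest OS booted.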
Docker Architecture Components
- File system: A container can only access its own sandboxed file system.
- User namespace: A container has its own user database, which means a container’s root account is not the same as the host’s root account.
- Process namespace: Processes within a container cannot see or access processes on the host machine or in other containers.
- Network namespace: A container gets its own virtual network device and IP address.
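The process and network namespaces are easy to observe in a hypothetical session like the one below (assumes a running Docker daemon; the `ubuntu` image is an example):

```shell
# Process namespace: ps inside a fresh container sees only the
# container's own processes (here, just ps itself), not the host's.
docker run --rm ubuntu ps aux

# Hostname/network isolation: each container gets its own hostname
# (by default, its container ID) and its own IP address.
docker run --rm ubuntu hostname
```

Running the same `ps aux` on the host would list every process on the machine, which is exactly the isolation the process namespace provides.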
Common use cases of Docker
- Automating the packaging and deployment of applications.
- Creation of lightweight, private PaaS environments.
- Automated testing and continuous integration/deployment.
- Deploying and scaling web apps, databases and backend services.
- Sharing your container images through the Docker Index.
Continuous integration: conclusion
So far, we have covered the terminology and players involved in continuous integration using Docker and Jenkins. In a future post, we will walk through the practical workflow and processes and show how to use them with Docker.