
Jenkins Docker Builds

Overview
Difficulty: Beginner
Duration: 1h 12m
Students: 52
Rating: 5/5

Description

Introduction

This training course introduces you to Jenkins, a popular open source tool used to perform Continuous Integration and Continuous Delivery.

We spend time early on reviewing the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations that ensure you become familiar with Jenkins and how to administer it. We’ll demonstrate features such as:

  • Installing and setting up Jenkins
  • Creating and configuring pipelines manually
  • Creating and configuring pipelines using a Jenkinsfile
  • Configuring Jenkins pipelines using the Blue Ocean interface
  • Defining build execution environments using Docker containers
  • Triggering build pipelines, manually and automatically
  • Navigating downstream and upstream build projects
  • Connecting to version control repositories such as GitHub
  • Setting up build pipelines for Java-based projects using Gradle
  • Recording artifacts and test results
  • Setting up and scaling out Jenkins with multiple build agents and executors using SSH

Learning Objectives

What you'll learn:

  • The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain building, testing, and deploying your own enterprise software projects
  • How to install, set up, and configure Jenkins pipelines
  • The key differences between Jenkins declarative and scripted pipelines
  • How to manage build artifacts and test results
  • How to scale out Jenkins with master and build agent setups connected over SSH
  • The benefits of codifying pipeline build instructions using a Jenkinsfile
  • How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
  • How to install and use the newer, pipeline-centric Blue Ocean user interface
  • How to integrate and leverage third-party build tools like Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline

Demonstration

This training course provides many hands-on demonstrations where you will observe firsthand how to use Jenkins to build and release different types of software projects, for example:

  • Building a front-end application developed using the React JavaScript framework, using technologies such as Webpack and Yarn
  • Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring

Prerequisites

  • A basic understanding of CI/CD (Continuous Integration and Continuous Delivery)
  • A basic understanding of software development and the software development life cycle
  • A basic understanding of version control and associated workflows

Intended Audience

  • Software Build and Release Engineers
  • Software Developers
  • DevOps Practitioners

Transcript

- [Instructor] Welcome back! In this lecture, we'll review how Docker has been integrated into the Jenkins build system, such that it can be configured within a pipeline to provide an extra layer of build isolation and/or a convenient method of providing specialized and customized build environments. Okay, let's begin.

As of version 2.5 of the Jenkins Pipeline plugin, you can script your pipelines to use Docker as the underlying build mechanism. There's a lot of extra flexibility in the way that Docker can be used to perform builds. For example, consider the following options. A single Docker container can be configured at the pipeline level and used for the entire pipeline build execution. An individual Docker container can be configured per stage, so that the full pipeline build execution is segmented over multiple Docker containers. Or a Dockerfile can be hosted in the project root directory, with the pipeline configuration instructed to perform a Docker build to create a customized Docker image, complete with an updated build toolchain, which is then immediately used to perform the actual pipeline build execution.

Let's now take a closer look at each of these and how to set up and use Docker at build time. For starters, we need a Docker daemon or service available on one or more of the build agents. Installing Docker on a build agent is fairly simple, although the exact steps depend on the underlying operating system and its particular requirements. For example, the quickest way to install Docker on a recent Ubuntu-based build agent is to perform the following commands in sequence. The last command requires you to restart the shell that the Jenkins user is logged into so that it picks up the new permissions associated with being added to the docker group. Once done, the Jenkins user should have access to the Docker daemon. We add the Jenkins user here because that is typically the user that executes the actual Jenkins build commands.
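The commands themselves appear only on screen, but on a recent Ubuntu release they would look something like the following sketch, which uses Ubuntu's bundled docker.io package (installing from Docker's own apt repository is an equally valid route):

    # install Docker from the Ubuntu package repositories
    sudo apt-get update
    sudo apt-get install -y docker.io

    # ensure the Docker service is running and starts on boot
    sudo systemctl enable --now docker

    # allow the jenkins user to talk to the Docker daemon
    sudo usermod -aG docker jenkins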

To confirm that the Jenkins user on the build agent now has access to the Docker daemon, run either or both of the following commands as the Jenkins user. If they succeed, then all is well.

Now that the build agent or agents have a Docker daemon installed, we can begin to utilize them by updating our Jenkins pipeline scripts to perform a build using Docker at build time. A couple of quick examples follow. Say, for example, you were building a Microsoft .NET Core application. You could use the following declarative pipeline, which leverages the official Microsoft .NET Core SDK Docker image hosted on Docker Hub. The key point in this example is that the Docker image is configured at the root of the pipeline script, meaning that the entire build execution takes place within the one container.

Alternatively, you might have a multi-stage pipeline that is used to build and test, say, a Java-based application. The first stage in this example uses a Maven-based Docker image to compile the project, and a second stage uses a JDK-based Docker image to run JUnit tests. This example highlights the fact that dedicated Docker containers are spun up for the individual pipeline stages.
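The verification commands are only shown on screen; either of the following standard Docker CLI commands, run as the jenkins user, will confirm daemon access (both fail with a permission error if the group change hasn't taken effect):

    docker info
    docker ps

The .NET Core pipeline is likewise only shown on screen. As a minimal sketch of the pattern being described, with the image tag and build steps as illustrative assumptions rather than the course's exact script:

    // Jenkinsfile (declarative) -- the whole build runs in one container
    pipeline {
        agent {
            docker { image 'mcr.microsoft.com/dotnet/sdk:8.0' }
        }
        stages {
            stage('Build') {
                steps {
                    sh 'dotnet build'
                }
            }
            stage('Test') {
                steps {
                    sh 'dotnet test'
                }
            }
        }
    }

And a comparable sketch of the multi-stage Java example, again with illustrative image tags; it assumes the project ships the Maven wrapper (mvnw), so the plain JDK image can still drive the JUnit test run:

    // Jenkinsfile (declarative) -- one container per stage
    pipeline {
        agent none
        stages {
            stage('Compile') {
                // a Maven-based image compiles the project
                agent { docker { image 'maven:3-eclipse-temurin-17' } }
                steps {
                    sh 'mvn -B clean compile'
                }
            }
            stage('Test') {
                // a JDK-based image runs the JUnit tests via the Maven wrapper
                agent { docker { image 'eclipse-temurin:17-jdk' } }
                steps {
                    sh './mvnw -B test'
                    junit 'target/surefire-reports/*.xml'
                }
            }
        }
    }

(The Dockerfile-hosted option mentioned earlier corresponds to the declarative agent { dockerfile true } directive, which builds the image from the project's own Dockerfile before running the stages inside it.)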

The final thing we'll briefly cover in this lecture, since we're currently discussing Docker, is that Docker can not only be used to execute whole or parts of a build pipeline, but can also serve as an end format for hosting the build artifacts themselves. That is to say, we can configure a Jenkins pipeline to create a Docker image that hosts our build artifacts and then push the resulting image into a public or private Docker registry, for example Docker Hub. In this scenario, the Docker image itself becomes the artifact.

Let's walk through a quick example. This particular scripted pipeline can be broken down as such. One, there are four main stages: clone, build, Docker image build, and Docker image push. Two, the clone stage simply performs a git clone of the code repository hosted on GitHub. Three, the build stage uses Gradle to drive the compilation process and produce a web archive (WAR) file. Four, the Docker image build stage takes the web archive file and embeds it in a Tomcat servlet container, which is then packaged into a Docker image. Five, the resulting Docker image is pushed up to the public Docker Hub registry, where it can later be pulled down and launched as a Docker container locally. The end result here is a browsable web application.
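The script itself appears only on screen, but a minimal sketch of that four-stage pattern, using the Docker Pipeline plugin's docker.build and docker.withRegistry steps, might look like the following. The repository URL, image name, and credentials ID are hypothetical placeholders, not the course's actual values:

    // Jenkinsfile (scripted) -- image-as-artifact pattern
    node {
        def image

        stage('Clone') {
            // fetch the project source from GitHub (placeholder URL)
            git url: 'https://github.com/example-org/example-webapp.git'
        }
        stage('Build') {
            // Gradle compiles the code and packages it as a WAR
            sh './gradlew clean war'
        }
        stage('Docker image build') {
            // the project's Dockerfile copies the WAR into a Tomcat base image
            image = docker.build('example-org/example-webapp:latest')
        }
        stage('Docker image push') {
            // authenticate against Docker Hub with stored Jenkins credentials
            docker.withRegistry('https://registry.hub.docker.com', 'dockerhub-credentials') {
                image.push()
            }
        }
    }

From there, docker run -p 8080:8080 example-org/example-webapp:latest would pull the image if necessary and launch it locally, yielding the browsable web application described here.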

Okay, that completes this lecture on how to utilize Docker to execute pipelines and/or to create Docker images hosting build artifacts. 

Go ahead and close this lecture and we'll see you shortly in the next one.

About the Author

Students: 11,038
Labs: 28
Courses: 65
Learning paths: 15

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.