Docker Based Builds - Install and Configure

Overview
Difficulty: Beginner
Duration: 1h 12m
Students: 64
Rating: 5/5

Description

Introduction

This training course introduces you to Jenkins, a popular open source tool used to perform Continuous Integration and Continuous Delivery.

We spend time early on reviewing the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations that ensure you become familiar with Jenkins and how to administer it. We'll demonstrate features such as:

  • Installing and setting up Jenkins
  • Creating and configuring pipelines manually
  • Creating and configuring pipelines using a Jenkinsfile
  • Configuring Jenkins pipelines using the Blue Ocean interface
  • Defining build execution environments using Docker containers
  • Triggering build pipelines manually and automatically
  • Navigating downstream and upstream build projects
  • Connecting to version control repositories such as GitHub
  • Setting up build pipelines for Java based projects using Gradle
  • Recording artifacts and test results
  • Setting up and scaling out Jenkins with multiple build agents and executors using SSH

Learning Objectives

What you'll learn:

  • The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain the building, testing, and deployment of your own enterprise software projects
  • How to install, set up, and configure Jenkins pipelines
  • The key differences between Jenkins declarative and scripted pipelines
  • How to manage build artifacts and test results
  • How to scale out Jenkins with master and build agent setups connected over SSH
  • The benefits of codifying pipeline build instructions using a Jenkinsfile
  • How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility (see the sketch after this list)
  • How to install and use the newer, pipeline-centric Blue Ocean user interface
  • How to integrate and leverage third-party build tools like Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline
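
As a rough illustration of the last few objectives, the following is a minimal declarative Jenkinsfile sketch that runs its build steps inside a Docker container. The image name and yarn commands are placeholders rather than the course's actual project settings, and it assumes the Docker Pipeline plugin is installed on the Jenkins controller.

    // Illustrative Jenkinsfile (declarative syntax); image and commands are placeholders.
    pipeline {
        agent {
            docker { image 'node:lts' }   // all steps below run inside this container
        }
        stages {
            stage('Build') {
                steps {
                    sh 'node --version'               // confirm we are inside the container
                    sh 'yarn install && yarn build'   // placeholder build commands
                }
            }
        }
    }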

Demonstration

This training course provides many hands-on demonstrations where you will observe first-hand how to use Jenkins to build and release different types of software projects, for example:

  • Building a front-end application which has been developed using the React JavaScript framework, with technologies such as Webpack and Yarn
  • Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring

Prerequisites

  • A basic understanding of CICD, or Continuous Integration and Continuous Delivery
  • A basic understanding of software development and the software development life cycle
  • A basic understanding of version control and associated workflows

Intended Audience

  • Software Build and Release Engineers
  • Software Developers
  • DevOps Practitioners

Transcript

- [Instructor] Okay, welcome back! In this next demonstration we're going to set up Docker on our build agent. 

In doing so, this will allow us to accomplish two goals. The first is to use Docker as an extra method of isolation for our build jobs. Jenkins has a really neat feature where we can instruct a build job to occur within a Docker container. The second goal is that by installing Docker, it gives us the ability to use it as a packaging format. So we can perform a build job that takes our artifacts and then compiles them into a Docker image. And then from there we can take our Docker image and use it within our production infrastructure by spawning containers from that image. So we now SSH into our build agent. If you've been following along you'll know that we originally set this up using Ubuntu 18.04. Now if we attempt to run the docker command it will fail because it hasn't been previously installed, but conveniently it will echo out the instructions to actually install it. So we can simply run sudo apt-get install docker.io.
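
The commands below sketch the installation steps just described; the host and user names are hypothetical, but the package name, docker.io, is the one used in the demo.

    # SSH onto the build agent (hypothetical host/user names)
    ssh ubuntu@build-agent-one

    docker                           # fails: the docker command is not yet installed
    sudo apt-get install docker.io   # install Docker from the Ubuntu 18.04 repositories
    docker --version                 # confirm the binary is now on the PATH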

Okay, that's completed successfully. And we can see that the docker command is now there. The next thing we'll do is attempt to interrogate the Docker service by running docker ps. Now this should fail: by default, after the installation, only the root user can perform this command. So if we switch to root we should be able to run docker ps. And we can. We can also run docker info. Okay. We'll cat out the /etc/group file, and here we can see that a docker group has been automatically created for us during the install. So what we wanna do is modify that particular group and add the Jenkins user to it. And the reason we wanna do that is because at build time our builds are going to operate under the Jenkins user. We'll exit out and this time we'll sudo into the Jenkins user. So if I run whoami, I'm jenkins, and we'll test to see that we can again query the Docker service, and we can. So that's a great result. And again we can do docker info.
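
Sketched below are shell commands that mirror this step. The transcript doesn't show the exact command used to modify the group, so usermod here is an assumption of one common way to add the Jenkins user to the docker group.

    grep docker /etc/group            # the install created a 'docker' group
    sudo usermod -aG docker jenkins   # assumption: add the jenkins user to that group

    sudo su - jenkins                 # switch to the jenkins user
    whoami                            # -> jenkins
    docker ps                         # now permitted to talk to the Docker daemon
    docker info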

Okay, so that completes that side of the installation. If we go back to Jenkins itself, what we can do now is create a new build job, we'll call it Build Job 10, it'll be a Pipeline, click OK. We jump down to the Pipeline section. So all we're going to do is quickly run this handcrafted script which forces the pipeline to execute on our Agent One where we've just installed Docker, and we'll interrogate the user, which we know will be the Jenkins user, and then we'll just confirm that we can indeed connect to the Docker service running on Build Agent One. Click Apply, click Save. And we'll manually trigger the build job. Okay, that's up and running. Okay, as you can see, the current build has failed, we weren't able to connect to the Docker service as the Jenkins user on Build Agent One.
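
The exact pipeline script isn't shown here, but a rough reconstruction in scripted pipeline syntax might look like the following; the node label 'agent-one' is a guess at how Build Agent One is labelled.

    // Hypothetical script for Build Job 10: pin the job to Build Agent One and
    // confirm the Jenkins user can reach the local Docker service.
    node('agent-one') {
        stage('Docker check') {
            sh 'whoami'        // should report the jenkins user
            sh 'docker info'   // proves we can reach the Docker daemon
            sh 'docker ps'     // lists running containers (none expected yet)
        }
    }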

Now the reason for this is that we need to relaunch the build agent itself, because we need to relaunch the shell that the Jenkins user is operating within for it to pick up the modified group permissions it now has. So let's go over to Jenkins, we'll click on Manage Jenkins, we'll scroll down to Manage Nodes, we hover over Agent One, we select Disconnect, then Yes. So we've now disconnected. We'll relaunch the agent. Okay, the agent is now back online. And this time if we go back to our build job, Build Job 10, we trigger another manual build and this time it's worked. If we go into the Console Output we can see that when we ran docker info it connected to the Docker service on the build agent and returned all this information. Additionally we were able to run docker ps, and as expected, there are no currently running containers.

Okay, let's quickly summarize what we just performed. We jumped over into our terminal, we SSH'd onto Build Agent One, and we installed the Docker service. We then ensured that the Jenkins user, which is used at build time, had rights to query the local Docker service. And then finally, we set up a build job which queried docker info and inspected any running containers. We performed these Docker commands just to confirm that Docker on Build Agent One is correctly configured.

About the Author

Students: 12,091
Labs: 28
Courses: 65
Learning paths: 14

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.