This course introduces you to Jenkins, a popular open-source tool used to perform Continuous Integration and Continuous Delivery.
We spend time early on reviewing the key Jenkins features and associated terminology. We then take a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations that familiarise you with Jenkins and how to administer it. We'll demonstrate features such as:
- Creating and configuring pipelines using a Jenkinsfile
- Configuring Jenkins pipelines using the Blue Ocean interface
- Defining build execution environments using Docker containers
- Setting up and scaling out Jenkins with multiple build agents and executors using SSH
- Setting up build pipelines for Java-based projects using Gradle
What you'll learn:
- How to scale out Jenkins with Master and Build Agent setups connected over SSH
- The benefits of codifying pipeline build instructions using a Jenkinsfile
- How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
- How to install and use the newer, pipeline-centric Blue Ocean user interface
- How to integrate and leverage third-party build tools such as Gradle, Maven, Yarn, and Webpack within a Jenkins pipeline
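To give a flavour of what codifying build instructions looks like, here is a minimal declarative Jenkinsfile sketch. The agent label `agent1` and the Gradle tasks are hypothetical, used purely for illustration:

```groovy
// Minimal declarative Jenkinsfile sketch.
// "agent1" is a hypothetical build agent label used for illustration.
pipeline {
    agent { label 'agent1' }
    stages {
        stage('Build') {
            steps {
                // Compile the project with the Gradle wrapper
                sh './gradlew build'
            }
        }
        stage('Test') {
            steps {
                // Run the project's test suite
                sh './gradlew test'
            }
        }
    }
}
```

Because this file is versioned alongside the source code, pipeline changes can be reviewed and audited like any other code change.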
This training course provides many hands-on demonstrations where you will observe first-hand how to use Jenkins to build and release different types of software projects, for example:
- Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a Web Archive (WAR) file, and then release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring
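A pipeline like the one just described might be sketched roughly as follows. The stage names, Gradle task, and image name are illustrative assumptions, not the course's actual project files:

```groovy
// Hypothetical sketch of the build-and-release flow described above.
// The Gradle task, image name, and Dockerfile are illustrative only.
pipeline {
    agent any
    stages {
        stage('Compile and Package') {
            steps {
                // Compile the Java source and package it as a WAR
                sh './gradlew war'
            }
        }
        stage('Build Docker Image') {
            steps {
                // Bake the WAR into a Tomcat-based image
                sh 'docker build -t myorg/webapp:latest .'
            }
        }
    }
}
```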
Prerequisites:
- A basic understanding of CI/CD (Continuous Integration and Continuous Delivery)
- A basic understanding of software development and the software development life cycle
- A basic understanding of version control and associated workflows
Intended audience:
- Software Build and Release Engineers
- Software Developers
- DevOps Practitioners
The following supporting Jenkins documentation is available online:
Transcript
- [Instructor] Okay, welcome back! In the previous demonstration we installed Docker on our build agent. This gives us the capability of running our Jenkins build jobs inside a Docker container, a powerful feature that provides extra isolation for our build jobs.
If a particular build job depends on extra tools that need to be installed at build time, we can perform that installation inside a Docker container, so the server itself isn't impacted once the job completes. The integrity of the server remains as it was before the job started, because the tools are installed only in the Docker container created to support the build job. In this demonstration, I'm going to quickly show you how to use Docker to run your own build jobs. Let's jump over into Visual Studio Code and take a quick look at the pipeline script that we'll shortly paste into our new build job. Here we can see that the script is composed of two stages. The first stage is going to perform a Maven build.
The second stage is going to run some Java commands. For this, it's going to leverage the openjdk Docker image. Now, each stage still runs on the same build agent. Within the first stage, we're specifying that we want to use a Docker container and that we want it to run on the agent labeled agent1. Likewise, with our JDK stage, we're indicating that we want it to run as a Docker container based on the openjdk image on the build agent labeled agent1. Before we set this up within Jenkins, we want to update to the latest Docker images for both our Maven stage and our JDK stage. We'll jump over into the browser and go to Docker Hub. We'll search first for Maven, filter on Official Images, and here we can see the Maven official image. We'll click on it. What we'll do is take the 3.6.0 alpine version. We'll copy this tag, jump back to our script, and update it.
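Putting the pieces described so far together, the pipeline script looks roughly like this sketch. The stage names are illustrative, while the image tags, agent label, and echoed messages follow the demonstration:

```groovy
// Two-stage declarative pipeline where each stage runs in its own
// Docker container on the build agent labeled "agent1".
pipeline {
    agent none
    stages {
        stage('Maven') {
            agent {
                docker {
                    image 'maven:3.6.0-alpine'
                    label 'agent1'
                }
            }
            steps {
                echo 'Hello, Maven'
                // Confirm the Maven version inside the container
                sh 'mvn --version'
            }
        }
        stage('JDK') {
            agent {
                docker {
                    image 'openjdk:11.0.1-jdk'
                    label 'agent1'
                }
            }
            steps {
                echo 'Hello, JDK'
                // Confirm the JDK version inside the container
                sh 'java -version'
            }
        }
    }
}
```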
Next, we need to do the same thing for the openjdk image. This time, we'll search for openjdk, Official Images only, and here it is. We'll look for the latest tag, which is 11.0.1-jdk. We'll take this and update our script again. We'll copy it all, and now we'll return to Jenkins. We'll create a new build job, BuildJob11, select Pipeline project, and click OK. We'll jump straight to the Pipeline section and paste our pipeline script here. We'll click Apply, then Save, and simply execute the build job. It's now scheduled and executing. We'll navigate into it.
Okay, the full build job has completed. If we take a look back, Jenkins realized it didn't have this particular Maven Docker image locally, so it went ahead and performed a docker pull on that image. Once that completed, it fired up a Docker container based on that image and echoed out our "Hello, Maven" command. When we queried the Maven version, it returned Apache Maven 3.6.0, as expected. Following on from that, it attempted to run the next stage, which was based on the openjdk Docker image. Again, that wasn't found locally, so the build agent went ahead and pulled it down. After the download completed, it fired up a container based on that image, printed out our "Hello, JDK" statement, and then queried the version of the JDK that's been installed; as expected, it returned openjdk version 11.0.1. Let's quickly summarize what we just completed. Looking again at the script that we ran, it is multi-staged, where each stage is based on its own Docker container running on the same build agent.
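As a side note, the same per-stage container behaviour can also be expressed in scripted pipeline syntax via the Docker Pipeline plugin's `docker.image(...).inside` step. This is a hedged sketch, assuming that plugin is installed, and is not the script used in the demonstration:

```groovy
// Scripted-pipeline equivalent: run steps inside throwaway containers.
// Assumes the Docker Pipeline plugin is installed.
node('agent1') {
    stage('Maven') {
        docker.image('maven:3.6.0-alpine').inside {
            sh 'mvn --version'
        }
    }
    stage('JDK') {
        docker.image('openjdk:11.0.1-jdk').inside {
            sh 'java -version'
        }
    }
}
```

Whichever syntax you choose, the container is created for the duration of the steps and removed afterwards, which is what keeps the build agent clean between jobs.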
About the Author
Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.
Jeremy holds professional certifications for both the AWS and GCP cloud platforms.