
Blue Ocean - Build Package and Publish Docker Image to Docker Hub

Overview

Difficulty: Beginner
Duration: 1h 12m
Students: 52
Rating: 5/5

Description

Introduction

This training course introduces you to Jenkins, a popular open source tool used to perform Continuous Integration and Continuous Delivery.

We spend time early on reviewing the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations ensuring that you become familiar with Jenkins and how to administer it. We’ll demonstrate features such as:

  • Installing and setting up Jenkins
  • Creating and configuring pipelines manually
  • Creating and configuring pipelines using a Jenkinsfile
  • Configuring Jenkins pipelines using the Blue Ocean interface
  • Defining build execution environments using Docker containers
  • Triggering build pipelines manually and automatically
  • Navigating downstream and upstream build projects
  • Connecting to version control repositories such as GitHub
  • Setting up build pipelines for Java-based projects using Gradle
  • Recording artifacts and test results
  • Setting up and scaling out Jenkins with multiple build agents and executors using SSH

Learning Objectives

What you'll learn:

  • The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain building, testing, and deploying your own enterprise software projects
  • How to install, setup, and configure Jenkins pipelines
  • The key differences between Jenkins declarative and scripted pipelines
  • How to manage build artifacts and test results
  • How to scale out Jenkins with master and build agent setups connected over SSH
  • The benefits of codifying pipeline build instructions using a Jenkinsfile
  • How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
  • How to install and use the newer, pipeline-centric Blue Ocean user interface
  • How to integrate and leverage third-party build tools such as Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline

Demonstration

This training course provides many hands-on demonstrations where you will observe firsthand how to use Jenkins to build and release different types of software projects, for example:

  • Building a front-end application which has been developed using the React JavaScript framework, using technologies such as Webpack and Yarn
  • Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring

Prerequisites

  • A basic understanding of CICD, or Continuous Integration and Continuous Delivery
  • A basic understanding of software development and the software development life cycle
  • A basic understanding of version control and associated workflows

Intended Audience

  • Software Build and Release Engineers
  • Software Developers
  • DevOps Practitioners

Transcript

- [Instructor] Okay, welcome back! In this demonstration, we're going to upgrade an existing pipeline that we built in the Blue Ocean user interface. 

So, if you recall, a few demonstrations earlier, we built the devops-webapp2 pipeline. If we open it up in Blue Ocean, you'll recall that it was composed of three stages: a clone stage that does a git clone of our GitHub repository, a build stage that does a Gradle build of the source code, and a publish stage which simply took the outputs of the build stage and archived them. In this particular demonstration, we're going to remodel the pipeline. We're going to update it so that we take our build artifacts out of the build stage and package them into a Docker image. Then we'll push that Docker image up to our public Docker Hub cloudacademydevops repository, which we'll later pull down to our local workstation to spin up the final product. 

Okay, let's go back to Jenkins. We'll remove the existing publish stage. We'll select the build stage. Click on the shell script. We'll delete it, and we'll simply replace it with this particular script. We're establishing a release name for the web archive file. We then paste that name in as part of our Gradle build command. We perform a directory listing on the build/libs directory, and then, importantly, we copy the webapp.war file to the Docker directory. Now, the reason we're doing that is, if we take a look at our source code within GitHub, we have a Docker directory, and inside it is a Dockerfile. The Dockerfile will be used in the next stage within our pipeline to actually build the Docker image. Okay, back to Jenkins. 
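The build stage script described above can be sketched roughly as follows. This is a hedged reconstruction, not the course's exact script: the RELEASE value, the Gradle property flag, and the project layout are assumptions, with build/libs being Gradle's default output directory.

```sh
# Sketch of the build stage shell script (runs inside a Jenkins sh step).
# RELEASE and the -P property name are hypothetical.
RELEASE=webapp
./gradlew clean build -PreleaseName=${RELEASE}   # Gradle build using the release name
ls -al build/libs                                # inspect the generated artifacts
cp build/libs/${RELEASE}.war Docker/             # stage the WAR next to the Dockerfile
```

Copying the WAR into the Docker directory is what allows the next stage's `docker build` to pick it up via the Dockerfile.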

We'll now add in a packaging stage. Under steps, we'll add another shell script, and we'll paste in the following commands. Here, we're instructing the build to change into the Docker directory. We then run docker build, which uses the Dockerfile within the current directory to build the Docker image. We then tag the resulting image, both with the Jenkins build ID for the current job and with the latest tag. And, finally, as part of the script, we call docker images to list all of the Docker images on the host that this particular build ran on, and we should see two new entries for the current build job. 
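The packaging stage commands might look roughly like the following. The image name under the cloudacademydevops namespace is an assumption (the transcript does not show it), while BUILD_ID is a standard Jenkins environment variable.

```sh
# Sketch of the packaging stage shell script; the image name is an assumption.
cd Docker
docker build -t cloudacademydevops/webapp:${BUILD_ID} .   # build from the local Dockerfile
docker tag cloudacademydevops/webapp:${BUILD_ID} cloudacademydevops/webapp:latest
docker images                                             # list images on the build host
```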

Next, we'll add in a final stage. Again, we'll call it "Publish." We add a step, and in this particular stage, we're going to add the "run arbitrary pipeline script" step. Here, we paste in the following script, which needs some explanation. In order for us to push the image up to our Docker Hub cloudacademydevops repository, the pipeline needs to authenticate to Docker Hub. So what we're doing here is setting up the credentials that the pipeline will use to authenticate. The script calls the withCredentials function and passes in a credential ID, in this case "ca-dockerhub." In the background, the pipeline will look in the Jenkins credential store for a credential with this ID, and when it finds it, it sets the username into the DOCKER_USERNAME variable and the password into the DOCKER_PASSWORD variable. 

We then call docker login with those variables, DOCKER_USERNAME and DOCKER_PASSWORD, which authenticates to Docker Hub. Finally, we push the image that we've just built up to the repository twice: once with the Jenkins build ID tag, and once with the latest tag. 
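Put together, the shell portion of the publish step described above would resemble the following sketch. It assumes it runs inside the withCredentials block keyed on the "ca-dockerhub" credential ID, which injects DOCKER_USERNAME and DOCKER_PASSWORD; the image name is again an assumption.

```sh
# Runs inside a withCredentials block ("ca-dockerhub"), which injects
# DOCKER_USERNAME and DOCKER_PASSWORD into the environment.
docker login -u ${DOCKER_USERNAME} -p ${DOCKER_PASSWORD}   # authenticate to Docker Hub
docker push cloudacademydevops/webapp:${BUILD_ID}          # push the build ID tag
docker push cloudacademydevops/webapp:latest               # push the latest tag
```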

Okay, let's open up the classic Jenkins interface in a new browser tab. We click on Jenkins. We then click on Credentials. We click on Global, and then Add Credentials. For the ID, we add in the same credential ID that we're using in our script, ca-dockerhub, and then we set the username and the password. We click OK. We can close this tab now. And then back within our Blue Ocean pipeline, we have everything complete, and we can simply click the Save button. This will then regenerate the Jenkinsfile and commit it back to our repository. We'll give it the following description, and we click Save & Run. So our new build job is up and running. We're now in the build stage, so we're doing the Gradle build. We've done the Docker image packaging, and now we're publishing it into Docker Hub. And here we can see the push of the Docker image as it's going up into our Docker Hub cloudacademydevops repository. 

Okay, so, the complete pipeline has finished. Let's now jump over into Docker Hub, and if we do a reload, you can see that the repository now exists. And if we navigate into it, you can see the repository has two tags, one with the latest tag, and the other with the Jenkins build ID for the current job. So now what we'll do is we'll jump over into the local workstation terminal, and we'll do a Docker pull from the Docker Hub public cloudacademydevops repository with the latest tag for the Docker image that we just built. So we'll kick this off, and this will take a few minutes to complete. 

Okay, the Docker pull to the local workstation has completed successfully. If we run docker images, we can indeed see that we have the latest image that we've just compiled in our updated pipeline. So now we're at a stage where, hopefully, we'll be able to browse to our webapp as soon as we spin up an instance of this container. So let's try it. We'll do a Ctrl + L to clear the terminal. We'll then paste in a docker run command. I won't go into all of the details of that particular command, other than to say that it's listening on port 8000. 
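The local verification steps might look like the following sketch. The image name and Tomcat's default internal port of 8080 are assumptions; the transcript only confirms that the container is reachable on host port 8000.

```sh
docker pull cloudacademydevops/webapp:latest                # pull the published image
docker run -d -p 8000:8080 cloudacademydevops/webapp:latest # map host port 8000 to Tomcat
# then browse to http://localhost:8000/webapp/home
```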

We can see here that Tomcat within our Docker image is now in a running state. Therefore, we should be good to go in terms of jumping back to our browser, starting a new browser tab, and this time browsing to localhost:8000/webapp/home. And, excellent, the Docker image that we compiled within our pipeline has been pulled down successfully, an instance of it has been created, and we're successfully browsing to it. This shows the expressiveness of what you can accomplish within Blue Ocean when creating CICD pipelines. 

Okay, let's jump back to our Jenkins pipeline, and we'll quickly summarize what we accomplished. So within our pipeline, we performed a number of updates. The key ones were to take the output of the build stage, package it into a Docker image, and then push that Docker image up to Docker Hub.

About the Author

Students: 11052
Labs: 28
Courses: 65
Learning paths: 15

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.