
Build Pipelines - Maven

Overview
Difficulty: Intermediate
Duration: 1h 47m
Students: 145
Rating: 5/5

Description

Introduction

This training course introduces you to Jenkins, a popular open source tool used to perform Continuous Integration and Continuous Delivery.

We spend time early on reviewing the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations ensuring that you become familiarised with Jenkins and how to administer it. We'll demonstrate features such as:

  • Installing and setting up Jenkins
  • Creating and configuring pipelines manually
  • Creating and configuring pipelines using a Jenkinsfile
  • Configuring Jenkins pipelines using the Blue Ocean interface
  • Defining build execution environments using Docker containers
  • Triggering build pipelines, manually and automatically
  • Navigating downstream and upstream build projects
  • Connecting to version control repositories such as GitHub
  • Setting up build pipelines for Java based projects using Gradle
  • Recording artifacts and test results
  • Setting up and scaling out Jenkins with multiple build agents and executors using SSH

Learning Objectives

What you'll learn:

  • The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain building, testing, and deploying your own enterprise software projects
  • How to install, setup, and configure Jenkins pipelines
  • The key differences between Jenkins declarative and scripted pipelines
  • How to manage build artifacts and test results
  • How to scale out Jenkins with master and build agent setups connected over SSH
  • The benefits of codifying pipeline build instructions using a Jenkinsfile
  • How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
  • How to install and use the newer, pipeline-centric Blue Ocean user interface
  • How to integrate and leverage third-party build tools such as Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline

Demonstration

This training course provides many hands-on demonstrations where you will observe first-hand how to use Jenkins to build and release different types of software projects, for example:

  • Building a front-end application developed using the React JavaScript framework, using technologies such as Webpack and Yarn
  • Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then finally release it into a Tomcat-based Docker image, complete with Splunk-based instrumentation for logging and monitoring

Prerequisites

  • A basic understanding of CICD, or Continuous Integration and Continuous Delivery
  • A basic understanding of software development and the software development life cycle
  • A basic understanding of version control and associated workflows

Intended Audience

  • Software Build and Release Engineers
  • Software Developers
  • DevOps Practitioners

Transcript

- [Instructor] Okay, welcome back! In this demonstration we're going to introduce you to a new type of build job. This time we're going to use a pipeline. So let's begin.

We'll click on New Item. We'll give it the name BuildJob4. And this time, instead of choosing Freestyle project, we'll go with Pipeline. If you have a CICD workflow that is composed of multiple stages, then using a pipeline is the way to go. So the next thing we'll do is click on OK. And for this first example we'll go straight down to the Pipeline section, and if you haven't used one before, Jenkins provides a convenient method to create a sample pipeline for you. So we'll click on the try sample Pipeline dropdown and select the GitHub + Maven example. So let's go through this script. The script starts with the node keyword. This informs the pipeline script that it's a scripted pipeline and not a declarative pipeline. A declarative pipeline would begin with the pipeline keyword.
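The scripted-versus-declarative distinction mentioned here can be sketched with two minimal skeletons. These are illustrative only; the stage name and echo step are placeholders, not part of the course's example:

```groovy
// Scripted pipeline: starts with the node keyword and
// uses ordinary Groovy control flow throughout
node {
    stage('Example') {
        echo 'running a scripted pipeline step'
    }
}

// Declarative pipeline: starts with the pipeline keyword
// and follows a fixed, predefined block structure
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'running a declarative pipeline step'
            }
        }
    }
}
```

Only one of the two styles is used per Jenkinsfile; Jenkins infers which flavour it is parsing from that opening keyword.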

Now we know that a scripted pipeline provides more flexibility in the way that we structure our script, but less guidance. The next thing we notice is that we can define variables. Again, a scripted pipeline uses the Groovy language to lay out the script. We have stages. The first stage is our Preparation stage, and this does a Git clone of a sample Maven project hosted on GitHub. It then sets up our Maven tooling; we'll come back to this later on. The following stage is the one that does the compilation. Within it, we can see that there are some control statements, an if/else statement, to determine whether it is running on a Unix build agent, in which case we should run this command; otherwise it's probably running on a Windows build agent, and therefore we should run this build command instead. Finally we have the Results stage.
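The script being walked through here is Jenkins' built-in "GitHub + Maven" sample, which looks approximately as follows; the exact text Jenkins generates may vary by version, so treat this as a sketch rather than a verbatim copy:

```groovy
node {
    def mvnHome
    stage('Preparation') {
        // Clone a sample Maven project hosted on GitHub
        git 'https://github.com/jglick/simple-maven-project-with-tests.git'
        // Look up the Maven tool; 'M3' must be configured
        // under Global Tool Configuration
        mvnHome = tool 'M3'
    }
    stage('Build') {
        // Run the Maven build with the appropriate shell for the agent's OS
        if (isUnix()) {
            sh "'${mvnHome}/bin/mvn' -Dmaven.test.failure.ignore clean package"
        } else {
            bat(/"${mvnHome}\bin\mvn" -Dmaven.test.failure.ignore clean package/)
        }
    }
    stage('Results') {
        // Capture JUnit test reports and archive the built JAR files
        junit '**/target/surefire-reports/TEST-*.xml'
        archiveArtifacts 'target/*.jar'
    }
}
```

Note the three stages match what the instructor describes: Preparation (Git clone plus Maven tool lookup), Build (the OS-dependent compile), and Results (test capture and artifact archiving).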

What we're doing here is specifying the location of our JUnit test results, which will be captured by the Jenkins build when it's executed, and specifying that we want to archive, from the target folder, all JAR files that have been created. So we'll go here and click Apply, then Save. We'll then click the Build Now link. Now this build will fail, because we haven't set up the Maven tooling; I want to show you this before I do set it up. So there, it's failed. We'll navigate into the console output. When we scroll down, we can see that an error has indeed occurred, and that it's reporting no tool named M3 found. So let's now fix this. We'll click on the Back to Project link, click on Configure, and scroll back down to the script.

Now what we need to set up is the M3 tool here. The way we do this is to take the identifier, which is M3, go back to Jenkins, and click on Manage Jenkins. Then we click on the Global Tool Configuration item. Here we scroll down to Maven and click the Add Maven button. We add a new identifier matching the one we took from our script, ensure that the Install automatically option is enabled, and install the latest version, 3.6.0. We click Apply, and then we click Save. We'll go back to the Jenkins homepage, and then on BuildJob4, we'll execute a second build job. We'll click on the progress bar to be taken into the console output. And this time we can see that Maven has now kicked in and is actually doing the build. Okay.
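The reason this fix works is that the string passed to the tool step is looked up against the Name field of the Maven installations defined in Global Tool Configuration. A minimal sketch of that relationship, assuming an installation named M3 has been configured as described above:

```groovy
node {
    // 'M3' must exactly match the Name given to the Maven installation
    // under Manage Jenkins → Global Tool Configuration → Maven;
    // any mismatch produces the "no tool named M3 found" error seen earlier
    def mvnHome = tool 'M3'

    // The returned path points at the installed Maven home directory,
    // so its bin/mvn can be invoked directly
    sh "'${mvnHome}/bin/mvn' --version"
}
```

With "Install automatically" enabled, Jenkins downloads the selected Maven version onto the agent the first time the tool is requested, which is why the second build succeeds without any manual installation on the build machine.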

So the build job has completed, and everything looks good. We'll scroll back up to the top, and we'll go back to the Project View. So let's review our dashboard for BuildJob4. Build #2 has run. This time, each stage, the Preparation stage, the Build stage, and the Results stage, has completed successfully as per the green boxes, whereas the previous run failed with a red box. From this page, we can see that we've got our artifact that has been stored. We can download this. And if we scroll down, we can also see the latest test result. So if we click on this, it will take us into the test results for this particular build job. We can click on the package, navigate into a class, into a test, and we can see that the test has passed. Okay, so in summary, we've shown you how to quickly set up a pipeline, if you've never used one before.

You can do so by loading the sample Maven pipeline script that Jenkins provides. When we did that, we also showed you how to set up global tooling, in this case, how to configure the Maven tool with the M3 identifier that was used within the sample Maven script.

About the Author

Students: 11957
Labs: 28
Courses: 65
Learning paths: 14

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.