
Build Pipelines - Declarative

Overview

Difficulty: Intermediate
Duration: 1h 47m
Students: 111
Rating: 5/5

Description

Introduction

This training course introduces you to Jenkins, a popular open source tool used to perform Continuous Integration and Continuous Delivery.

We spend time early on reviewing the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations ensuring that you become familiarised with Jenkins and how to administer it. We’ll demonstrate features such as:

  • Installing and setting up Jenkins
  • Creating and configuring pipelines manually
  • Creating and configuring pipelines using a Jenkinsfile
  • Configuring Jenkins pipelines using the Blue Ocean interface
  • Defining build execution environments using Docker containers
  • Triggering build pipelines, manually and automatically
  • Navigating downstream and upstream build projects
  • Connecting to version control repositories such as GitHub
  • Setting up build pipelines for Java based projects using Gradle
  • Recording artifacts and test results
  • Setting up and scaling out Jenkins with multiple build agents and executors using SSH

Learning Objectives

What you'll learn:

  • The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain the building, testing, and deployment of your own enterprise software projects
  • How to install, setup, and configure Jenkins pipelines
  • The key differences between Jenkins declarative and scripted pipelines
  • How to manage build artifacts and test results
  • How to scale out Jenkins with Master and Build Agent setups over SSH
  • The benefits of codifying pipeline build instructions using a Jenkinsfile
  • How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
  • How to install and use the newer, pipeline-centric Blue Ocean user interface
  • How to integrate and leverage third-party build tools such as Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline

Demonstration

This training course provides many hands on demonstrations where you will observe first hand how to use Jenkins to build and release different types of software projects, for example:

  • Building a front-end application developed using the React JavaScript framework, using technologies such as Webpack and Yarn
  • Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then finally release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring

Prerequisites

  • A basic understanding of CICD, or Continuous Integration and Continuous Delivery
  • A basic understanding of software development and the software development life cycle
  • A basic understanding of version control and associated workflows

Intended Audience

  • Software Build and Release Engineers
  • Software Developers
  • DevOps Practitioners

Transcript

- [Instructor] Okay, welcome back! In this demonstration, we're going to build another pipeline but, instead of using a scripted pipeline, we're going to configure it as a declarative pipeline, where the script is based on the declarative pipeline schema. Now, when we do this, the schema gives us guidance as to how we structure our pipeline workflow, whereas a scripted pipeline gives you full flexibility and basically allows you to program the entire pipeline. A declarative pipeline, instead, gives you less control, but gives you more guidance. So, let's get into it. 
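To make that contrast concrete, here is a minimal sketch (not the script from this demonstration) of the two top-level shapes a pipeline script can take:

```groovy
// Scripted pipeline: starts with the node keyword, full Groovy flexibility
node {
    stage('Build') {
        sh './gradlew build'
    }
}

// Declarative pipeline: starts with the pipeline keyword,
// structured according to the declarative pipeline schema
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew build'
            }
        }
    }
}
```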

We'll click New Item. We'll call it BuildJob7. We select Pipeline, click OK. We'll click on the Pipeline tab to be taken down to the pipeline area. We leave the definition as pipeline script. And, this time, we're going to create a declarative pipeline. So we begin with the keyword pipeline. Previously, we began with the keyword node, but a declarative pipeline starts with the keyword pipeline. Curly brackets. And, at this stage, we begin to build out the rest of the pipeline using the declarative schema. Now, to help us, we can click on the Pipeline Syntax link down the bottom. This opens up another tab, and from here we can click on the Declarative Directive Generator link. Now, this is a useful utility to help us generate our declarative script. So, let's begin. We'll start with an agent, and we'll specify a label here, and we'll state the label to be master. We click on the Generate Declarative Directive button, and Jenkins generates the actual script, which we can then go and paste into our pipeline. So, what we'll do is we'll copy that. 
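The agent directive generated at this point would look something like the following (assuming the label master, as selected in the generator):

```groovy
// Run this pipeline's stages on a node labelled 'master'
agent {
    label 'master'
}
```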

We'll also jump into the terminal. We'll create a new directory called BuildJob7. We'll cd into that. And from here, we'll start up Visual Studio Code. We'll now create a new file called Jenkinsfile. Enter. So I'm using Jenkinsfile because, hopefully, Visual Studio Code will give me some formatting capabilities on the declarative pipeline script as we weave it together. So, we'll go back to the directive generator. We'll copy out the agent block, and we'll paste it into our Jenkinsfile. The next thing we'll do is we'll generate a tools block. So, we select tools, we click on add, and we want to add Gradle. Now, the generator is clever enough to know that we've already got a tool configured within the Manage Jenkins Global Tool Configuration area, and that is the gradle-4.10.2 tool that we've previously configured. So we'll select that and we'll click generate again. We'll copy this block of code. We'll go back to Visual Studio Code, into our Jenkinsfile, and we'll paste it. Okay, the next directive we'll use is an environment directive. So we'll click add, and we'll create an environment variable called VERSION and we'll set the value to jellybean. 
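The generated tools and environment directives would look roughly like this; the tool name must match the Global Tool Configuration entry, and the variable name VERSION is an assumption based on how it's described in the transcript:

```groovy
tools {
    // Must match the tool name configured under
    // Manage Jenkins > Global Tool Configuration
    gradle 'gradle-4.10.2'
}

environment {
    // Arbitrary variable, used later in a when condition
    VERSION = 'jellybean'
}
```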

We click the generate button. We then copy the environment block. Back to our Jenkinsfile, and we'll paste again. So, as you can see, we go through this type of workflow where we generate, copy, paste, generate, copy, paste. So, we'll go back to our generator. This time we'll click on stage and we'll create a new stage. This one we'll call checkout, 'cause we're going to use this stage to check out the code from GitHub. The stage will contain steps. We click the generate button again and we copy the stage block. Paste. We'll generate a second stage. This one we'll call details. We'll click add, and this time we'll add in a when condition. The when condition will be based on the environment variable that we set up previously. The environment variable name that we created was VERSION and the value that we set it to was jellybean. So when this environment variable matches that value, the stage will be executed. We click the generate button again and we copy the resulting stage block. 
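The generated details stage, with its when condition, would look roughly like the following sketch. The echo steps are filled in from the later description of this stage, and the variable name VERSION is assumed:

```groovy
stage('Details') {
    // Only execute this stage when VERSION equals 'jellybean'
    when {
        environment name: 'VERSION', value: 'jellybean'
    }
    steps {
        echo "Build ID: ${env.BUILD_ID}"
        echo "Jenkins URL: ${env.JENKINS_URL}"
        echo "Version: ${env.VERSION}"
    }
}
```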

We go back to the generator and we will create a third stage. This one will be our gradle build stage. We click generate again, and we copy the resulting stage block. Okay, so we've got our three stages. We've got our agent, tools and environment blocks. So, we have to do some final formatting. We have to wrap it all in our pipeline keyword to indicate that this is a declarative script. We'll put all of our generated declarative script code inside this. And then, finally, we need to actually fill out each of the steps. So, for the checkout, I'll paste in a block of code that does a checkout from github. For details, I'm going to run a couple of echo statements. All we're going to do is echo out the build ID, the Jenkins URL, and the version variable that we set up here. And then finally, for the build stage, we'll just run our gradle build tool. So one final thing we need to do within our declarative pipeline script here is to add in a stages block. And then our three stages that we generated and updated need to sit within that. 
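Assembled together, the complete Jenkinsfile would look roughly like the following sketch. The repository URL is a placeholder, and the exact checkout and build steps the instructor pastes in may differ:

```groovy
pipeline {
    // Run on the node labelled 'master'
    agent {
        label 'master'
    }
    // Gradle tool previously configured in Global Tool Configuration
    tools {
        gradle 'gradle-4.10.2'
    }
    // Arbitrary environment value, referenced by the when condition below
    environment {
        VERSION = 'jellybean'
    }
    stages {
        stage('Checkout') {
            steps {
                // Placeholder repository URL
                git branch: 'master', url: 'https://github.com/example/project.git'
            }
        }
        stage('Details') {
            // Skipped unless VERSION equals 'jellybean'
            when {
                environment name: 'VERSION', value: 'jellybean'
            }
            steps {
                echo "Build ID: ${env.BUILD_ID}"
                echo "Jenkins URL: ${env.JENKINS_URL}"
                echo "Version: ${env.VERSION}"
            }
        }
        stage('Build') {
            steps {
                // Uses the Gradle install provided by the tools directive
                sh 'gradle build'
            }
        }
    }
}
```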

Okay, so I think our declarative script is completed. We'll copy all of it. We'll go back into our pipeline script and replace it completely, we'll click apply, click save and, at this stage, we're good to go. We click on build now and a build is scheduled. And, so, we're off and running. So we click the progress bar, and we can see the build as it happens. Okay, so it's completed successfully. If we go back to the project, we can see that each of the three stages has completed and that the tool install has also completed successfully. Okay, if we go back to configure, this time we'll update the when clause on the details stage, and we'll change the expected value to chocolate. So this time, this stage should be skipped, because the current VERSION environment variable is equal to jellybean. 
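The modified when clause would then read roughly as follows (again assuming the variable is named VERSION):

```groovy
when {
    // VERSION is still 'jellybean', so this stage is now skipped
    environment name: 'VERSION', value: 'chocolate'
}
```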

We'll click apply, click save, and we'll re-execute the build job. In here you can indeed see the details stage has been skipped. So that shows you how to use when conditions within a declarative pipeline script. Okay, so let's quickly summarize what we've accomplished. We'll click back into configure and scroll down to the pipeline script area. So, we authored a declarative pipeline. A declarative pipeline begins with the pipeline keyword. We set up the agent so that this particular build job runs on the master agent. We leveraged the Gradle tool that we had previously configured within Manage Jenkins Global Tool Configuration. We specified an arbitrary environment value. 

We then set up three stages: the checkout stage, the details stage, and the build stage. The checkout stage did a checkout from our repository; it checked out the master branch and set the workspace that it gets checked out into. In the details stage, we simply echoed out the build ID, the Jenkins URL, and the environment version. And then finally, in the build stage, we did a Gradle build.

About the Author

Students: 10941
Labs: 28
Courses: 65
Learning paths: 15

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.