This course introduces you to Jenkins, a popular open-source tool used to perform Continuous Integration and Continuous Delivery.
We review the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console in hands-on demonstrations, familiarizing you with Jenkins and how to administer it. We’ll demonstrate features such as:
- Installing and setting up Jenkins
- Creating and configuring pipelines manually
- Creating and configuring pipelines using a Jenkinsfile
- Triggering build pipelines manually and automatically
- Navigating downstream and upstream build projects
- Connecting to version control repositories such as GitHub
- Setting up build pipelines for Java-based projects using Gradle
- Recording artifacts and test results
Learning Objectives
What you'll learn:
- The basic principles of build automation as implemented within Jenkins and how they should be applied to manage and maintain building, testing, and deploying your own enterprise software projects
- How to install, set up, and configure Jenkins pipelines
- The key differences between Jenkins declarative and scripted pipelines
- How to manage build artifacts and test results
- How to integrate and leverage third-party build tools like Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline
Demonstrations
This training course provides many hands-on demonstrations where you will observe firsthand how to use Jenkins to build and release different types of software projects, for example:
- Building a front-end application which has been developed using the React Javascript framework, using technologies such as Webpack and Yarn
- Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring
Prerequisites
- A basic understanding of CI/CD (Continuous Integration and Continuous Delivery)
- A basic understanding of software development and the software development life cycle
- A basic understanding of version control and associated workflows
Intended Audience
- Software Build and Release Engineers
- Software Developers
- DevOps Practitioners
Resources
The following GitHub repo contains sample Jenkins configurations used within the provided demonstrations:
Supporting Documentation
The following supporting Jenkins documentation is available online:
- https://www.jenkins.io/doc/book/
- https://www.jenkins.io/doc/book/pipeline
- https://www.jenkins.io/doc/book/pipeline/syntax
- https://www.jenkins.io/doc/book/pipeline/pipeline-best-practices/
- [Instructor] Welcome back! In this lecture, we'll introduce you to the Jenkins Declarative Pipeline syntax. This syntax is less flexible than the scripted pipeline syntax, but it provides a framework for laying out pipeline workflows in an opinionated manner and is well suited to less complicated pipelines.
Okay, let's begin. As already briefly mentioned, the declarative pipeline syntax aims to guide you through the process of building your own pipelines. It does so by providing a schema that controls the structure of the pipeline or, from an end-user perspective, guides the pipeline author. As seen here in this screenshot, a declarative pipeline is composed of several optional and mandatory sections, with some sections embedded in parent sections. A declarative pipeline always starts with a pipeline section, which in turn must contain an agent section and a stages section. The stages section in turn contains one or more stage sections. Let's now take a quick look at what a stripped-down, basic version of a declarative pipeline looks like, as per the following example.
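The following is a minimal sketch of the kind of pipeline described here; the echo messages are placeholders for illustration rather than the exact content of the on-screen example:

```groovy
pipeline {
    // 'agent any' allows the pipeline, in all of its stages, to run on any available build agent
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building...'   // placeholder: a single build step
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...'    // placeholder: a single test step
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying...'  // placeholder: a single deploy step
            }
        }
    }
}
```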
The key points of this example are as follows. The syntax provides a guiding DSL structure that helps beginners lay out and structure the pipeline. The script always starts with the pipeline keyword. The statement agent any tells Jenkins that the pipeline, in all of its stages, can be executed on any build agent. The pipeline is composed of three individual stages: build, test, and deploy. Each stage may consist of one or many build steps; in this case, a single step exists per stage.
Let's now take a closer look at a couple of real-world examples using the declarative pipeline syntax. In the first example, we use a declarative pipeline to build a NodeJS application, leveraging NPM to perform the core build, test, and deployment. The key points within this pipeline script are as follows. The tools keyword is used to inform Jenkins how to install the NodeJS tooling on the build agent that the pipeline executes on. In this case, we would have preinstalled the NodeJS Jenkins plugin and then preconfigured it within the Global Tool Configuration section, informing Jenkins how to set up the NodeJS tooling so that our pipeline script has access to the NPM executable.
The pipeline script is composed of four stages: cloning, dependencies, test, and deploy. The cloning stage performs a git clone of the project source code into the workspace allocated to the pipeline. Each remaining stage executes a corresponding NPM script command.
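As a rough sketch of what such a script might look like (the repository URL, the tool name 'nodejs', and the exact npm script names are assumptions for illustration, not the exact demonstration script):

```groovy
pipeline {
    agent any
    tools {
        // 'nodejs' must match a NodeJS installation name defined under
        // Manage Jenkins -> Global Tool Configuration (requires the NodeJS plugin)
        nodejs 'nodejs'
    }
    stages {
        stage('Cloning') {
            steps {
                // hypothetical repository URL for illustration
                git 'https://github.com/example/sample-node-app.git'
            }
        }
        stage('Dependencies') {
            steps {
                sh 'npm install'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
        stage('Deploy') {
            steps {
                // hypothetical npm script; the real name depends on the project's package.json
                sh 'npm run deploy'
            }
        }
    }
}
```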
In the next example, we again use a declarative pipeline, but this time to build a Java application, leveraging Maven to perform the core build, testing, and Docker image packaging and publishing. The key points of this declarative pipeline script are as follows. The agent section dictates that the pipeline must execute on a build agent tagged with the docker label. The environment section sets up script variables used later in the pipeline script. The pipeline consists of two stages: build and publish. The build stage overrides and resets the agent that it must execute on; in this case, it leverages a Maven Docker container.
The build stage finishes by running its post section, which captures all of the compiled JAR files located in the target subdirectory. Each discovered JAR file becomes a build artifact, which can be referenced in later stages within the same pipeline. The publish stage only executes if the current source control branch is the master branch, as specified in the when section.
Finally, the publish stage incorporates a multi-line step declaration, which is useful for longer build scripts. This multi-line build step also utilizes string interpolation to reference the previously assigned Image and Version pipeline environment variables.
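A rough sketch of a pipeline with this shape follows; the environment variable names and values, the Maven image tag, and the docker build and push commands are assumptions for illustration rather than the exact demonstration script:

```groovy
pipeline {
    // run the overall pipeline on an agent tagged with the 'docker' label
    agent { label 'docker' }
    environment {
        // script variables referenced later via string interpolation;
        // names and values here are assumptions for illustration
        IMAGE   = 'example/sample-java-app'
        VERSION = '1.0.0'
    }
    stages {
        stage('Build') {
            // override the agent for this stage: run inside a Maven container
            agent {
                docker {
                    image 'maven:3-jdk-11'   // assumed image tag
                    reuseNode true
                }
            }
            steps {
                sh 'mvn clean package'
            }
            post {
                success {
                    // capture every compiled JAR under target/ as a build artifact
                    archiveArtifacts artifacts: 'target/*.jar', allowEmptyArchive: true
                }
            }
        }
        stage('Publish') {
            // only publish when building the master branch
            when {
                branch 'master'
            }
            steps {
                // multi-line step using string interpolation of the IMAGE and VERSION variables
                sh """
                  docker build -t ${IMAGE}:${VERSION} .
                  docker push ${IMAGE}:${VERSION}
                """
            }
        }
    }
}
```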
Okay, go ahead and close this lecture, and we'll see you shortly in the next one.
Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, Azure, GCP), Security, Kubernetes, and Machine Learning.
Jeremy holds professional certifications for AWS, Azure, GCP, Terraform, Kubernetes (CKA, CKAD, CKS).