This course introduces you to Jenkins, a popular open-source tool used to perform Continuous Integration and Continuous Delivery.
We spend time early on reviewing the key Jenkins features and associated terminology. We then take a deep dive into configuring Jenkins to perform automated builds using the Jenkins web administration console, with hands-on demonstrations ensuring that you become familiar with Jenkins and how to administer it. We'll demonstrate features such as:
- Installing and setting up Jenkins
- Creating and configuring pipelines manually
- Creating and configuring pipelines using a Jenkinsfile
- Triggering build pipelines, manually and automatically
- Navigating downstream and upstream build projects
- Connecting to version control repositories such as GitHub
- Setting up build pipelines for Java-based projects using Gradle
- Recording artifacts and test results
What you'll learn:
- The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain building, testing, and deploying your own enterprise software projects
- How to install, set up, and configure Jenkins pipelines
- The key differences between Jenkins declarative and scripted pipelines
- How to manage build artifacts and test results
- How to integrate and leverage third-party build tools such as Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline
This training course provides many hands-on demonstrations where you will observe first-hand how to use Jenkins to build and release different types of software projects. For example:
- Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a Web Archive (WAR) file, and then finally release it into a Tomcat-based Docker image, complete with Splunk-based instrumentation for logging and monitoring
Prerequisites:
- A basic understanding of CI/CD, or Continuous Integration and Continuous Delivery
- A basic understanding of software development and the software development life cycle
- A basic understanding of version control and associated workflows
Intended audience:
- Software Build and Release Engineers
- Software Developers
- DevOps Practitioners
The following GitHub repo contains sample Jenkins configurations used within the provided demonstrations:
The following supporting Jenkins documentation is available online:
- [Instructor] Okay, welcome back! In this demonstration, we'll set up another build job as a pipeline using the scripted pipeline syntax.
In this demonstration, I want to focus on some of the features of the scripted pipeline syntax. So let's begin. We'll click on New Item, enter BuildJob5, select Pipeline, and click OK. So we start with the keyword node, to indicate that it's a scripted pipeline and not a declarative pipeline, then curly brackets, and then we'll add in a stage. The stage is going to calculate something, so: curly brackets, and then "Stage code here", which is a comment. So we go back to the top of our script. Now, one of the features I want to show you that is very useful when building scripted pipelines is the ability to create functions, or Groovy functions, that can be called within the stages. So here we define a method that returns an int. We'll call it fibonacci, because we're going to calculate the Fibonacci number for the value n that we pass it. Curly brackets, and then we define the function itself: if n is less than two, we'll return one; otherwise we recursively call the function for n minus one, plus, again calling itself recursively, the value n minus two.
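As a sketch, the scripted pipeline typed out so far might look like the following (the stage name and exact layout are approximations of what's on screen in the demo, not the verbatim script):

```groovy
// Scripted pipelines start with 'node' rather than the
// declarative 'pipeline' keyword.
node {
    stage('calculate') {
        // Stage code here - this is a comment
    }
}

// Custom Groovy function defined outside the node block:
// a recursive Fibonacci calculation for the value n.
int fibonacci(int n) {
    if (n < 2) {
        return 1  // as typed in the demo; corrected later in this lesson
    }
    return fibonacci(n - 1) + fibonacci(n - 2)
}
```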
Okay, back within our node, I'm going to define a workspace variable, and this is going to be set to the inbuilt function called pwd, which will return the workspace path associated with this build job. The next thing I want to show you is how we can use those variables within string statements, and this is done using string interpolation. So, using double quotes, we will echo out "workspace equals", and then we'll refer to our variable here, and we do so by using a dollar sign, a curly bracket, the name of the variable, and then a closing curly bracket. Okay, we'll define a couple of other variables: one called nine, which we'll set to 9, and another one called ten, which we'll set to the variable nine plus one. Okay, now we'll flesh out our stage. The next thing I want to show you that is quite cool when doing scripted pipelines is the concept of exception handling, and we can use exception handling with a try-catch block. So: try, curly bracket, closing curly bracket, and then catch, Exception, open curly bracket, close curly bracket. And here, we'll just echo out "some exception just happened".
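The workspace variable, the interpolated echo, and the two numeric variables described above might be sketched as follows (variable names taken from the narration):

```groovy
node {
    // pwd() is an inbuilt pipeline step that returns the
    // workspace path associated with this build job
    def workspace = pwd()

    // double quotes enable Groovy string interpolation via ${...}
    echo "workspace = ${workspace}"

    def nine = 9
    def ten = nine + 1
}
```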
Okay, now within our try block, we'll add in some flow control, using an if-else statement. So if ten is greater than nine, which we know it is, curly bracket, close curly bracket, then we will echo out the Fibonacci calculation of ten. To do so, we use string interpolation again, but here we'll actually take the name of the function that we defined up here, open bracket, close bracket, and we'll pass in the variable ten. We'll then call the shell command and, with returnStdout set to true, run a script that doesn't exist. We're doing this intentionally: the reason we're calling a script which doesn't exist is to trigger an exception, which will be caught by the catch block.
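Putting the flow control and exception handling together, the body of the stage might look something like this sketch (the non-existent script name here is a stand-in for whatever was typed in the demo):

```groovy
try {
    if (ten > nine) {
        // call our custom function via string interpolation
        echo "fibonacci: ${fibonacci(ten)}"
    }
    // returnStdout: true captures the command's standard output;
    // the script deliberately doesn't exist, so this step throws
    sh returnStdout: true, script: './script-which-doesnt-exist.sh'
} catch (Exception e) {
    echo 'some exception just happened'
}
```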
Okay, so I think everything is in place. We'll apply the pipeline script, save it, and click Build Now to execute it. It's scheduled; it's already run, and it's failed. We'll click on the console output and have a look at what has happened. So, we've got "no such DSL method stage". Okay, let's jump back into our script and try to work out what went wrong. Okay, the problem is that stage needs to be in lowercase, so my apologies. Apply, save, and we'll click Build Now again. It's scheduled, and this time it's worked.
Okay, let's look at the console output, and you can see a few things. So, workspace has indeed returned the path from the inbuilt pwd function, using string interpolation. We also calculated the Fibonacci number for the value ten; in this case, it is equal to 89. We've attempted to call a script which we know doesn't exist, and indeed our try-catch block has caught the exception and printed it out. This has allowed the rest of the pipeline to go on and finish with a SUCCESS statement.
Let's go back to the project page. Again, we can see that our project has successfully completed, and that if we didn't have the try-catch block, it wouldn't have completed; we would've got another failure. So, finally, we'll go back to our pipeline, and again we'll just summarize what we went through. We have the ability to create custom functions outside of the node block. We can do string interpolation, where we can print out a variable within double quotes. We can use try-catch blocks to catch any exceptions that are thrown during the execution of the build job.
And finally, any exception that is caught, we can actually recover from and allow the build job to complete successfully. Okay, one last thing for this particular demonstration: I wanted to review the Fibonacci number that we were calculating, so we returned 89 for the tenth number. Now, I was looking at the Wikipedia page to find out what they calculated for F of 10, and in their case, it is 55, whereas the function I implemented within our Jenkins build pipeline was returning 89. So, going back to our build job, there was a minor mistake in the way that the Fibonacci number was calculated. Previously, I had the following: if n is less than two, then return one, whereas the correct implementation is: if n is less than two, then return n.
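The one-line fix, side by side as a Groovy sketch: with return 1, both base cases F(0) and F(1) evaluate to 1, which shifts the sequence by one place and yields 89 rather than 55 for F(10).

```groovy
// Incorrect: both base cases return 1, so the sequence is shifted
int fibonacciOld(int n) {
    if (n < 2) { return 1 }
    return fibonacciOld(n - 1) + fibonacciOld(n - 2)
}

// Correct: F(0) = 0 and F(1) = 1, giving 55 for F(10)
int fibonacci(int n) {
    if (n < 2) { return n }
    return fibonacci(n - 1) + fibonacci(n - 2)
}
```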
Okay, so I've saved that, and then I re-triggered the build. And this time, F of 10 indeed returns 55 which is the correct value for F of 10.
Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, GCP, Azure), Security, Kubernetes, and Machine Learning.
Jeremy holds professional certifications for AWS, GCP, Terraform, Kubernetes (CKA, CKAD, CKS).