This training course introduces you to Jenkins, a popular open source tool used to perform Continuous Integration and Continuous Delivery.
We spend time early on reviewing the key Jenkins features and associated terminology. We then take you through a deep dive into configuring Jenkins to perform automated builds, using the Jenkins web administration console in hands-on demonstrations, ensuring that you become familiarised with Jenkins and how to administer it. We'll demonstrate features such as:
- Installing and setting up Jenkins
- Creating and configuring pipelines manually
- Creating and configuring pipelines using a Jenkinsfile
- Configuring Jenkins pipelines using the Blue Ocean interface
- Defining build execution environments using Docker containers
- Triggering build pipelines, manually and automatically
- Navigating downstream and upstream build projects
- Connecting to version control repositories such as GitHub
- Setting up build pipelines for Java based projects using Gradle
- Recording artifacts and test results
- Setting up and scaling out Jenkins with multiple build agents and executors using SSH
What you'll learn:
- The basic principles of build automation as implemented within Jenkins, and how they should be applied to manage and maintain building, testing, and deploying your own enterprise software projects
- How to install, setup, and configure Jenkins pipelines
- The key differences between Jenkins declarative and scripted pipelines
- How to manage build artifacts and test results
- How to scale out Jenkins using Master and Build Agent setups using SSH
- The benefits of codifying pipeline build instructions using a Jenkinsfile
- How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
- How to install and use the newer, pipeline-centric Blue Ocean user interface
- How to integrate and leverage third-party build tools like Gradle, Maven, Yarn, Webpack, and many more within a Jenkins pipeline
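As a taste of what is covered, here is a minimal sketch of a declarative Jenkinsfile of the kind the course works with. The repository URL, agent label, Gradle task, and artifact paths below are illustrative placeholders, not taken from any specific course demo:

```groovy
// Hypothetical declarative pipeline: clone, build, and archive a Gradle project
pipeline {
    agent { label 'Agent1' }   // run on a build agent carrying this label (assumed)
    stages {
        stage('Clone') {
            steps {
                // placeholder repository URL
                git url: 'https://github.com/example/devops-webapp1.git'
            }
        }
        stage('Build') {
            steps {
                sh './gradlew build'   // compile, test, and package via Gradle
            }
        }
        stage('Archive') {
            steps {
                archiveArtifacts artifacts: 'build/libs/*.war'  // record the WAR artifact
                junit 'build/test-results/**/*.xml'             // record test results
            }
        }
    }
}
```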
This training course provides many hands-on demonstrations where you will observe first-hand how to use Jenkins to build and release different types of software projects, for example:
- Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a Web Archive (WAR) file, and then finally release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring
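As an illustrative sketch of that release step (the WAR filename, path, and Tomcat image tag are assumptions, not taken from the demo project), the final Tomcat-based image might be described with a Dockerfile along these lines:

```dockerfile
# Hypothetical Dockerfile: release the Gradle-built WAR into a Tomcat base image
FROM tomcat:9-jdk11
# Deploy the packaged WAR as the root web application (path and name assumed)
COPY build/libs/webapp.war /usr/local/tomcat/webapps/ROOT.war
```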
Prerequisites:
- A basic understanding of CI/CD, or Continuous Integration and Continuous Delivery
- A basic understanding of software development and the software development life cycle
- A basic understanding of version control and associated workflows
Intended audience:
- Software Build and Release Engineers
- Software Developers
- DevOps Practitioners
- [Instructor] Okay, welcome back! In this next demonstration, I'm going to introduce you to Blue Ocean.
Blue Ocean is a new user experience for Jenkins. Now before we get started, I recommend that you go to the Blue Ocean documentation page found at the following location. There's some really good documentation, and there's a really neat video that will highlight all of the powerful features that Blue Ocean provides. Okay, let's jump into our Jenkins environment, and the first thing we'll do is go to Manage Jenkins. We'll go down to Manage Plugins, and we need to install the Blue Ocean plugin. So, we go to the Available tab, we'll search for Blue Ocean, we'll select it, and we'll do an Install without restart. It takes around one to two minutes, depending on your internet connection, and because I'm running this on EC2 infrastructure, it's very quick thanks to the high-speed networks. The installation has completed.
Okay, back on the Jenkins homepage we can see that we're now presented with a link to the Blue Ocean interface. Let's open it up, and here we're presented with the new Blue Ocean user experience. It's very much centered around the concept of pipelines. Let's create our first pipeline. We'll indicate that we're going to build a project hosted on GitHub. We need to provide a GitHub access token, so let's head over to GitHub. This is the project that we're going to build; it's forked from the original Cloud Academy devops-webapp project. The only difference here is the presence of a Jenkinsfile. We'll come back to that, but first let's set up our access token. So, we'll click on Settings, click on Developer settings, Personal access tokens, and then we need to Generate a new token. We'll give it a name, in this case we'll call it JenkinsBlueOcean. And we need to provide it some permissions, these are scopes. So, we'll give it the repo scope and also the ability to access the user's email address.
We then click Generate token. So we copy this, we head over into Jenkins, we'll paste it here, and we'll Connect. This then allows Blue Ocean to interrogate the organizations that this user account belongs to; I have two organizations. I'll click the jeremycook123 organization, and then it presents all the repositories within that organization. So here I'm going to select devops-webapp1, and I click Create Pipeline. This allows Blue Ocean to go ahead and create our pipeline for us, and it will do an initial build. Often that first build fails due to some sort of timing issue, but if we click it and do a replay, the second execution of this pipeline will work. So here we can see the stages. The Clone stage is completed, it's now doing the Build stage, and it finishes with the Archive stage. So we can click on the Clone stage, we can open it up, and we can see the details. Likewise for Build, this is our Gradle build, and we can see all of the details. And finally, the Archive stage.
We have the Artifacts available, so here is our artifact, it has been archived for us. Let's return to the project on GitHub, and we'll take a look at the Jenkinsfile. So, here the Jenkinsfile is a scripted pipeline, with a couple of things we haven't seen before. We're indicating a timeout for the build job, so the build job has to complete within 60 seconds; if not, it times out. We indicate that we want the build job to execute on our build agent labeled with the Agent1 label. And we're setting a property to trigger the pipeline based on polling the GitHub repo every minute on days Monday through Friday, to determine whether any new commits have been added to the repository. Only if any are found will it trigger the pipeline.
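Putting those pieces together, a scripted Jenkinsfile with that shape might look roughly like the following. This is a sketch reconstructed from the description above, so the exact stage names, Gradle task, and artifact path are assumptions:

```groovy
// Hypothetical scripted pipeline matching the behaviour described above
properties([
    // Poll the GitHub repo every minute, Monday through Friday (1-5);
    // the pipeline is only triggered when new commits are found
    pipelineTriggers([pollSCM('* * * * 1-5')])
])

node('Agent1') {                          // run on the agent labelled Agent1
    timeout(time: 60, unit: 'SECONDS') {  // time the build out after 60 seconds
        stage('Clone') {
            checkout scm                  // clone the repository configured for this job
        }
        stage('Build') {
            sh './gradlew build'          // Gradle build (task name assumed)
        }
        stage('Archive') {
            archiveArtifacts artifacts: 'build/libs/*.war'
        }
    }
}
```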
So what we'll do now is make an in-line edit; we'll simply update the comment to comment1, we'll commit the changes, we'll quickly go back to our pipeline, and we'll watch for run number three to automatically be triggered within a minute, because we've made a commit back to our repository. So we'll give it some time. Okay, so here we can see that the new build is up and running, and that it's just completed. We'll go in, and again we can see all the details. We can see that the Commit ID here matches the Commit ID of the last change we made in the GitHub project repository.
Okay, let's return to Blue Ocean, and quickly summarize what we just completed. So, we installed the Blue Ocean plugin within Jenkins. We then opened up the Blue Ocean interface, and we created a new pipeline. We created a personal GitHub access token, and gave it the required permissions to allow Blue Ocean to list out the organizations and the repos within each organization. We then pointed it at our project on GitHub, and Blue Ocean ran our pipeline automatically for us. We had to re-execute it for it to complete successfully. We then made a change to our Jenkinsfile to create another commit. And in the background, Jenkins was clever enough to go and poll the GitHub repository, discover that a new commit had been performed, and then re-execute our build pipeline for us.
About the Author
Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.
Jeremy holds professional certifications for both the AWS and GCP cloud platforms.