This course introduces you to Jenkins, a popular open-source tool used to perform Continuous Integration and Continuous Delivery.
We spend time early on reviewing the key Jenkins features and associated terminology. We then take a deep dive into configuring Jenkins to perform automated builds, using the Jenkins web administration console in hands-on demonstrations to ensure you become familiar with Jenkins and how to administer it. We'll demonstrate features such as:
- Creating and configuring pipelines using a Jenkinsfile
- Configuring Jenkins pipelines using the Blue Ocean interface
- Defining build execution environments using Docker containers
- Setting up and scaling out Jenkins with multiple build agents and executors using SSH
- Setting up build pipelines for Java-based projects using Gradle
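To give a flavour of the features listed above, a minimal declarative Jenkinsfile for a Gradle project might look like the following (a sketch only; the stage names, Gradle tasks, and result paths are illustrative and not taken from the course materials):

```groovy
// Minimal declarative pipeline, committed to the repo root as "Jenkinsfile"
pipeline {
    agent any                          // run on any available build agent
    stages {
        stage('Build') {
            steps {
                sh './gradlew assemble'  // assumes a Gradle wrapper in the repo
            }
        }
        stage('Test') {
            steps {
                sh './gradlew test'
            }
        }
    }
    post {
        always {
            // publish JUnit test results; path assumes Gradle's default layout
            junit 'build/test-results/test/*.xml'
        }
    }
}
```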
What you'll learn:
- How to scale out Jenkins with master and build agent setups connected over SSH
- The benefits of codifying pipeline build instructions using a Jenkinsfile
- How to leverage Docker containers within a Jenkins pipeline to provide additional build isolation and flexibility
- How to install and use the newer, pipeline-centric Blue Ocean user interface
- How to integrate and leverage third-party build tools such as Gradle, Maven, Yarn, and Webpack, among many others, within a Jenkins pipeline
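As a sketch of the Docker isolation point above, pinning a build to a containerised execution environment is a small change in a declarative Jenkinsfile (the image tag and Gradle task here are illustrative assumptions):

```groovy
pipeline {
    agent {
        // every stage runs inside this container, isolating the build toolchain
        docker { image 'gradle:7-jdk11' }
    }
    stages {
        stage('Build') {
            steps {
                sh 'gradle build --no-daemon'
            }
        }
    }
}
```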
This training course provides many hands-on demonstrations in which you will observe first-hand how to use Jenkins to build and release different types of software projects, for example:
- Building a back-end application developed using Java, Gradle, and Docker, requiring Jenkins to compile the source code, package it into a web archive (WAR) file, and then release it into a Tomcat-based Docker image complete with Splunk-based instrumentation for logging and monitoring
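A pipeline of the shape described above might be sketched as follows (image names, tags, and file paths are hypothetical placeholders, not the course's actual configuration):

```groovy
pipeline {
    agent any
    stages {
        stage('Compile') {
            steps {
                sh './gradlew war'   // produces the WAR under build/libs/
            }
        }
        stage('Release') {
            steps {
                // bake the WAR into a Tomcat base image via the repo's Dockerfile;
                // the "myorg/webapp" tag is illustrative
                sh "docker build -t myorg/webapp:${BUILD_NUMBER} ."
            }
        }
    }
}
```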
Prerequisites:
- A basic understanding of CI/CD (Continuous Integration and Continuous Delivery)
- A basic understanding of software development and the software development life cycle
- A basic understanding of version control and associated workflows
Intended audience:
- Software Build and Release Engineers
- Software Developers
- DevOps Practitioners
The following GitHub repo contains sample Jenkins configurations used within the provided demonstrations:
The following supporting Jenkins documentation is available online:
- [Instructor] Welcome back! In this lecture, we'll introduce you to the Jenkins Blue Ocean interface.
Blue Ocean is a modern retake on the Jenkins user experience, promoting the concept of pipelines by putting them front and center as first-class citizens when it comes to administration. Okay, let's begin. The best way to describe what Blue Ocean brings to the table is to jump straight in, look at some examples of the Blue Ocean user interface, and describe the improvements over the existing classic Jenkins user interface. For starters, Blue Ocean lifts the game by providing a clean, modern, streamlined UI. The user interface is intuitive and provides visual features, such as a wizard-driven pipeline editor, as is seen here.
The pipeline editor provides you with the ability to configure your pipelines without having to write the pipeline script itself. However, you can still do so, as the pipeline, when saved, simply becomes a standard Jenkinsfile stored back within your repository. This is actually quite useful, as it allows you to revisit the history of your pipeline, even allowing you to roll back and restore earlier versions. In the example shown here, we have configured three main stages: clone, build, and publish. In addition to these, two parallel stages are configured to run alongside the build stage. When a pipeline completes build execution, the results are presented within the pipeline itself so you can quickly visualize where the build has been successful and where it hasn't. The visuals are color-coded using, as you would expect, green for success, yellow for test failures, and red for build failures. Clicking on any individual stage will open up a console view, which enables you to review the commands performed.
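The stage layout just described — clone, build, and publish, with two extra stages running in parallel with the build — corresponds to a declarative Jenkinsfile along these lines (the branch names and shell commands are illustrative assumptions, not the demo's exact script):

```groovy
pipeline {
    agent any
    stages {
        stage('Clone') {
            steps {
                checkout scm            // clone the repo the pipeline came from
            }
        }
        // three branches run concurrently: the build plus two parallel stages
        stage('Build') {
            parallel {
                stage('Build') {
                    steps { sh './gradlew assemble' }
                }
                stage('Lint') {
                    steps { sh './gradlew check' }
                }
                stage('Unit Tests') {
                    steps { sh './gradlew test' }
                }
            }
        }
        stage('Publish') {
            steps {
                archiveArtifacts artifacts: 'build/libs/*.war'
            }
        }
    }
}
```

When this Jenkinsfile is saved from the Blue Ocean editor, the parallel branches render side by side in the pipeline visualization.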
At build time, this view is auto-updated as if the build log itself is being tailed. The artifacts view provides you with direct access to the captured artifacts that were created as a result of the pipeline execution. Each artifact can be downloaded individually or all at once. Additionally, the complete and full raw pipeline log is available for download for external reviewing purposes. Tests performed within the pipeline are captured and presented within the test section in a dashboard style. In this example, the test screen has shades of yellow to indicate that not all tests have completed successfully. In this case, 414 junit tests passed successfully with a single failure that needs to be addressed. Blue Ocean can be installed side-by-side with the existing or classic version of the Jenkins Administration console. In fact, if you do decide to install Blue Ocean, you will still need to use the classic version for certain operations and/or configuration tasks for the time being. Maybe in time, a full migration to the new interface will happen.
Installing Blue Ocean is easy. Simply navigate to the Manage Plugins section within the classic interface, and then search for the Blue Ocean plugin under the Available tab. Select the Blue Ocean plugin and then complete the install by clicking the Install without restart button at the bottom of the view. The installation will then begin and proceed. Installation time varies depending on the network connection used and is typically no more than a few minutes. Okay, that completes this quickfire lecture on the Jenkins Blue Ocean user interface. As you can see, the future direction that Jenkins might take is impressive, to say the least.
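As an aside, if you run Jenkins from the official Docker image, the same plugin can instead be baked in at image build time with the bundled `jenkins-plugin-cli` tool — a minimal sketch, assuming a custom image built on the LTS base:

```dockerfile
# Custom Jenkins image with Blue Ocean preinstalled
FROM jenkins/jenkins:lts
RUN jenkins-plugin-cli --plugins blueocean
```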
Go ahead and close this lecture, and we will see you shortly in the next one.
Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.
Jeremy holds professional certifications for both the AWS and GCP cloud platforms.