DevOps Adoption Playbook - Part 1
In this course, we introduce you to the DevOps Playbook Part 1.
The DevOps Playbook Part 1 course begins with Book 1, a brief introduction to DevOps and how in recent times it has become the de facto approach to developing and operating applications. We then move through Books 2 to 7, covering CALMS, Collaboration, Automation, Version Control, Continuous Integration, and Continuous Testing. Each book documents a required DevOps competency, one you'll need to adopt and build skills in to be effective in DevOps.
- Book 1 - DevOps
- Book 2 - CALMS
- Book 3 - Collaboration
- Book 4 - Automation
- Book 5 - Version Control
- Book 6 - Continuous Integration
- Book 7 - Continuous Testing
The DevOps Playbook Part 1 course includes two demonstrations where we put some of the DevOps theory presented into practice.
- Atlassian Bitbucket and Slack Integration
- Atlassian Bitbucket Pipelines and Docker
Note: the source code used within these demonstrations is available at: https://github.com/cloudacademy/devops/tree/master/DevOps-Adoption-Playbook
- [Instructor] Okay, in this demonstration, we're going to focus on Bitbucket Pipelines and how they can be used to build out build and deployment pipelines for us. So we'll click on Pipelines, and that will take us into the Pipelines feature. Here there are a number of templates that can be used to build out the pipeline itself, but we'll swap over into the terminal, where we've already pre-created a pipeline file. So let's copy it into our current repository. I'm in the demo cube360 repository, and I'm going to list the demo files directory. Here we can see we've got a Dockerfile and a bitbucket-pipelines.yml file. So let's copy those back into our Git repository: we do a copy, recursive, star, from the demo files directory into the current directory.
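Those copy steps look roughly like this; the demo-files directory name is inferred from the spoken transcript rather than shown on screen:

```sh
ls demo-files/            # shows Dockerfile and bitbucket-pipelines.yml
cp -r demo-files/* .      # copy them into the root of the Git repository
```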
Okay, let's add an extra flag and rerun the command with -f. Okay, sorry, I should've read the message: we've just got identical files, meaning they don't need to be copied, so nothing to worry about there. Let's now do a directory listing of our cube360 directory, and it looks good. We'll do a couple of extra refactorings. First, we'll move the index.html file into the www-data directory, so we do that by moving index.html into the directory. And we'll run the tree command in the current directory to see the structure. We do git status to see the current status of our files: we've got a modified file, a deleted file, and a few untracked files, so let's add them all. And we'll do git status again. We can see that we've still got a deleted file, so what we actually need to do is a git rm, for remove, on index.html, and we'll do git status again.
And we're all good. So really what we've done is moved that index.html file. We'll now do a git commit, setting the message to "updated with docker code", and then finally, we do a git push. This will update the files in our Bitbucket repository. Let's jump back into Bitbucket, where we're still on the Pipelines menu item. So if we do a reload, we should now see the Bitbucket pipeline configuration that we just uploaded. If we scroll down, we get a nice view of that particular bitbucket-pipelines.yml file. Finally, we need to click the enable button to actually enable Pipelines for this repository. So let's do that. Okay, so our pipeline's all ready, and again, if we refresh the Pipelines view, we can see that our pipeline has actually kicked off. What this is going to do is run a build, and in our case, we've instructed it to build a docker image.
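Gathered together, the terminal sequence just described looks something like this (a sketch; the git rm step is only needed where git add . leaves the deletion unstaged, as it did in the demo):

```sh
mv index.html www-data/                     # relocate the page under www-data
tree .                                      # inspect the resulting structure
git status                                  # modified, deleted, and untracked files
git add .
git rm index.html                           # stage the deletion if it's still unstaged
git status
git commit -m "updated with docker code"
git push                                    # updates the files in our Bitbucket repository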
So in the root of our repository, we uploaded a Dockerfile, and the instructions for our docker build are contained within that file. The pipeline has been instructed to do a docker build. So the first thing it's going to do is download the layers, or the docker images, that it needs: the base image and then the extra layers. Then it's going to go through the build process for the container as per the instructions in the Dockerfile. This will take about 10 to 20 seconds. Okay, so it's failed here, because it's attempted to do a login to Docker Hub. Docker Hub is going to be used to register our built image. So what we need to do is jump into settings and configure a couple of environment variables. In this case, the variables will be the credentials needed to authenticate into Docker Hub.
So let's set those up. The first one is DOCKER_HUB_USERNAME; we set the value for it in the edit. The second one is the password, DOCKER_HUB_PASSWORD; again, we establish the value and edit. Okay, so we've set up our environment variables, which will be available to our build. Back within Pipelines, we'll select our pipeline and click rerun to run the pipeline again on the current source code. Straight away, this will kick in and rerun all of the docker build instructions. Under the hood, Bitbucket Pipelines itself uses docker containers, so we're using a docker container to build a docker image that will then be registered into Docker Hub. You can see that the second time round, it's a bit quicker, because there's a caching layer that Pipelines uses. So if all goes well, the docker build will complete, which it just has; we've done the login, and now we're pushing the image into Docker Hub, so that's the registration.
And the complete pipeline has succeeded, which is a great result, as per the success message. Okay, so let's jump over to Docker Hub. Docker Hub is a registry run by Docker, and in it will be the new image that we've just built in our pipeline and uploaded into Docker Hub. If we click on tags, we'll see that we've got a number of tags, the most recent one being on top. Now if you take a closer look at the tag name, it's actually the commit ID as used within our Bitbucket repository; this is the commit ID for the last push of source code that we put into our repository. Back in Bitbucket Pipelines, we can see that commit ID in shortened form, but if we look at the top in the URL, we can see the full commit ID, and they're identical. The reason is that in our pipelines file, we leverage the BITBUCKET_COMMIT variable, which is auto-populated by Bitbucket Pipelines. That's a great feature that you can leverage for various things when you're using Pipelines. So if we step through this file in more detail, we can see that the first line is the docker image that is actually used for the build itself. Then under services, we establish that we're using docker. Next, we specify the docker image name that we're going to use.
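For reference, a hedged sketch of what a bitbucket-pipelines.yml along these lines could contain; the cube360 repository name, the atlassian/default-image build image, and the use of DOCKER_HUB_USERNAME as the Docker Hub namespace are assumptions, not necessarily the exact file used in the demo:

```sh
# Hypothetical sketch of the pipeline file described above, written out as a heredoc.
cat > bitbucket-pipelines.yml <<'EOF'
image: atlassian/default-image:2          # docker image used for the build itself

pipelines:
  default:
    - step:
        services:
          - docker                        # enable the docker service for this step
        script:
          # Tag the image with the commit ID via the auto-populated BITBUCKET_COMMIT.
          - export IMAGE="$DOCKER_HUB_USERNAME/cube360:$BITBUCKET_COMMIT"
          - docker build -t "$IMAGE" .
          - echo "$DOCKER_HUB_PASSWORD" | docker login --username "$DOCKER_HUB_USERNAME" --password-stdin
          - docker push "$IMAGE"
EOF
```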
Next, we specify the docker build, which references the image name, and it is instructed to look for the Dockerfile in the root directory, which is this file here. So within our Dockerfile, we're going to build from the ubuntu 14.04 base image, and we're going to install nginx, which is our web server. We're going to add the nginx.conf file, again from within the root directory of our project, which will instruct nginx how to work. Then we add the www-data directory and expose port 80. What we'll do now is start up the docker daemon on our local host. In this case, I'm using a Mac, so I've got the Mac docker application installed, and that's starting up. While that's starting up, we'll jump back into Docker Hub, copy the repository name, and then swap over into the terminal. First thing we'll do is run a docker ps to see any running processes that may be on the machine. Since we've just started it up, it should be empty, which it is. Then we'll run docker pull with the name of the repository and the tag, which was our last commit ID. What will happen here is that our local environment will pull down the docker image that our pipeline just built and registered into Docker Hub. So that's now downloading to our local workstation.
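For reference, a minimal sketch of the Dockerfile described above; the nginx.conf destination and the web-root path are assumptions based on Ubuntu's default nginx layout:

```sh
# Hypothetical sketch of the Dockerfile walked through above, written out as a heredoc.
cat > Dockerfile <<'EOF'
FROM ubuntu:14.04
RUN apt-get update && apt-get install -y nginx
ADD nginx.conf /etc/nginx/nginx.conf
ADD www-data /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]
EOF
```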
Okay, so that's completed. The next thing we'll do is launch a container based on this docker image. So we run the command docker run --name and give it an arbitrary name, in this case nginx-webserver. We specify the -d flag, and -p to bind port 8080 on the host to port 80 of the container itself, and then finally, we specify the image and tag. Okay, so that's now up and running. If we run docker ps, we can see that we've got a new docker container running, and we can see that we're forwarding port 8080 on the local workstation to port 80 of the docker container. So now we can actually test it. We'll use curl against localhost port 8080, and there we go: we've got our nginx webserver responding to our request, and it's responding with the contents of index.html. We'll rerun the curl command, this time with the -I argument, to look at the http headers. We can see that we've got an http 200 and that it's coming from nginx, being the web server running within our container.
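Taken together, the local pull-run-test steps look roughly like this; <dockerhub-user>/cube360 and <commit-id> are placeholders for the repository name and tag shown in Docker Hub:

```sh
docker pull <dockerhub-user>/cube360:<commit-id>    # image built and pushed by the pipeline
docker run --name nginx-webserver -d -p 8080:80 <dockerhub-user>/cube360:<commit-id>
docker ps                                           # confirm the container and port mapping
curl http://localhost:8080                          # returns the contents of index.html
curl -I http://localhost:8080                       # HTTP/1.1 200 OK, Server: nginx/...
```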
Let's jump into our browser and look at settings again; this time we'll go to our chat notifications and update them such that our Slack channel is notified of any successful or failed pipelines. We search for "pipe" and select pipeline failed and pipeline succeeded. Now, if our pipeline executes on the master branch and it fails or succeeds, we'll get those notifications in our DevOps engineering Slack channel. So let's jump back into our code. We'll go to Visual Studio Code and jump into our index.html file, and we'll add an extra paragraph or message to the bottom. In this case, we'll add the statement, "Now we have automatic builds." We'll save this, jump into our terminal, and run git status. We can see we have one modified file. We'll add that and then commit it, this time with the message, "Automatic pipeline builds!!" Enter.
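Roughly, those terminal steps are as follows; the www-data/index.html path assumes the move we made earlier:

```sh
git status                                   # one modified file: www-data/index.html
git add www-data/index.html
git commit -m "Automatic pipeline builds!!"
```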
So we push it to the repository and jump over to our Pipelines view within Bitbucket. Here we can see that our pipeline has automatically started. Selecting this particular pipeline run, we can watch the build as it runs through. Again, it's going to do a docker build, but this time on the latest source code. If we jump over into our DevOps engineering Slack channel, we'll wait for the notification to come through that the pipeline has started, and there we go. We've just got a notification to say that the pipeline has been kicked off and has actually passed. So we go back to our Pipelines view within Bitbucket and indeed, we can see that it was successful. What we've just seen is a lesson in pipelines being used to provide us with automatic builds, which is an extremely productive method for doing DevOps.
Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, Azure, GCP), Security, Kubernetes, and Machine Learning.
Jeremy holds professional certifications for AWS, Azure, GCP, Terraform, and Kubernetes (CKA, CKAD, and CKS).