Demo 7: Pipelines for Automated Build and Deployment
Difficulty: Beginner
Duration: 2h 8m
Students: 1144
Ratings: 4.7/5
Description

Take this beginner-level course on 'Atlassian BitBucket' to delve into its core principles and the practical benefits it brings to your software projects. Made up of 8 lectures and 14 demos, the course teaches you how to perform Git-related tasks and how to implement and connect BitBucket to third-party systems, while staying aware of the various security options available.

This course will appeal to a range of job roles, including software developers, build and release engineers, and DevOps practitioners. The skills you gain from this course will pay a tangible dividend for the projects within your enterprise, allowing you to use, control, and manage BitBucket to maintain your software products.

Learning Objectives 

  • Understand the basic principles of version control as implemented using the Git protocol
  • Learn how to effectively use BitBucket to manage and maintain your software projects
  • Assess the benefits of using BitBucket to manage and secure your software engineering assets
  • Recognize and explain how to perform all basic Git-related tasks such as creating and cloning a repository, branching, and merging
  • Learn how to implement and connect BitBucket with other third-party systems
  • Be aware of the different security options available to secure your BitBucket setup
  • Be able to use, control, and manage BitBucket through the web-based administration console and/or a Git client

Intended Audience

  • Software Developers 
  • Software Build and Release Engineers
  • DevOps Practitioners

Prerequisites

To get the most out of this course, we recommend having a basic understanding of:

  • Software development and the software development life cycle
  • Software development tools
  • Version control and associated workflows

Related Training Content

After completing this course, we recommend taking the 'Introduction to Continuous Integration' course.

To discover more content like this, you will find all of our training in the Cloud Academy Content Training Library.

Transcript

Okay, welcome back. In this demonstration, we're going to show you BitBucket Pipelines. Now, BitBucket Pipelines is an automated CI/CD tool that you can use to automate your builds and deployments from BitBucket itself. Let's get started. We'll reuse our CADemo3 repository. For our demonstration, we're simply going to use this GitHub repo, which contains some source code for a front-end. It's based on Bootstrap and provides some really neat visuals. Now, what we're going to do is take a copy of all of the source code, host it within our CADemo3 repository, and then use BitBucket Pipelines to run some Gulp tasks that are implemented within this file, so let's get started. We'll jump over into our terminal. We're in the CADemo3 repository and we'll list its contents. We'll remove the index.html file, so that's now been staged for removal, and then in the background I'll copy in all of the source code from this GitHub repository. Okay, so I've done that in the background. Next, if we examine the package.json file, you can see that there are some developer dependencies. Now, just for demonstration purposes, we'll do this locally before we get into Pipelines. Here, I'll run npm install. This will set up all of the developer dependencies locally, and when we go to set this up in Pipelines, we'll also run Sass as one of the commands for one of the build steps.
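For those following along, the local setup narrated above boils down to something like this; a minimal sketch, assuming the GitHub source has already been copied into the working directory:

```bash
# Stage the old index.html for removal, then install the developer
# dependencies declared in package.json.
git rm index.html
npm install
```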

Okay, so that's completed successfully. If we list our directories again, we've now got our node_modules. Now, as already mentioned, this project uses Gulp to run some compilation tasks. In particular, if we take a look inside this file, you can see that it uses Sass as a preprocessor for CSS. For those unfamiliar with Sass, it's a preprocessor that gives you a lot of extra development capabilities when it comes to creating your CSS files. When we build our Pipeline setup, we're going to run this particular task as well as a minification-type task. You'll also notice the default task implemented within this file basically spins up the project and allows you to see it within your local browser, and any changes that happen within the file system during development will be pushed straight into the browser as per the browser-sync task. So I'll quickly run this locally. We do so by running node_modules/gulp/bin/gulp.js. And this will jump us into our browser, and here we can see that all the static assets have been precompiled, run through the Sass preprocessor, and minified, and we get a view of the dashboard, which is really cool. So one thing I haven't mentioned: when we set up our Pipeline, we're going to need some way to use it after we've compiled it all, which will be the deployment aspect of our BitBucket Pipelines setup. What we'll do is use AWS S3 as a hosting service, so we'll create the Pipeline, compile all the assets, and then push the outputs into S3. I've already created three S3 buckets. I've created a test.democloudinc.com bucket, which is empty. Likewise, I've also created a prod and a staging bucket, which are also empty. So the next thing we'll do is jump into Visual Studio Code again.
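For reference, the local Gulp run described above is just:

```bash
# Invoke the locally installed Gulp; the default task compiles the Sass,
# minifies the assets, and serves the site via browser-sync.
node node_modules/gulp/bin/gulp.js
```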

To enable our project for BitBucket Pipeline builds, we need to create a new file at the root of the project, and by convention the file needs to be named bitbucket-pipelines.yml. This is going to contain our declarative build and deployment steps. Now, I've precrafted these to keep the demonstration moving along quickly, so I'll paste them in here. Let's document some of these steps. The first line item here specifies the Docker image that we're going to use for the full build process; these are Docker images that Atlassian BitBucket provides. Then under pipelines, we specify the branches that this build will operate on. Here we're just using the default, which will be the master branch. Then under this, we set up a number of build steps. We've got the first build step here, and in this case the build step has a name: Gulp Minification. What we're really doing here is installing Sass, which is our preprocessor for our CSS files. We're going to run npm install. We're going to ensure that the build directory doesn't exist, because after we actually run the Gulp task to do our Sass compilation and minification, the outputs are going to get put into a new build directory. We then run a quick md5sum across all those files and run tree build just to see what the contents of the build directory look like, and then we store the build directory as an artifact so that it can be used in the next step. Now, for the next three steps, we have one per environment: this step here is for our test environment, this step here is for our staging environment, and finally we have one for our production environment. Each of these three steps does the same thing: it simply takes the build outputs from our first step and does an AWS S3 sync into the S3 bucket that I showed you earlier.
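A minimal sketch of a bitbucket-pipelines.yml along these lines is shown below. The Node image tag, the Gulp task name, and the manual production trigger are assumptions reconstructed from the narration, not the exact file used in the demo:

```yaml
# Sketch of a Bitbucket Pipelines build-and-deploy configuration.
image: node:10.15.0                      # assumed Atlassian-provided Node image

pipelines:
  default:                               # runs on the default (master) branch
    - step:
        name: Gulp Minification
        script:
          - npm install -g sass          # install the Sass preprocessor
          - npm install                  # install dev dependencies from package.json
          - rm -rf build                 # ensure the build directory doesn't exist
          - node node_modules/gulp/bin/gulp.js sass   # assumed task name
          - find build -type f -exec md5sum {} +      # quick checksum of the outputs
          - apt-get update && apt-get install -y tree && tree build
        artifacts:
          - build/**                     # hand the build outputs to later steps
    - step:
        name: Deploy to Test
        deployment: test
        image: atlassian/pipelines-awscli
        script:
          - aws s3 sync build s3://test.democloudinc.com
    - step:
        name: Deploy to Staging
        deployment: staging
        image: atlassian/pipelines-awscli
        script:
          - aws s3 sync build s3://staging.democloudinc.com
    - step:
        name: Deploy to Production
        deployment: production
        trigger: manual                  # production waits for a manual promote
        image: atlassian/pipelines-awscli
        script:
          - aws s3 sync build s3://prod.democloudinc.com
```

The AWS CLI picks up the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY repository variables that we set up shortly; depending on your setup, you may also need an AWS_DEFAULT_REGION variable.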

For test, we're syncing into the test.democloudinc.com bucket. Likewise for staging, staging goes into the staging.democloudinc.com bucket, and finally the prod setup goes into the prod.democloudinc.com bucket. Next, we'll update our .gitignore file, because we don't want to check in our build directory. We'll save that. We'll jump back into the terminal and do a git status. Then we'll do a git add . because we want to stage all of these untracked files. We'll commit the staged files with the message build1, and then finally we'll push this back up into our BitBucket repository. Okay, excellent, that's completed, so we're all up to date. We'll now jump back into our repository and do a refresh, and here we can see all of our assets for this project, and the readme actually gets rendered because we had a README.md file, which is a really nice feature. Under settings, we now need to set up our AWS credentials to give Pipelines permission to deploy and sync into our S3 buckets. Under Pipelines settings, now that we've specified a bitbucket-pipelines.yml file, we firstly need to enable Pipelines. Next, we'll go to environment variables, where we need to specify our AWS credentials to allow us to write into our S3 buckets, so we'll do that here: the access key and the secret key. Okay, back within Visual Studio Code, now that we've got our Pipelines all configured, let's make a simple change to trigger them. We'll just add a note to the readme, save, do a git status, git add the readme, commit with the message trigger pipelines, and push this back into the repository, which should trigger our Pipelines.
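The Git sequence narrated above is roughly the following; the commit messages match the transcript:

```bash
echo "build/" >> .gitignore   # keep the build outputs out of version control
git status                    # review the untracked files
git add .                     # stage everything, including bitbucket-pipelines.yml
git commit -m "build1"
git push                      # push to BitBucket

# Later, a trivial readme change to trigger the pipeline:
git add README.md
git commit -m "trigger pipelines"
git push
```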

If all goes well, Pipelines should kick in, and it does: you can see here that Pipelines has just been triggered. If we navigate into it, we'll see that our first step is building, which is the Gulp Minification, and this will run through a number of steps. I'll speed it up here, and we'll see what happens once the step completes. Excellent, so the first build step has completed, and now we're actually deploying the static assets into our S3 bucket. When I ran the original build, I noticed the output was missing a couple of files at serve time, so I've made a couple of small adjustments to the .gitignore file just to ensure that all files come across; I've pushed the update, and it's kicked off another build, so we'll navigate into this build. Again, you can see that the first build step, which does the CSS preprocessing, has completed, and now we're into the deployment into our test S3 bucket, so we'll speed this up and have a look at what we get at the other end. Okay, so the upload into our test S3 bucket has completed. We can jump over into S3, navigate into our test bucket, and do a reload, and all of our static HTML, CSS, Javascript, and third-party libraries have been uploaded successfully. The other thing this build sequence has done is configure the S3 bucket for static website hosting.
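Syncing the assets and enabling static website hosting with the AWS CLI might look like this for the test environment; the same pattern applies to staging and prod:

```bash
# Upload the compiled assets and turn on S3 static website hosting.
aws s3 sync build s3://test.democloudinc.com
aws s3 website s3://test.democloudinc.com --index-document index.html
```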

If we navigate to and click on the endpoint URL, we should see our static website, which we do, so that's a great result. In summary, this part of the build process has completed the CSS preprocessing and then uploaded all of the HTML, Javascript, and CSS files into the test.democloudinc.com S3 bucket, and this represents our test environment. If we go back to the build sequence, it's still chugging along, doing the staging environment now. So again, we'll speed this up. Okay, so the staging-to-S3 step has completed. At this stage, everything pauses, because we've set up the deployment into production to be a manual step, so it's waiting for us to do a deploy. Before we do that, let's jump back into S3 and look at our buckets; in particular, we'll look at the staging bucket and do a refresh. Again, we should have the same static files, which we do. Under bucket properties, we can also see that it's been set up for static website hosting, and again, if we navigate to the endpoint, everything has worked. We've got two environments completed: test and staging.
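A quick way to smoke-test an environment from the terminal is to hit the bucket's website endpoint; the us-east-1 region in the URL below is an assumption about the demo setup:

```bash
# S3 website endpoints follow the pattern
# http://<bucket>.s3-website-<region>.amazonaws.com
curl -I http://staging.democloudinc.com.s3-website-us-east-1.amazonaws.com
```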

Let's go back to BitBucket, and the final thing we'll do is navigate to Deployments. In here, we'll see the deployments view of everything we've done: we've got our three environments, test, staging, and production. Now, the last thing we can do is click the Promote button to promote staging into production. We'll click Deploy. We're jumping back into our existing pipeline, which was number four, and we're simply completing the production step, which is to take all of our preprocessed Sass files and all of the HTML and Javascript files, sync them into our S3 production bucket, and enable the bucket for static website hosting, and then that completes everything. Again, we'll speed this up and demonstrate what we get at the other end. Okay, so everything has completed in terms of the full pipeline; we've completed all four steps. Now, if we jump back into S3 and look at our buckets, the production bucket should have been updated. We'll refresh here, and again we get all our static files. We jump into properties, static website hosting, and click on the endpoint URL. Here we can see that production is all up and running. Excellent. I hope you've enjoyed this demo showing BitBucket Pipelines and how it can be used to set up automated build and deployment processes. Okay, go ahead and close this demonstration, and we'll see you shortly in the next one.

About the Author
Students: 142970
Labs: 69
Courses: 109
Learning Paths: 209

Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy, where he specializes in developing DevOps technical training documentation.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, Azure, GCP), Security, Kubernetes, and Machine Learning.

Jeremy holds professional certifications for AWS, Azure, GCP, Terraform, Kubernetes (CKA, CKAD, CKS).