Take this beginner-level course on 'Atlassian BitBucket' to delve into its core principles and the practical benefits it brings to your software projects. Made up of 8 lectures and 14 demos, this course teaches you how to perform Git-related tasks, how to implement and connect BitBucket to third-party systems, and how to stay aware of the various security options available.
This course will appeal to a range of job roles, including software developers, build and release engineers, and DevOps practitioners. The skills you gain from this course will pay a tangible dividend for projects within your enterprise, allowing you to use, control, and manage BitBucket to maintain your software products.
- Understand the basic principles of version control as implemented using the Git protocol
- Learn how to effectively use BitBucket to manage and maintain your software projects
- Assess the benefits of using BitBucket to manage and secure your software engineering assets
- Recognize and explain how to perform all basic Git-related tasks such as creating and cloning a repository, branching, and merging
- Learn how to implement and connect BitBucket with other third-party systems
- Be aware of the different security options available to secure your BitBucket setup
- Be able to use, control, and manage BitBucket through the web-based administration console and/or a Git client
- Software Developers
- Software Build and Release Engineers
- DevOps Practitioners
To be able to get the most out of this course we recommend having a basic understanding of:
- Software development and the software development life cycle
- Software development tools
- Version control and associated workflows
Related Training Content
After completing this course, we recommend taking the 'Introduction to Continuous Integration' course
To discover more content like this, you will find all of our training in the Cloud Academy Content Training Library.
Okay, welcome back. In this demonstration, we're going to show you BitBucket Pipelines. BitBucket Pipelines is an automated CI/CD tool that you can use to automate your builds and deployments from BitBucket itself. Let's get started. We'll reuse our CADemo3 repository. For our demonstration, we're simply going to use this GitHub repo, which contains some source code for a front-end. It's based on Bootstrap and provides some really neat visuals. What we're going to do is take a copy of all of the source code, host it within our CADemo3 repository, and then use BitBucket Pipelines to run some Gulp tasks that are implemented within this file, so let's get started. We'll jump over into our terminal. We're in the CADemo3 repository and we'll list its contents. We'll remove the index.html file, so that's now been staged for removal, and then in the background I'll copy in all of the source code from this GitHub repository. Okay, so I've done that in the background. Next, if we examine the package.json file, you can see that there are some developer dependencies. Just for demonstration purposes, we'll do this locally before we get into Pipelines. Here, I'll run npm install. This will set up all of the developer dependencies locally, and when we go to set this up in Pipelines, we'll also run Sass as one of the commands for one of the build steps.
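The developer dependencies referred to might look something like the following package.json fragment. Note that the package names and versions here are illustrative assumptions; the actual project's package.json defines the real list:

```json
{
  "devDependencies": {
    "gulp": "^4.0.2",
    "gulp-sass": "^5.1.0",
    "sass": "^1.50.0",
    "browser-sync": "^2.27.0"
  }
}
```

Running `npm install` in the project root reads this file and downloads each dependency into the local node_modules directory.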
Okay, so that's completed successfully. If we list our directories again, we've now got our node_modules. Now, as already mentioned, this project uses Gulp to run some compilation tasks. In particular, if we take a look inside this file, you can see that it uses Sass as a preprocessor for CSS. For those unfamiliar with Sass, it's a preprocessor that gives you a lot of extra development capabilities when it comes to creating your CSS files. When we build our Pipeline setup, we're going to run this particular task as well as a minification-type task. You'll also notice that the default task implemented within this file basically spins up the project and allows you to see it within your local browser, and any changes that happen within the file system during development will be pushed straight into the browser as per the browser-sync task. So I'll quickly run this locally. We do so by navigating to node_modules/gulp/bin/gulp.js and running it. This will jump us into our browser, and here we can see all the static assets have been precompiled, run through the Sass preprocessor, and minified, and we get a view of the dashboard, which is really cool. So one thing I haven't mentioned: when we set up our Pipeline, we're going to need some way to use it after we've compiled it all, which will be the deployment aspect of our BitBucket Pipelines setup. We'll use AWS S3 as a hosting service, so we'll create the Pipeline, compile all the assets, and then push the outputs into S3. I've already created three S3 buckets: a test.democloudinc.com bucket, which is empty, and likewise a prod and a staging bucket, which are also empty. So the next thing we'll do is jump into Visual Studio Code again.
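To illustrate what Sass adds over plain CSS, here is a small sketch of Sass input using variables and nesting, two of the capabilities mentioned above (the selector and color are made-up examples, not taken from the project's stylesheets):

```scss
// Variables let you define a value once and reuse it everywhere.
$primary: #336699;

// Nesting keeps related rules together; the compiler flattens
// `.navbar a` into a standard descendant selector in the output CSS.
.navbar {
  background: $primary;
  a { color: white; }
}
```

The Sass preprocessor compiles input like this into ordinary CSS that every browser understands, which is exactly the task the Pipeline will run.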
To enable our project for BitBucket Pipeline builds, we need to create a new file at the root of the project, and by convention the file needs to be named bitbucket-pipelines.yml. This is going to contain our declarative build and deployment steps. Now, I've precrafted these to keep the demonstration moving along quickly, so I'll paste them in here. Let's document some of these steps. The first line here specifies the Docker image that we're going to use for the full build process; these are Docker images that Atlassian BitBucket provides. Then under pipelines, we specify the branches that this build will operate on. Here we're just using the default, which will be the master branch. Under this, we set up a number of build steps. The first build step here has a name, Gulp Minification. What we're really doing here is installing Sass, which is our preprocessor for our CSS files. We're going to run npm install. We're going to ensure that the build directory doesn't exist, because after we run the Gulp task to do our Sass compilation and minification, the outputs are going to be put into a new build directory. We then run a quick md5sum across all those files, run tree on the build directory just to see what its contents look like, and then store the build directory as an artifact so that it can be used in the next step. The next three steps are one per environment: this step here is for our test environment, this one is for our staging environment, and finally we have one for our production environment. Each of these three steps is doing the same thing: simply taking the build outputs from our first step and doing an AWS S3 sync into the S3 bucket that I showed you earlier.
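A minimal sketch of what such a bitbucket-pipelines.yml might look like is shown below. The image name, Gulp task names, and step layout are assumptions reconstructed from the narration, not the exact file used in the demo:

```yaml
image: node:16                 # Docker image for every step (assumed)

pipelines:
  default:                     # runs on pushes to the default (master) branch
    - step:
        name: Gulp Minification
        script:
          - npm install                      # install devDependencies
          - rm -rf build                     # ensure a clean output directory
          - npx gulp sass                    # compile Sass to CSS (task name assumed)
          - npx gulp minify-css              # minify the compiled CSS (task name assumed)
          - md5sum build/*                   # fingerprint the outputs
          - tree build                       # inspect the build directory
        artifacts:
          - build/**                         # pass the build outputs to later steps
    - step:
        name: Deploy to Test
        script:
          - aws s3 sync build s3://test.democloudinc.com
    - step:
        name: Deploy to Staging
        script:
          - aws s3 sync build s3://staging.democloudinc.com
    - step:
        name: Deploy to Production
        script:
          - aws s3 sync build s3://prod.democloudinc.com
```

The deploy steps rely on AWS credentials being available as Pipelines environment variables, which is configured in the repository settings as described below.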
For test, we're syncing into the test.democloudinc.com bucket. Likewise, staging goes into the staging.democloudinc.com bucket, and finally the prod setup goes into the prod.democloudinc.com bucket. Next, we'll update our gitignore, because we don't want to check in our build directory. We'll save that. We'll jump back into the terminal and do a git status. Then we'll do a git add . because we want to stage all of these untracked files. We'll commit these files with the message build1, and then finally we'll push this back up into our BitBucket repository. Okay, excellent, that's completed, so we're all up to date. We'll now jump back into our repository and do a refresh, and here we can see all of our assets for this project. Note that the readme actually gets rendered because we had a README.md file, which is a really nice feature. Under Settings, we now need to set up our AWS credentials to give Pipelines permission to deploy and sync into our S3 buckets. Under Pipelines settings, now that we've specified a bitbucket-pipelines.yml file, we first need to enable Pipelines. Next, we'll go to environment variables, where we need to specify our AWS credentials to allow us to write into our S3 buckets, so we'll do that: here we specify the access key and the secret key. Okay, back within Visual Studio Code, now that we've got our Pipelines all configured, let's make a simple change to trigger them. We'll just add "note one" to the readme, save, do a git status, git add readme, git commit with the message trigger pipelines, and push this back into the repository, which should trigger our Pipelines.
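The stage/commit sequence above can be sketched as follows. To keep the commands self-contained, this sketch creates a throwaway local repository; in the demo, the final step would be a `git push origin master` to the Bitbucket remote, which is what triggers Pipelines:

```shell
set -e
# Create a disposable repository so the example runs anywhere.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

echo "note one" >> README.md    # the trivial change made in the demo
git status --short              # review what changed
git add README.md               # stage the change
git commit -q -m "trigger pipelines"
git log --oneline -1            # confirm the commit landed
# In the demo this is followed by: git push origin master
```

Because Pipelines is configured to build the default branch, any push that adds a commit to master kicks off the build and deploy steps automatically.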
Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, Azure, GCP), Security, Kubernetes, and Machine Learning.
Jeremy holds professional certifications for AWS, Azure, GCP, Terraform, and Kubernetes (CKA, CKAD, CKS).