BitBucket Pipeline Configuration & Build Script
Difficulty: Advanced
Duration: 2h 11m
Students: 1998
Ratings: 4/5
Description
In this advanced course, we take a legacy monolithic .Net application and re-architect it to use a combination of cloud services to increase scalability, performance, and manageability. 
 
Learning Objectives

This course will enable you to:
  • Understand the principles and patterns associated with microservices
  • Understand the principles and patterns associated with Restful APIs
  • Understand important requirements to consider when migrating a monolithic application into a microservices architecture
  • Understand the benefits of using microservices and associated software patterns and tools to build microservice based applications at speed and scale
  • Understand tradeoffs between different architectural approaches
  • Become familiar and comfortable with modern open source technologies such as Dotnet Core, Docker, Docker Compose, Linux, Terraform, Swagger, React
  • Become familiar with Docker and Container orchestration runtimes to host and run containers, such as Docker Compose, Amazon ECS using Fargate, and Amazon EKS

Prerequisites

  • A basic understanding of software development
  • A basic understanding of the software development life cycle
  • A basic understanding of DevOps and CI/CD practices
  • Familiarity with Dotnet and C#
  • Familiarity with AWS
Intended audience
  • Software Developers and Architects
  • DevOps Practitioners interested in CI/CD implementation
  • Anyone interested in understanding and adopting Microservices and Restful APIs within their own organisation
  • Anyone interested in modernising an existing application
  • Anyone interested in Docker, and Containers in general
  • Anyone interested in container orchestration runtimes such as Kubernetes

Source Code

Transcript

Welcome back. In this lecture, we'll use continuous integration and continuous delivery to do automated builds of our microservices architecture. In particular, we'll use Bitbucket Pipelines as a method of automating our builds for each of the Docker containers that make up our solution. Recall that in the previous lecture, we uploaded our full solution codebase into Bitbucket, to give us version control over all of the code assets. In this lecture, we'll focus on configuring Bitbucket Pipelines to build our Docker containers, and then register them into our Docker Hub repository. Later on in our course, we'll extend our Bitbucket Pipelines workflow to use Terraform to provision the underlying running infrastructure onto which we'll launch our microservices application. But for now, in this lecture, we need to add two new files to our solution. The first file we'll add is the bitbucket-pipelines.yml file. 

This instructs Bitbucket Pipelines how to perform the build. The second file we'll add to the solution is a build script file, which is invoked by Bitbucket Pipelines at build time. So from here, let's begin the process of adding these two files to our solution. Firstly, within our terminal, we'll do a directory listing. Okay, in the project root, we'll create an empty bitbucket-pipelines.yml file. Next, we'll also create our build script, build.sh, and we'll store that in the BuildScripts folder. Okay, jumping over into Visual Studio, we'll add these to the Visual Studio Solution. Add files, and we add the bitbucket-pipelines.yml file. It's been successfully added, and then under BuildScripts we want to add our build script file. Let's now edit the bitbucket-pipelines.yml file. So, pasting in our configuration to get us going: what we're doing here is instructing Bitbucket Pipelines to use this Docker image to perform the build.

The build is going to happen on the master branch, and at the moment we have one step that will be performed. The step has a name, in this case Docker Build. The Docker images that are pulled down during the build phase are cached so that subsequent builds run quicker. We're using the Docker service to help us build our Docker images. And then we run a number of script statements, the first of which is to run the whoami command to find out who the user is that's running these commands. I also like to know the path that I'm executing the commands within, as well as printing out the directory contents. We then update the build.sh file to have executable permissions. And then the subsequent command is to actually execute that build.sh file, which is the build.sh file we added earlier to the BuildScripts folder. The last step is to echo out a statement to the effect that the build is finished. 
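As a rough sketch, a bitbucket-pipelines.yml along the lines just described would look something like the following. The build image shown here is an assumption, and the exact script ordering in the course files may differ.

  image: atlassian/default-image:2

  pipelines:
    branches:
      master:
        - step:
            name: Docker Build
            caches:
              - docker
            services:
              - docker
            script:
              - whoami
              - pwd
              - ls -la
              - chmod +x ./BuildScripts/build.sh
              - ./BuildScripts/build.sh
              - echo "build finished"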

Okay, let's now go and edit the build.sh file that we call during our build phase. That is this file here, so we'll set this up to be an executable shell script that will run using bash. We'll create a function called rebuild_services, and we'll call this once it's loaded. Okay, let's now build out this function. The first thing we'll do is echo out some information, and in this case, we'll echo out the BITBUCKET_COMMIT id. This is passed into the script by the Bitbucket Pipelines runtime as an environment variable called BITBUCKET_COMMIT. Next, we'll try and detect the actual folders that we changed in the latest commit. So how does this work? Well, let's break it down, so we'll take the first part of this statement here. We'll go to our terminal, and the first thing we'll do is a git log. And then we'll take the commit id and paste it onto the end of this.

Okay, so here we can see a listing of all the files that were changed in this particular last commit. Okay, let's expand this out: we'll then pipe it out to grep, and we'll search on the forward slash, as we do in our command here. And then again, let's add on the awk statement to see what's happening. So piping out to awk, we can see that we're narrowing down on what we actually get out when we run the full command. So the final part is to actually do a uniq on this. And here we can see the result.
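Putting those pieces together, the detection logic inside rebuild_services looks something like the sketch below. The exact git log flags used in the course script may differ slightly from what is shown here.

  #!/bin/bash
  # build.sh - invoked by Bitbucket Pipelines at build time

  rebuild_services() {
      # BITBUCKET_COMMIT is injected by the Bitbucket Pipelines runtime
      echo "Detecting changes for commit: $BITBUCKET_COMMIT"

      # List files changed in that commit, keep only paths containing a
      # folder separator, take the top-level folder name, and de-duplicate
      folders=$(git log -m -1 --name-only --pretty=format: $BITBUCKET_COMMIT \
          | grep "/" \
          | awk -F"/" '{print $1}' \
          | uniq)
  }

  rebuild_services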

So, the full statement is basically returning the highest-level folder structure for all the files we changed in the last commit. Okay, let's carry on. So, back within our script, the next thing we'll do is initialise an empty list that will track the microservices that have been updated. We'll then add in a block of code that will scan through each of the folders that we just collected up in this block. And then for every folder in that list, we'll run some logic to determine the name of the folder and what items need to be added to our list. So here, if the folder name is BuildScripts, then we'll update all of our microservices. If the folder was Services, then we'll run some logic to find just the specific microservices that changed, and again add them. Or, if the folder was the Presentation Layer microservice, then the only microservice that we'll update is the Presentation Layer. 
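A sketch of that classification logic, continuing the script above, follows. The folder names come from the narration, while the individual microservice names are illustrative placeholders rather than the actual project names.

      # Empty list that will track the microservices needing a rebuild
      updated_services=()

      for folder in $folders; do
          if [ "$folder" == "BuildScripts" ]; then
              # A change to the build scripts rebuilds every microservice
              # (placeholder names - substitute the real service folders)
              updated_services=(AccountService InventoryService PresentationService)
          elif [ "$folder" == "Services" ]; then
              # Only the specific microservices that changed under Services/
              changed=$(git log -m -1 --name-only --pretty=format: $BITBUCKET_COMMIT \
                  | grep "^Services/" | awk -F"/" '{print $2}' | uniq)
              updated_services+=($changed)
          elif [ "$folder" == "PresentationLayer" ]; then
              # Presentation layer changes only rebuild that one microservice
              updated_services+=(PresentationService)
          fi
      done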

The next thing we'll do is add in some code to do two things. The first of which is to render out to the screen the list of updated microservices, or, I should say, the list of updated Docker containers that the build will rebuild. And then secondly, we'll perform a Docker login to Docker Hub, passing in two environment variables representing the credentials to authenticate to Docker Hub. We'll set these up in Bitbucket Pipelines later on in the lecture. Finally, we'll add in a block of code that loops through each of the detected microservices to be rebuilt, and for each one, we change into the directory of that particular microservice. We generate a tag name and set it against the IMAGE_NAME variable. And then finally, we call docker build -t with the tag name, IMAGE_NAME, setting the build context to be the current directory, which contains our Dockerfile. Once the Docker build has completed successfully, we then do a docker push of that IMAGE_NAME up into Docker Hub. Finally, we navigate back out of the directory and repeat the process for the next microservice, until all microservice rebuilds have been completed. 
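Continuing the sketch, that final block looks roughly like the following. The DOCKER_HUB_USERNAME and DOCKER_HUB_PASSWORD variable names and the mydockerhubaccount repository prefix are assumptions for illustration, not the course's actual values.

      # Show which Docker containers this build will rebuild
      echo "Microservices to rebuild: ${updated_services[@]}"

      # Authenticate to Docker Hub using credentials supplied as
      # Bitbucket Pipelines environment variables (names assumed here)
      docker login --username "$DOCKER_HUB_USERNAME" --password "$DOCKER_HUB_PASSWORD"

      for service in "${updated_services[@]}"; do
          # Change into the microservice's directory, which holds its Dockerfile
          cd "$service"

          # Generate a tag name and set it against the IMAGE_NAME variable
          # (the repository prefix below is a placeholder)
          IMAGE_NAME="mydockerhubaccount/$(echo "$service" | tr '[:upper:]' '[:lower:]'):$BITBUCKET_COMMIT"

          # Build with the current directory as the build context, then push
          docker build -t "$IMAGE_NAME" .
          docker push "$IMAGE_NAME"

          # Navigate back out and repeat for the next microservice
          cd ..
      done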

Lastly, let's commit these changes back into our Bitbucket repository. So we'll run git status. We'll add the untracked files, and we'll do git status again. And we'll make sure the modified Visual Studio solution file is also part of our next commit. Status once more. So we're all good. We'll do git commit, give it a message: updates. And then finally we'll do a git push. Okay, that completes this lecture. Go ahead and close it, and we'll see you shortly in the next one.

About the Author
Students: 143601
Labs: 71
Courses: 109
Learning Paths: 209

Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, Azure, GCP), Security, Kubernetes, and Machine Learning.

Jeremy holds professional certifications for AWS, Azure, GCP, Terraform, Kubernetes (CKA, CKAD, CKS).