
Cloud Build

Building and Configuring CI/CD pipelines on GCP
Overview

Difficulty: Intermediate
Duration: 49m
Students: 134
Rating: 5/5
Description

Continuous Integration (CI) and Continuous Delivery (CD) enable teams to automate the building, testing, and deployment of software. CI/CD, along with DevOps practices, is attracting a lot of attention and playing an important part in the software development process. Efficient CI/CD strategies enable companies to deliver better value by reaching the market with shorter turnaround times, thereby increasing revenue and gaining market share.

CI/CD practices enable us to proactively resolve bugs, issues, and other problems at a much earlier stage. This results in a significant reduction in the overall software development cost.

In this course, you will learn the skills required for building CI/CD pipelines using tools such as Google Cloud Build, Google Container Registry, and Source Repository. The course will start by showing you how to develop code in Cloud Shell and then upload it to Google Source Repository. It will then guide you through the CI/CD pipeline stage to build and deploy an application to GKE using Container Registry and Cloud Build.

If you have any feedback relating to this course, please let us know at support@cloudacademy.com.

Learning Objectives

By the end of this course, you will know how to:

  • Work with immutable artifacts
  • Deploy immutable artifacts using Cloud Build
  • Trigger builds
  • Set up Cloud Build pipelines

Intended Audience

This course is suited to anyone interested in building CI/CD pipelines on Google Cloud Platform (GCP) using Cloud Build and Container Registry.

Prerequisites

To get the most out of this course, you should have a working knowledge of Docker, containers, and Kubernetes.

Resources

The source code used in this course can be obtained from the following GitHub repository: https://github.com/cloudacademy/source-code-pipeline-demo-repo

 

Transcript

Now we are going to learn about Google Cloud Build, a service that builds and deploys artifacts from source code. Google Cloud Build is a serverless CI/CD offering on the Google Cloud Platform, and using this service, we can quickly build artifacts for software written in different programming languages like Python, Go, Java, et cetera. We can use the Google Cloud Console UI, the gcloud command-line tool, or the REST APIs to interact with the Google Cloud Build service.

Cloud Build also allows running builds in an automated fashion using the concept of triggers. Using triggers, we can instruct Cloud Build to run a build based on certain conditions, like a new commit pushed to any branch in the repository. We describe the steps in a configuration file, which can be in YAML or JSON format, to perform various tasks like building artifacts when triggered.

The config file lives in the root directory of the source code repository. We will see what this config file looks like, how to define the steps in it, and how to achieve CI/CD with Cloud Build in the upcoming demo. There are other CI/CD tools available, like Jenkins, CircleCI, Travis CI, et cetera, but the advantage of using Cloud Build on Google Cloud Platform is that it is a managed service and is nicely integrated with other GCP services.
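
As a preview, the general shape of such a config file is a list of steps, each naming a builder image and its arguments. This is a hedged one-step sketch, not the demo's actual file; the image name my-app is a placeholder:

```yaml
steps:
  - name: gcr.io/cloud-builders/docker          # the cloud builder image to run
    args: ["build", "-t", "gcr.io/$PROJECT_ID/my-app", "."]
```

We will build up a complete multi-step version of this file during the demo.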

We can also leverage Cloud IAM to add a security layer. This is one example of designing a CI/CD pipeline on GCP. Here, a developer within a team pushes code to a remote version control system like Cloud Source Repositories. As soon as the code becomes available in the Cloud Source Repository, a Cloud Build trigger automatically starts the Cloud Build job and builds the artifact, which gets stored in Container Registry.

After that, another Cloud Build step applies the manifest to GKE, and GKE pulls the required image from the Container Registry and starts our application. We are going to implement this CI/CD pipeline in our next demo.

In this demo, we will create a CI/CD pipeline that automatically builds the Docker image of our application when the code is pushed to the source repository, and then deploys the application's Docker image to the GKE cluster using the Google Cloud Build service. We will reuse the Cloud Source Repository created in the previous demo.

Before creating the pipeline, we want to make sure that the Cloud Build API is enabled. Let's search for the Cloud Build API in the search bar and select Cloud Build API. The API is not enabled for this project, so we will click on Enable. Now the Cloud Build API is enabled for this GCP project. We will need a GKE cluster to deploy our app, which we can create by running the gcloud container clusters create command. This gcloud command is now creating a GKE cluster.

Let's navigate to the GKE service page. Our GKE cluster is getting ready. Now the GKE cluster is available for us to deploy applications to. For this demo, we will have a simple Python Flask app. We will also need to give the Cloud Build service account developer access to the GKE cluster so that it can deploy applications on it.

We can run the gcloud projects add-iam-policy-binding command to add the container.developer IAM role to the Cloud Build service account, or we can perform the same action using the Console UI. We navigate to the IAM page and click on the pencil icon to edit the Cloud Build service account. Here, we add the Kubernetes Engine Developer role and click Save.
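
The CLI form of that IAM change might look like the following sketch. It assumes $PROJECT_ID is set in your shell and that the project uses Cloud Build's default service account, which is named after the numeric project number:

```sh
# Look up the numeric project number for the current project.
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" --format='value(projectNumber)')

# Grant the Cloud Build service account the Kubernetes Engine Developer role.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --role="roles/container.developer"
```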

Now the Cloud Build service account has the GKE developer role. Without further ado, let's navigate to the code editor and Cloud Shell provided inside the Google Cloud Console UI and play with the code. First, I will clone our Cloud Source Repository to our Cloud Shell instance using the gcloud source repos clone command. One popular workflow with Git repos is to branch out from master, into what is sometimes called a feature branch, and work on it. Once the work is done, you raise a pull request (PR) for code review and merge to master after a successful review.

To follow a similar flow, let me first branch out from master to a feature branch and name the branch TICKET-001. Here, let's create two directories: app, to hold the application code, and deploy, to keep the Kubernetes deployment manifest files. We could also keep the deployment manifests in a different repository.

I am going to create a Python file named app.py and add the Flask code, which has two routes: index, which prints a message, and today, which prints the current date. I will also create a test_app.py to have some degree of testing for my application. Here, we are just testing the today function. Now, let me run test_app.py locally to check that our application code passes the testing phase. As the tests pass, I am going to run app.py locally and browse the endpoints. The endpoints are working fine and the application works as intended.
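
The transcript doesn't show the source files, so here is a minimal sketch of the logic behind the two routes and their test. The Flask wiring is omitted so the sketch runs with only the standard library, and the message text and date format are assumptions based on what the demo shows later:

```python
from datetime import date

# Hypothetical core logic of app.py's two routes (Flask decorators omitted).
def index():
    return "Welcome to this Pipeline Demo!!"

def today(fmt="%Y-%m-%d"):
    # The demo later changes this format, which is what breaks the test.
    return "Today's date is " + date.today().strftime(fmt)

# test_app.py-style check: the today route should end with today's ISO date.
assert today().endswith(date.today().strftime("%Y-%m-%d"))
```

Keeping the test in lockstep with the format string matters: later in the demo, changing the format without updating the test is exactly what fails the build.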

Everything we have done so far is on our local dev workstation, which is Cloud Shell in our case. Now, let's move forward and containerize this application by creating a Dockerfile. This Dockerfile uses Python 3 as a base image and installs the required package, which is Flask in our case. We can install packages by providing the name here or by using a requirements.txt file.

For complex applications, a requirements.txt file with particular package versions pinned is preferred. After that, we are just copying our application code and running it. So far, we have looked at the application code and the Dockerfile. Now, we want to build the artifact, a Docker image in our case, automatically using Cloud Build. This is the continuous integration part of the pipeline.
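
A sketch of a Dockerfile along the lines described (base image, inline package install, copy, run); the file name app.py and the working directory are assumptions:

```dockerfile
FROM python:3
WORKDIR /app
RUN pip install flask          # or: COPY requirements.txt . && pip install -r requirements.txt
COPY app.py .
CMD ["python", "app.py"]
```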

As we learned earlier, we define the instructions for Cloud Build in a config file, so let's create one and name it cloudbuild.yaml. In this file, we call each instruction a step, and we can define multiple steps to achieve our continuous integration goal. We use the steps field to specify each build step. The name field in a build step specifies the cloud builder. A cloud builder is a container image with common tools pre-installed. There are many images provided by Google, such as gcloud, docker, kubectl, et cetera, and if we don't find the command or tool we are looking for, we can look at the community-contributed images.

We can even write our own custom image and use it as a cloud builder. The id field defines the name of a step and is used as its unique identifier. We can use the entrypoint field to set a custom entry point rather than the default provided by the cloud builder image. The args field is used to pass arguments to the cloud builder image. Here, we are using some variables like PROJECT_ID and SHORT_SHA. Cloud Build populates these standard substitution variables during the build.

Here, we are using SHORT_SHA as the Docker image version so that each commit has a different Docker image; in a sense, these are immutable artifacts. This helps with rolling back to a previous version: if we want to roll back to a previous version of the application, we can just use the image with that particular commit SHA. We will see the rollback in action later in the demo.

As mentioned earlier, we can have multiple build steps in a config file, defined as an array under the steps field. In our cloudbuild.yaml file, we have three steps so far. The first step runs the tests, the second builds the Docker image, and the third pushes the Docker image to Google Container Registry. This is essentially the continuous integration part of the CI/CD pipeline.
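
The transcript doesn't reproduce the file itself, so the following is a hedged sketch of what such a three-step cloudbuild.yaml might look like. The image name pipeline-demo-app, the test command, and the use of the Docker Hub python:3 image as a builder are assumptions:

```yaml
steps:
  # Step 1: run the unit tests; a failure here stops the whole build.
  - id: run-tests
    name: python:3
    entrypoint: python
    args: ["-m", "unittest", "discover", "app"]

  # Step 2: build the Docker image, tagged with the short commit SHA.
  - id: build-image
    name: gcr.io/cloud-builders/docker
    args: ["build", "-t", "gcr.io/$PROJECT_ID/pipeline-demo-app:$SHORT_SHA", "app"]

  # Step 3: push the image to Google Container Registry.
  - id: push-image
    name: gcr.io/cloud-builders/docker
    args: ["push", "gcr.io/$PROJECT_ID/pipeline-demo-app:$SHORT_SHA"]
```

Because steps run in order and a failing step aborts the build, no image is built or pushed unless the tests pass first.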

Now the question is: how do we make Cloud Build run these steps as soon as we push code to the repository? The answer is a Cloud Build trigger. A Cloud Build trigger, once configured, automatically starts a build when code is pushed to the source repository, based on the rules defined in its configuration.

Rules can be defined in different ways, for example, trigger a build when code is pushed to any branch of the source repository, or only when code is pushed to a specific branch, et cetera. To create a Cloud Build trigger, navigate to the Cloud Build service page and select Triggers on the left-hand-side panel. Click on Create a Trigger to create a new trigger.

On this page, we can define the name and description of the trigger, and the type of event, like Push to a branch, Push new tag, or Pull request (GitHub only). For this demo, we select Push to a branch. Under Source, we provide the Cloud Source Repository name, and under Branch, a regular expression to match the branch name.

For this demo, we use .* to match any branch, and we are going to provide the config as a build config file in YAML format. After filling in all the details, we click on Create to create this trigger. We now have a Cloud Build trigger, and we can see under the Status field that it is Enabled. This trigger will automatically start a build when code is pushed to any branch of our source code repository, pipeline-demo. We can also trigger the build manually by clicking on Run trigger next to Status on this page.
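
The same trigger can also be created from the command line. This is a hedged CLI equivalent of the UI steps above; older gcloud releases may need the beta component (gcloud beta builds triggers ...):

```sh
gcloud builds triggers create cloud-source-repositories \
  --repo=pipeline-demo \
  --branch-pattern=".*" \
  --build-config=cloudbuild.yaml
```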

It's time to test the trigger. We are back in our Cloud Shell and Cloud Editor window. Here on Cloud Shell, I am going to run git commands to commit and push the local code to the Cloud Source Repository. As soon as I push the code, Cloud Build will trigger the build. To view the build in action, we navigate to the Cloud Build service page. Here, the dashboard page shows details about each configured build: the trigger name, the date, time, and duration of the latest build, the trigger description, the average duration of the builds, a link to the source repository, a link to the commit SHA, a chart showing previous build statuses (green for successful builds and red for failed builds), and pass/fail percentages.

We select the History page to view the builds. Here, I have filtered the build history by the source repository name. We see the triggered build, and when we select it, we see more details, like the build summary, a list of steps in the build with durations, build logs for each step, and execution details.

Build logs can also be viewed from Cloud Logging, which we will see later in the demo. For this build, all the steps completed successfully, so we should see the container image in Google Container Registry. To verify, we go to the Container Registry page, where we can see our Docker image with the commit SHA as its version tag.

To summarize, we configured a Cloud Build trigger to automatically start a build when code is pushed to the Cloud Source Repository. We then pushed the code, along with the cloudbuild.yaml file that defines the instructions for Cloud Build, to the source repository. As soon as the code was pushed, Cloud Build ran the build steps defined in the cloudbuild.yaml file, produced an immutable artifact (a Docker image), and stored it in Google Container Registry. With this, we conclude the continuous integration part of our CI/CD pipeline.

Now let's move on to the continuous delivery part of the pipeline. Once we have an immutable artifact, we want to deploy it to our environment, which could be a GKE cluster, Cloud Run, or any other system. For this course, we will deploy the immutable artifact on the GKE cluster. To do this, we go back to our Cloud Editor window and define a Kubernetes manifest template under the deploy directory, named kube-app.yaml.tpl. This template defines the Kubernetes Deployment and uses a specific version of the Docker image from Google Container Registry.
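
The template isn't shown in the transcript; a sketch of what it might contain follows. GCP_PROJECT_ID and VERSION are literal placeholder strings to be replaced during the build, while the labels, container port, and image name are assumptions (replicas: 3 matches the three running pods seen later in the demo):

```yaml
# deploy/kube-app.yaml.tpl
apiVersion: apps/v1
kind: Deployment
metadata:
  name: pipeline-demo
spec:
  replicas: 3
  selector:
    matchLabels:
      app: pipeline-demo
  template:
    metadata:
      labels:
        app: pipeline-demo
    spec:
      containers:
        - name: pipeline-demo-app
          image: gcr.io/GCP_PROJECT_ID/pipeline-demo-app:VERSION
          ports:
            - containerPort: 5000   # assumed Flask port
```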

As we want to keep it generic and be able to update the Docker image version, we will replace the GCP_PROJECT_ID and VERSION strings with actual values during the build. Let's go back to the cloudbuild.yaml file and add more steps to generate the Kubernetes deployment YAML file from the template file and deploy it.

The Generate Kubernetes YAML step replaces the GCP_PROJECT_ID string with the actual GCP project ID, which is ci-cd-demo-1234 in this case, and the VERSION string with the commit SHA, and writes the output to the file deploy/kube-app.yaml.

The next step, Kubernetes deployment, deploys the generated manifest to our GKE cluster named pipeline-demo. We are good with the code for now, so let's commit and push so that the Cloud Build trigger can start a new automatic build for this commit.
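
The two additional steps described above might look like the following sketch (continuing the steps array of cloudbuild.yaml). The sed-based substitution and the compute zone are assumptions, since the transcript doesn't show the file; the env entries are how the kubectl builder knows which cluster to target:

```yaml
  # Step 4: render deploy/kube-app.yaml from the template, substituting
  # the project ID and commit SHA for the placeholder strings.
  - id: generate-kubernetes-yaml
    name: ubuntu
    entrypoint: bash
    args:
      - -c
      - |
        sed -e "s/GCP_PROJECT_ID/$PROJECT_ID/g" \
            -e "s/VERSION/$SHORT_SHA/g" \
            deploy/kube-app.yaml.tpl > deploy/kube-app.yaml

  # Step 5: apply the rendered manifest to the GKE cluster.
  - id: kubernetes-deployment
    name: gcr.io/cloud-builders/kubectl
    args: ["apply", "-f", "deploy/kube-app.yaml"]
    env:
      - CLOUDSDK_COMPUTE_ZONE=us-central1-a      # assumed zone
      - CLOUDSDK_CONTAINER_CLUSTER=pipeline-demo
```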

To view the build, we go back to the Cloud Build History page, where we see that the new build has two additional steps. Once the build completes, we can verify the deployment by connecting to the GKE cluster using Cloud Shell. We see that the build has completed successfully, so we move to the Cloud Shell window.

Here, we run the gcloud container clusters get-credentials command to connect to the GKE cluster. Now we can run kubectl commands to interact with the GKE cluster. First of all, we run kubectl get deployments to list the deployments in the default namespace. In the output, we see our deployment created by Cloud Build. We run kubectl get pods to list the pods and view their status. We see that three pods are in Running status.

We can check the logs of a pod to verify that the container is running as intended by running kubectl logs followed by the pod name. The log shows that the application has started successfully.
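
Collected together, the verification commands from this part of the demo are roughly the following; the cluster zone is an assumption, and the pod name placeholder is left as-is since it differs per deployment:

```sh
gcloud container clusters get-credentials pipeline-demo --zone us-central1-a
kubectl get deployments      # the Deployment created by Cloud Build
kubectl get pods             # expect three pods in Running status
kubectl logs <pod-name>      # application startup log
```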

We expose our application using a Kubernetes Service of type LoadBalancer. To get the external IP, we run kubectl get svc and copy the external IP. We will now open a new tab in our Google Chrome browser (you can use any browser to open this URL), paste the external IP, and hit Enter.

Hurray! We see the message from our application: Welcome to this Pipeline Demo!! We have another route, today, within our application, so let's add /today to the URL, and we get the current date message. We see how easy it is to test, build, and deploy the application to an environment such as GKE using Cloud Build.

Let's take this demo forward: update the application code and see how the change is pushed through the CI/CD pipeline. We are back in our application code in app.py, and here we can make a small change, like changing the date format from year, month, day to day, month, year. We again run git commit and push to send the code to the source repository, and the Cloud Build trigger will start a new build for this code.

On the Cloud Build History page, we see that this build has failed at the first step, which is the test stage. As the test stage failed, Cloud Build does not progress further and does not produce an artifact or Docker image, which is good, because we do not want broken artifacts, or artifacts that fail in the testing phase.

The Python test failed because it expects the old date format. As we updated our code to use a new format, we should also update our test case. I purposely did not update the test file in the previous commit, to showcase the build failure and show that the pipeline does not progress to the artifact build stage when a build step fails.

We will now update test_app.py and push the code change. This again triggers a new build. We see that this build has now passed the test phase and is continuing with the rest of the steps. The build completes, and we should have the new version of our application running in the GKE cluster. We can quickly verify by browsing the today route.

We see that the date format has now changed to day, month, year, which means our new code is running successfully. With this, we can summarize the CI/CD pipeline demo. We first created a Cloud Source Repository to store the codebase and a GKE cluster to run our application. We then created a Cloud Build trigger which runs the provided instructions, or build steps, when code is pushed to the source code repository.

We then wrote our code and pushed it to the source code repository, which triggered an automatic build that builds the immutable artifact, stores it in Container Registry, and then deploys the Kubernetes manifest, which pulls the immutable artifact from the Container Registry and runs our application on the GKE cluster.

Later in this course, we will see more use cases of Cloud Build, such as rolling an application back to a previous version, skipping a build for a commit, publishing Cloud Build notifications to Pub/Sub, and multi-environment deployment strategies. We learned that we are versioning each Docker image, so rolling back or changing the application to a different version is easy.

Here on the Cloud Build History page, we see that we updated our application to use the Docker image with the new commit ID. Suppose we want to roll back to the previous version, which serves dates in year, month, day format. We just go to the previous build and click on Rebuild. This starts a build which uses the Docker image with the previous commit SHA and rolls our application back to that version. To verify, we again browse the today route and refresh the page. Now it is showing the date in year, month, day format.

In the next demo, we will learn how to skip an automatic build on a commit when required. There may be cases where we are just updating source repository documentation, which does not really need the CI/CD pipeline to generate immutable artifacts and deploy them. So, essentially, we would want to skip the CI/CD build for such commits. This can be achieved easily with Cloud Build: we just need to add [ci skip] or [skip ci] to the commit message. Let's see this in action.

We are back in Cloud Shell, and I am adding a dummy file, dummy.txt, and committing with the message [ci skip] Add dummy file. Now I push this commit to the remote repo. Let's go back to the Cloud Build History page, and we see that no new build was triggered. Next, we will learn how to publish a message to Pub/Sub when a build changes its status.
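
As a sketch, the commands from this step look roughly like the following; the branch name is an assumption carried over from earlier in the demo:

```sh
touch dummy.txt
git add dummy.txt
git commit -m "[ci skip] Add dummy file"   # the marker tells Cloud Build to skip this commit
git push origin TICKET-001
```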

Different teams use different collaboration tools and ways to receive notifications. When using Cloud Build, we can send build status updates to Slack or HTTP endpoints, or publish messages to a Pub/Sub topic which can be subscribed to by custom applications.

Now, we will learn how to publish Cloud Build state-change messages to Pub/Sub. Cloud Build publishes messages to a Pub/Sub topic named cloud-builds, which we can create by running a gcloud command in Cloud Shell or from the UI. The gcloud pubsub topics create cloud-builds command creates the Pub/Sub topic named cloud-builds. The gcloud pubsub subscriptions create cb-notification --topic cloud-builds command creates a Pub/Sub subscription on the cloud-builds topic. We will use this subscription to view the messages sent by Cloud Build to the Pub/Sub topic.
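
The two commands from the transcript, plus a CLI way to pull messages as an alternative to the View Messages UI used later in the demo (the --limit value is arbitrary):

```sh
gcloud pubsub topics create cloud-builds
gcloud pubsub subscriptions create cb-notification --topic cloud-builds

# After a build runs, pull the status messages Cloud Build published:
gcloud pubsub subscriptions pull cb-notification --limit=5 --auto-ack
```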

Now, let's trigger a Cloud Build. We can do so manually from the Cloud Build UI by rebuilding an existing build, or push new code, which automatically triggers a build. For now, I will manually trigger the existing build by clicking on Rebuild. We wait for this build to finish. Now, the build has finished.

We navigate to the Pub/Sub service UI page and view the messages for the cloud-builds topic by selecting the topic and then clicking on View Messages. Here, we select the subscription name and click on Pull. We can see the messages sent by Cloud Build. Custom applications can ingest these messages for processing or for sending notifications to custom internal channels.

In the CI/CD demo section, we learned to deploy an application from a branch into an environment using Cloud Build. With Cloud Build, we can also deploy our application to multiple environments. Let's say we have three environments, DEV, PRE-PROD, and PROD, and we want to deploy the application first to DEV, then to PRE-PROD, and finally to the PROD environment. These environments could be multiple GKE clusters, or multiple namespaces in a single GKE cluster.

To achieve this, we can extend our cloudbuild.yaml file by using the standard substitution variables available, like BRANCH_NAME or TAG_NAME. Let's understand this in detail.

When a developer starts adding a feature to the application, he or she branches out from the master branch and creates a feature branch, let's name it TICKET-001, and then raises a PR to get it merged to the master branch. We can write the steps in the cloudbuild.yaml file so that when code is pushed to a feature branch, Cloud Build deploys the application to the DEV environment, and when code is merged to master, the application is deployed to the PRE-PROD environment.

When we want to deploy to the PROD environment, we create a Git tag, which triggers a build that deploys our application to the PROD environment. To put this in code, we can add logic to set the environment name using the standard variables in the deploy step. This is one way to do deployments to multiple environments. You may need to make some tweaks based on the processes followed by your organization and teams.
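
The on-screen logic isn't reproduced in the transcript; the following is a sketch of how such branch/tag-based environment selection might look, assuming one namespace per environment. Note the $$ prefix, which stops Cloud Build from treating the shell variable $env as a substitution, and /builder/kubectl.bash, the kubectl builder's credential-configuring wrapper:

```yaml
  # Deploy step that picks a target namespace from the trigger context:
  # a tag goes to prod, the master branch to pre-prod, anything else to dev.
  - id: kubernetes-deployment
    name: gcr.io/cloud-builders/kubectl
    entrypoint: bash
    args:
      - -c
      - |
        if [ -n "$TAG_NAME" ]; then env=prod
        elif [ "$BRANCH_NAME" = "master" ]; then env=pre-prod
        else env=dev
        fi
        /builder/kubectl.bash apply -f deploy/kube-app.yaml -n "$$env"
    env:
      - CLOUDSDK_COMPUTE_ZONE=us-central1-a      # assumed zone
      - CLOUDSDK_CONTAINER_CLUSTER=pipeline-demo
```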

Similar continuous delivery can be achieved with the open-source tool Spinnaker, where we build the artifacts using Cloud Build and deploy the applications to GKE using Spinnaker.

Google Cloud Platform offers a service called Cloud Logging, which is a central place for all the logs generated by the different services available in GCP. We can also see the audit logs for the Cloud Build service in Cloud Logging. Audit logs help us understand who did what, when, and where.

For Cloud Build, the audit log contains admin activity information, like creating a trigger, deleting a trigger, and enabling or disabling a trigger. The audit log also collects information under the ADMIN_READ data access category for operations like reading the configuration of a trigger or build. We do not see DATA_READ or DATA_WRITE data access information for Cloud Build, because Cloud Build does not deal with user data.

A point to keep in mind: Admin Activity logs are enabled by default, but we need to enable Data Access logs on the Audit Logs page under the IAM & Admin service.

Now, let's see how we can view the audit logs and build logs for the Cloud Build. We are back on Google Cloud Platform Dashboard and we navigate to Cloud Logging in the navigation menu. This is the Cloud Logging Service Page. Here, we can write the queries, look for recent queries or view the saved queries. GCP can also suggest some queries to us.

To view the Cloud Build audit logs, I select Cloud Build in the Resource drop-down and look for activity in the log name. Now, I can run this query to view the audit logs for the Cloud Build service. The query returns results, and we can expand each result to view the log entry in detail.

We can also view the build logs by updating the log name to cloudbuild. When we run this query, it returns the build logs that we saw on the Cloud Build service page when working with builds during the CI/CD demo. Here, we can also filter the logs by severity level, like Info, Warning, Error, et cetera. Cloud Logging also offers to view logs from different GCP services, like GKE, Cloud Build, VM instances, et cetera.
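
In the Logging query language, the two queries described above might look like this; PROJECT_ID is a placeholder, and the severity filter is an optional extra:

```
# Cloud Build audit (admin activity) logs:
resource.type="build"
logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"

# Cloud Build build logs, errors only:
resource.type="build"
logName="projects/PROJECT_ID/logs/cloudbuild"
severity>=ERROR
```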

 

About the Author

Pradeep Bhadani
IT Consultant

Pradeep Bhadani is an IT Consultant with over nine years of experience and holds various certifications related to AWS, GCP, and HashiCorp. He is recognized as a HashiCorp Ambassador and a GDE (Google Developers Expert) in Cloud for his knowledge of and contributions to the community.

He has extensive experience in building data platforms on the cloud as well as on-premises through the use of DevOps strategies and automation. Pradeep is skilled at delivering technical concepts, helping teams and individuals upskill on the latest technologies.