Continuous Integration (CI) and Continuous Delivery (CD) enable teams to automate the building, testing, and deployment of software. CI/CD, along with DevOps practices, is attracting a lot of attention and plays an important part in the software development process. Efficient CI/CD strategies enable companies to deliver better value by reaching the market with shorter turnaround times, thereby increasing revenue and gaining market share.
CI/CD practices let us catch and resolve bugs and other issues at a much earlier stage, which significantly reduces the overall cost of software development.
In this course, you will learn the skills required for building CI/CD pipelines using tools such as Google Cloud Build, Google Container Registry, and Cloud Source Repositories. The course starts by showing you how to develop code in Cloud Shell and upload it to a Cloud Source Repository. It then guides you through the CI/CD pipeline stages to build and deploy an application to GKE using Container Registry and Cloud Build.
If you have any feedback relating to this course, please let us know at firstname.lastname@example.org.
By the end of this course, you will know how to:
- Work with immutable artifacts
- Deploy immutable artifacts using Cloud Build
- Trigger builds
- Set up Cloud Build pipelines
This course is suited to anyone interested in building CI/CD pipelines on Google Cloud Platform (GCP) using Cloud Build and Container Registry.
To get the most out of this course, you should have a working knowledge of Docker, containers, and Kubernetes.
The source code used in this course can be obtained from the following GitHub repository: https://github.com/cloudacademy/source-code-pipeline-demo-repo
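To follow along locally, the course repository can be cloned with standard Git:

```shell
# Clone the course's demo repository from GitHub
git clone https://github.com/cloudacademy/source-code-pipeline-demo-repo.git
cd source-code-pipeline-demo-repo
```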
In this section, we will learn about Cloud Source Repositories, a Google Cloud Platform offering that provides private Git repositories so that teams can collaborate and manage their codebase securely. Cloud Source Repositories enhances the standard Git workflow by integrating with other GCP services such as Cloud Build, Pub/Sub, Cloud Logging, and Cloud Monitoring.
Using Cloud Source Repositories, we can securely browse code within the Google Cloud Console and perform standard Git operations such as push, pull, log, and fetch. It also provides a useful mirroring feature: an existing GitHub or Bitbucket repository can be mirrored into Cloud Source Repositories and kept in sync automatically.
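Once cloned with the gcloud CLI, a Cloud Source Repository behaves like any other Git remote. A minimal sketch, assuming a repository named `my-repo` already exists in the currently configured project:

```shell
# Clone an existing Cloud Source Repository into ./my-repo
# (gcloud configures the authenticated remote automatically)
gcloud source repos clone my-repo
cd my-repo

# Standard Git operations work as usual against the Cloud Source remote
git log --oneline   # view commit history
git pull            # fetch and merge remote changes
```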
A good principle when working with a Git repository, or any version control system, is never to store security keys in it. To make this principle easier to follow, Cloud Source Repositories can detect security keys as code is pushed. It can be configured to check for keys such as Google Cloud service account credentials (in JSON format) or PEM-encoded private keys, and to block pushes that contain them. Enabling this feature for a project takes a single command:

```shell
gcloud source project-configs update --enable-pushblock
```
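If a push is blocked by a false positive, the Cloud Source Repositories documentation describes a push option for skipping the key check on a single push. A sketch of that escape hatch:

```shell
# Skip the security-key detection for this one push only
# (use with care; the project-wide pushblock setting stays enabled)
git push -o nokeycheck
```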
In this demo, we will create a Cloud Source Repository and use it in the demos that follow. Back in the Google Cloud Console, we navigate to Cloud Source Repositories by clicking the hamburger menu icon and scrolling down. Under Tools, we click Source Repositories, which takes us to the Cloud Source Repositories service page. At the top right there is an Add Repository button for creating a new repository. Let's click it.
We can choose to create a brand-new repository or connect to an external repository such as GitHub or Bitbucket. For this course, we'll keep the source code in Cloud Source Repositories, so let's choose the Create new repository option and continue. Here, we fill out the repository name and project ID, and create the repository. We now have an empty Git repository that we can use for storing source code.
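The same setup can also be scripted with the gcloud CLI instead of the Console. A minimal sketch, assuming a project is already selected with `gcloud config` and using `my-repo` as a hypothetical repository name:

```shell
# Create an empty repository in the current project
gcloud source repos create my-repo

# Clone it locally, add some code, and push
gcloud source repos clone my-repo
cd my-repo
echo "# Demo app" > README.md
git add README.md
git commit -m "Initial commit"
git push origin master
```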
We conclude this demo here. In the next section, we will learn about Cloud Build.
Pradeep Bhadani is an IT consultant with over nine years of experience and holds various AWS, GCP, and HashiCorp certifications. He is recognized as a HashiCorp Ambassador and a Google Developer Expert (GDE) in Cloud for his knowledge of, and contributions to, the community.
He has extensive experience building data platforms both in the cloud and on-premises using DevOps strategies and automation. Pradeep is skilled at communicating technical concepts, helping teams and individuals upskill on the latest technologies.