
Automating Deployment to Production with Cloud Build Triggers

The course is part of these learning paths:

  • Developing for Google Kubernetes Engine
  • Google Professional Cloud DevOps Engineer Exam Preparation
  • Google Professional Cloud Developer Exam Preparation
  • Google Cloud Platform for Developers
Overview

Difficulty: Intermediate
Duration: 54m
Students: 198
Ratings: 3.4/5

Description

In this course, we will explore some of the tools available to build and manage development environments intended for deployment on Google Cloud Platform products. We will also demonstrate how to easily push builds from our local machine to Google-hosted services.

We will start the course by covering the different types of development environments and their purposes. We will touch briefly on popular software methodologies and frameworks as they relate to choices in number and type of development environments.

This course will focus on container-based application development environments, tools, and services. We will first walk through installing and using Docker and Kubernetes on your local machine. Then we will explore how to push projects to Google Cloud Run and Google Kubernetes Engine.

Writing applications using Kubernetes or Cloud Run can be further streamlined with Google Cloud Code, which provides direct IDE support for development on these platforms. We will examine how to install and use Google Cloud Code with Visual Studio Code.

Learning Objectives

  • Understand the types of development environments and when to use them
  • Install a container-based local development environment
  • Add Google Cloud Code support to VS Code
  • Push code from a local development environment and run on Google Cloud Platform using:
    • Google Cloud Run
    • Google Kubernetes Engine
    • Google Deployment Manager

Intended Audience

  • Programmers interested in developing containerized applications on Google Cloud Platform
  • Solo developers new to working on a development team
  • Anyone preparing for the Google Professional Cloud DevOps Engineer certification

Prerequisites

To get the most out of this course, you should:

  • Have a Google Cloud Platform account
  • Have Google Cloud SDK installed and initialized
  • Be familiar with IAM role management for GCP resources
  • Have Visual Studio Code, Python 3, and Git installed

Knowledge of Python would also be beneficial for scripting with GCP, but it's not essential.

Resources

Transcript

In the last video, we learned how to store our project source code on Google Cloud Source Repositories, and how we can push our containers to Google Artifact Registry.  We also learned that Cloud Source Repositories is basically just Git with Google integrations built in, and demonstrated how we can interact with Cloud Source Repositories using the source control tools in VS Code. 

In this lecture, we’ll learn how to create a Google Cloud Build Trigger that will rebuild our application automatically in our production environment whenever a version update is detected on the main branch of our repository.

Instead of building our containers locally and pushing them to Artifact Registry, we can push our source code to Cloud Source Repositories and use Cloud Build to create containers out of our project and then store them in Artifact Registry for us. This can save us time spent uploading comparatively hefty containers, especially in areas where bandwidth may be limited.  We can instead upload just our lightweight project source code and let Google build the container image directly in the cloud for us.
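As a point of comparison, a one-off cloud build can also be kicked off manually from the project directory with the gcloud CLI, before any triggers are involved. This is a sketch with placeholder values, not a command from the course:

```shell
# One-off build: upload the local source, build the container in Cloud Build,
# and store the resulting image in Artifact Registry.
# REGION, PROJECT_ID, REPO, and IMAGE are placeholders for your own values.
gcloud builds submit \
  --tag REGION-docker.pkg.dev/PROJECT_ID/REPO/IMAGE:latest
```

A trigger automates exactly this step, so it runs on every push instead of on demand.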

To do this, we’ll need - you guessed it - another YAML file.  We’ll call this one cloudbuild.yaml and put it in the base directory of our project, next to our skaffold.yaml file.  Next, just hit CTRL+Space in our new document, and take a look at all the helpful options we now get from Cloud Build. 

Let’s select “Cloud Build - GKE Skaffold deployment”, and Cloud Build helpfully gives us a nearly complete configuration file for our project.  This build configuration essentially tells Cloud Build to run our skaffold.yaml against a target cluster. 
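The generated file looks roughly like the sketch below. The exact contents may differ by template version; CLUSTER-NAME and CLUSTER-ZONE are the placeholder fields from the template, to be filled in from your own cluster:

```yaml
# cloudbuild.yaml - sketch of the GKE Skaffold deployment template.
# CLUSTER-NAME and CLUSTER-ZONE are placeholders to replace with your
# cluster's values from the Google Kubernetes Engine Explorer.
steps:
- name: 'gcr.io/k8s-skaffold/skaffold'
  args:
  - 'skaffold'
  - 'run'
  - '--filename=skaffold.yaml'
options:
  env:
  - 'CLOUDSDK_COMPUTE_ZONE=CLUSTER-ZONE'
  - 'CLOUDSDK_CONTAINER_CLUSTER=CLUSTER-NAME'
```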

We just need to grab that information from the cluster we have running in our Google Kubernetes Engine Explorer here, and replace the ‘CLUSTER-NAME’ and ‘CLUSTER-ZONE’ fields in our cloudbuild.yaml file. Let’s save and stage our changes, but before we commit them, let’s create our Cloud Build Trigger to use with our new YAML file.

We can find Cloud Build Triggers in our GCP Dashboard. I have Cloud Build pinned on my account, so you will probably need to scroll down further to find it.  Creating a trigger here is very simple.  We will give it a name, and select our hello-world Cloud Source Repository from earlier here. 

We will leave the other settings at their defaults, which tell the trigger to run the cloudbuild.yaml file in the repository any time a push is detected on the main branch.  Let’s create the trigger, then we can open it and click on “view triggered builds”.
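The same trigger can also be created from the command line instead of the console. This is a hedged sketch; TRIGGER-NAME is a placeholder, and the other values match the hello-world setup described above:

```shell
# Create a trigger on the hello-world Cloud Source Repository that runs
# cloudbuild.yaml on every push to the main branch.
# TRIGGER-NAME is a placeholder for a name of your choosing.
gcloud builds triggers create cloud-source-repositories \
  --name=TRIGGER-NAME \
  --repo=hello-world \
  --branch-pattern='^main$' \
  --build-config=cloudbuild.yaml
```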

Now let’s go back to VS Code, commit our changes, and push our changes to our Google Source Repository.  Then if we pop back over to our browser window, we can see that the push has triggered our new build command to run. 
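The equivalent of those VS Code source-control steps from a terminal, assuming the Cloud Source Repositories remote was added as `origin` when the repository was set up:

```shell
# Commit the new build config and push it; the push fires the trigger.
git add cloudbuild.yaml
git commit -m "Add Cloud Build config for GKE Skaffold deployment"
git push origin main
```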

After it’s done, we can see our new deployment running in the Kubernetes Explorer back in VS Code.  Under Services, we can get the external IP for our app, where we can see that our hello world webpage is up and running.
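The external IP can also be pulled from a terminal with kubectl. The service name here is an assumption based on the hello-world project; substitute whatever name your skaffold.yaml deploys:

```shell
# Print the external IP of the app's LoadBalancer service.
# "hello-world" is an assumed service name, not confirmed by the course.
kubectl get service hello-world \
  --output=jsonpath='{.status.loadBalancer.ingress[0].ip}'
```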

We now know how to move a container-based project through all stages of the development life cycle using Visual Studio Code and Google Cloud Platform.  In our final lecture, we’ll take a look at how we can combine multiple GCP services and resources together into complex repeatable deployments using Google Cloud Deployment Manager.

 

About the Author
Arthur Feldkamp
IT Operations Manager and Cloud Administrator, Database and API Integrations Specialist

Arthur spent seven years managing the IT infrastructure for a large entertainment complex in Arizona where he oversaw all network and server equipment and updated many on-premise systems to cloud-based solutions with Google Cloud Platform. Arthur is also a PHP and Python developer who specializes in database and API integrations. He has written several WordPress plugins, created an SDK for the Infusionsoft API, and built a custom digital signage management system powered by Raspberry Pis. Most recently, Arthur has been building Discord bots and attempting to teach a Python AI program how to compose music.